Tom has spent the last two years modeling infectious diseases at Imperial College London. The backbone of this research centered on the messy intersection of science, maths, and policy, asking simple questions like: what's happened and why? What's up next? And how sure are we about that?
Before that, Tom did computational physics and chemistry research at Imperial and UCL. Now he’s excited to optimize the utility-generating economic protocols supporting the transition to web3.
We asked Tom about what brought him to Protocol Labs, the projects he’ll be working on, and his thoughts about future technological developments:
How did you decide to join Protocol Labs, and what are you working on?
It’s cutting-edge work from a technical perspective, and the intellectual diversity of projects is substantial. I was attracted to research with the potential to be very high impact, in the sense that a lot of what’s going on at PL is foundational to how technology in the future will look, how we will access and control information, and how we will fundamentally interact and communicate with each other. Part of my decision to join PL was also based on its principles as a decentralized company: the way it employs researchers and engineers from around the world working according to async principles. I also got good vibes from the people I interviewed with – a sense of optimism and forward-thinking, which is important.
Right now, I’m working on how to determine the optimal microeconomic rules that keep storage reliable: participants should be incentivized by the balance of rewards and fees to engage as storage providers, while remaining sufficiently wary of incurring penalties. Generating insights into this problem relies on a mix of game theory, statistical modeling and data analysis, simulation, toy mathematical models, and thinking about behavioral economics – all fun stuff and challenging for sure.
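The kind of toy mathematical model mentioned above can be sketched very simply. Here is a minimal, purely illustrative expected-profit calculation for a provider choosing how much effort to put into reliability; the function name and all parameter values are hypothetical, not actual Filecoin protocol values:

```python
def provider_expected_profit(reward, fee, penalty, fault_prob):
    """Toy per-epoch expected profit for a storage provider.

    reward:     block/storage reward earned when the provider is online (hypothetical units)
    fee:        operating cost paid each epoch
    penalty:    slashing cost incurred on a fault
    fault_prob: probability the provider faults this epoch
    """
    # With probability (1 - fault_prob) the provider earns the reward net of fees;
    # with probability fault_prob it instead pays the penalty.
    return (1 - fault_prob) * (reward - fee) - fault_prob * penalty

# Incentive check: with a sufficiently large penalty, a careful provider
# (low fault probability) out-earns a careless one (high fault probability).
careful = provider_expected_profit(reward=10, fee=1, penalty=50, fault_prob=0.01)
careless = provider_expected_profit(reward=10, fee=1, penalty=50, fault_prob=0.2)
```

In this sketch the design question is choosing `reward`, `fee`, and `penalty` so that `careful > careless` and careless operation is actually loss-making, which is one narrow slice of the incentive-compatibility analysis described above.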
What research problems are you most interested in exploring?
Ideas come and go, but a persistent one I’ve been considering recently is how to update network design parameters to improve the health, stability, growth, and utility of the network. The analogy of “tuning the engine of a 747 while it’s flying” springs to mind. Every change that’s made to a parameter deep in the protocol risks stalling the engine, but the potential gain from improved performance is non-negligible. So this is a classic high-stakes exploration/exploitation tradeoff. My instinct is to appeal to Thompson sampling, perhaps with a human governance layer, but there are many aspects to think about: for example, simply defining what the time-dependent model of utility to optimize should look like is non-trivial.
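For readers unfamiliar with Thompson sampling, the core idea fits in a few lines: maintain a posterior over each option's payoff and, each round, act according to a single sample from those posteriors. Below is a minimal Beta-Bernoulli sketch; the two "arms" stand in for hypothetical parameter settings, and the success rates are invented for illustration:

```python
import random

def thompson_step(successes, failures):
    """Sample each arm's Beta posterior and pick the arm with the highest draw."""
    samples = [random.betavariate(s + 1, f + 1)  # Beta(1, 1) uniform prior
               for s, f in zip(successes, failures)]
    return max(range(len(samples)), key=samples.__getitem__)

# Toy run: two candidate parameter settings with unknown Bernoulli payoffs.
random.seed(0)
true_rates = [0.4, 0.6]          # hidden from the algorithm
successes, failures = [0, 0], [0, 0]
for _ in range(2000):
    arm = thompson_step(successes, failures)
    if random.random() < true_rates[arm]:
        successes[arm] += 1
    else:
        failures[arm] += 1
```

After a few thousand rounds, most pulls concentrate on the better arm, which is the appeal for "engine tuning": exploration shrinks automatically as the posterior sharpens, so risky parameter changes become rarer as evidence accumulates. The real network problem is of course far harder, with delayed, non-stationary, and multi-dimensional feedback rather than immediate Bernoulli rewards.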
What future technology are you most excited about?
Compute-Over-Data with FVM is a nascent revolution in waiting, and I think building Layer 2 solutions on top of Filecoin also has a lot of potential. As for tech in general: there’s research and building going on to combine earth observation data and machine learning; it will be fascinating to see how it helps optimize wind and solar distribution, for example, to reduce spinning reserves. I’m also very excited about the tech that has the potential to give us all the Keynesian good life at a 15-hour working week!