Preface
Artificial intelligence is set to redefine how we work, operate, and go about our daily lives. That statement is now taken as fact; the details, however, are blurrier, and the specifics of how AI concretely relates to crypto and digital assets are more opaque still. Much has been said about the need to regulate AI to ensure it can safely interact with the general public, and crypto (or rather, credibly neutral networks) has been touted as one possible way to keep LLMs “honest and fair”. While it is impossible to say with certainty today whether AI as a whole needs crypto, there are clear areas of overlap in which the two industries can meet and strive to solve one another’s challenges.
The purpose of this report is to succinctly identify major areas of the AI stack which are prone to disruption and improvement by decentralized networks.

Executive Summary
Hardware represents the most capital-intensive segment of the supply chain with the highest barriers to entry
AI Middleware is the largest opportunity for crypto-based applications to control a piece of the AI tech stack
Decentralized networks may be a novel home for AI applications and AI front-ends though this remains unclear
Hardware
The AI stack begins with hardware. There is no denying the massive pent-up demand for GPUs. Taking the growth in Nvidia’s revenue from Q4/22 to Q4/23 as a proxy, GPU demand has increased by roughly 405%. Much of that demand has still not been met, with tech giants spending tens of billions of dollars to acquire the most sought-after pieces of hardware, and this trend is set to continue for at least a few years. For AI to be as present as many of us imagine, perhaps akin to the smartphone in your pocket, this supply/demand mismatch will need to be solved. Given that GPU supply bottlenecks will likely persist, novel software techniques would need to reduce the training and inference compute budget by 10,000x to 1,000,000x while keeping model sizes constant. Today, this level of efficiency seems out of reach.
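As a back-of-the-envelope check, the demand proxy above is simply the percentage change between two revenue prints. A minimal sketch in Python; the revenue figures are illustrative placeholders chosen only to reproduce a ~405% increase, not sourced data:

```python
# Year-over-year revenue growth used as a proxy for GPU demand.
# NOTE: the figures below are illustrative placeholders, not sourced data;
# only the calculation itself is the point.
rev_q4_22 = 3.6    # hypothetical Q4/22 revenue, $bn
rev_q4_23 = 18.18  # hypothetical Q4/23 revenue, $bn

growth_pct = (rev_q4_23 / rev_q4_22 - 1) * 100
print(f"GPU demand proxy growth: {growth_pct:.0f}%")  # ~405%
```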
One of the questions we ask ourselves is whether a decentralized organization could build the next breakthrough piece of hardware to power and refine foundational AI models. This seems unlikely for a few reasons: Nvidia has a huge first-mover advantage here (note its +$1.2T market cap increase since December 2023), ARM and AMD are competing fiercely, and the usual Big Tech suspects (Google, Amazon, Microsoft) have been experimenting with custom AI chips for some time now. Even Tether (the issuer of USDT) is throwing its hat in the ring. Suffice it to say, highly performant GPUs will remain in the hands of centralized stakeholders for the foreseeable future.
Next comes the ability to operate these assets efficiently. The Tier 3 and Tier 4 datacenter facilities where H100s are typically found (and which operate the highest-quality assets) have additional niche advantages over other operators, and the requirements they meet to maximize hardware efficiency are basic necessities as compute continues to evolve. In the short term, we see distributed compute and model training solutions as a chore, not a luxury. This may evolve in time, but the problem should be viewed from the perspective of small and medium-sized datacenters where excess compute may be sourced, not as a way to displace AWS or Google Cloud.
Finally, we will continue to see smarter mobile phones as on-device processing improves. With current cloud costs, it would likely benefit device manufacturers to substitute recurring cloud charges with one-time costs built into the device.
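To make that substitution concrete, here is a minimal break-even sketch; every figure in it is a hypothetical assumption chosen for illustration, not sourced data:

```python
# Break-even between a one-time on-device AI hardware cost and
# recurring per-user cloud inference charges.
# NOTE: every figure here is a hypothetical assumption for illustration.
device_cost_per_unit = 30.0   # extra on-device silicon cost per handset, $
cloud_cost_per_month = 1.5    # cloud inference cost per active user per month, $

breakeven_months = device_cost_per_unit / cloud_cost_per_month
print(f"On-device processing pays for itself after ~{breakeven_months:.0f} months")  # ~20 months
```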
Decentralized hardware applications may find a niche in using commodity hardware to fine-tune models after a foundational model has been created. Access to less powerful hardware is already available in a decentralized manner, and absent the need to train a 50B+ parameter model, piecing together compute could be a sustainable solution for certain niche products.
In summary, the hardware space will remain constrained. Sam Altman looking to raise amounts that would make SoftBank blush should remind everyone that capital is king. We reserve some possibility for a legacy player to manage to maintain relevance, or for a totally new, capital-light player to emerge; however, these are all very long shots.
Middleware
In AI, middleware links AI systems and models with end-user software applications and services, ensuring seamless integration and communication between AI capabilities and existing IT front-ends. We believe this area of the AI supply chain has the highest potential for enhancement by crypto. Currently, it is a relatively undecided battleground where Big Tech, legacy Web2 players, and novel Web3 protocols compete to offer AI agents and linked services.
Hardware-Focused Middleware (20%)
As discussed, hardware will remain a prized possession. The deepest pockets on earth are looking to harness this resource and are willing to pay a premium. This means two things:
It is absolutely imperative that this hardware consistently operates at an exceptionally high level
The marginal value of these assets is higher to certain parties (read: Meta, Google, etc.) than others
Most of the players in this segment will be the ones with access to the aforementioned hardware. Renting out your GPUs to anyone other than the highest bidder does not make economic sense, and Meta is publicly willing to pay more per hour for GPUs than anyone else. For now, and likely in the near future, the majority of these products will be bespoke solutions because performance demands require them to be. When we hit a level of diminishing returns, or as GPU availability increases, the application set should expand, but not materially so.
Big Tech solutions will likely overlay the hardware they own, e.g. AWS will have a specific solution for AWS-operated hardware. Web2 and traditional businesses will look to play a large part in this space, yet they will likely not have huge successes: changing the organizational ethos of a 20+ year-old company is hard, and because this space is so fast-moving and nascent, it is simply better suited to lightweight, agile businesses. Crypto-based businesses and protocols need to offer a specialty product to a specialty user; otherwise they will largely lose out in the battle to bundle this hardware-software package. Broadly speaking, the ability to identify which moats are real and which are not will be the toughest obstacle for crypto investors in this space.
Pureplay Middleware (30%)
Since we expect the hardware layer to become concentrated, the necessary glue will be the middleware layer. We also expect this segment to have a more credible motive for decentralization than hardware-focused middleware; that being said, we believe the majority will remain centralized. Light and agile organizations will be able to move faster and gain market share, meaning that ultra-high-margin protocols will consistently have an advantage over Big Tech, and certainly over legacy Web2 players. It is unclear to us what exactly this middleware looks like today. It could take the form of a “cheap-to-transact” blockchain, or a “Zapier for AI” home to every artificial intelligence plugin anyone could desire. It may end up as both to some degree.
Application-Focused (50%)
This is what will power the proverbial App Store for AI, making it the most competitive market in the industry. Through application-focused middleware, specific applications could become simple plug-and-play modules, while foundational models remain a necessary evil at the base layer (not captured here). From there, the modules will serve specific use cases, refining either the model or the data layer through processes like RAG (Retrieval-Augmented Generation). This part of the stack, we believe, will be the most obvious home for decentralized applications. The key element of these (decentralized) applications is that they will be the most plug-and-play and composable relative to their peers. Crypto and Web3 apps have the ability to truly excel in this area, and we expect them to take significant market share from their Big Tech and legacy tech competitors. The advantage stems more from the fact that everything is open-source, composable, and interconnected than from decentralization itself. As the “customer” spends ever more time online, this design space becomes not only increasingly interesting but cumulatively builds upon itself, rather than restarting from scratch like many of its competitors. One can think of application-specific middleware as the iPhone that enabled the use case of Uber.
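Since RAG is the concrete mechanism cited above, a minimal self-contained sketch of the pattern may help: retrieve the documents most relevant to a query, then prepend them to the prompt before it ever reaches the foundational model. The document store and word-overlap scorer below are toy assumptions for illustration, not any particular protocol’s implementation:

```python
# Minimal Retrieval-Augmented Generation (RAG) sketch.
# A real system would use learned embeddings and a vector database;
# here a toy word-overlap score stands in for illustration only.

DOCUMENTS = [
    "Protocol X rents idle GPU time from small datacenters.",
    "Foundational models are trained on large centralized clusters.",
    "Application middleware exposes plug-and-play AI modules to apps.",
]

def relevance(query: str, doc: str) -> int:
    """Toy relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(DOCUMENTS, key=lambda d: relevance(query, d), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Augment the user query with retrieved context before calling a model."""
    context = "\n".join(retrieve(query))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

if __name__ == "__main__":
    # The augmented prompt would then be sent to a foundational model.
    print(build_prompt("How is idle GPU time from datacenters used?"))
```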
Applications and Front-Ends
“AI for X” (no reference to the application formerly known as Twitter) seems to be the busiest narrative in the market at the moment. Some applications are solving problems and are actually useful; others purport to be (and solve nothing). The latter will be forgotten. This phenomenon is occurring because the market chases narratives and buzzwords. In reality, any application can power its backend with some sort of AI and claim to be revolutionary. Sooner or later, however, applications will need to demonstrate their value to justify their existence.
Pre-loaded consumer apps and broadly used enterprise applications are all experimenting with AI today. Big Tech and legacy Web2 companies have a significant distribution advantage for many broad-based solutions through channels that are already in place. They also have the highest cost sensitivity: if these apps don’t generate more dollars than they consume each quarter, expect them to be cut quickly (remember, AI is very expensive today). Crypto-based solutions have a fairly large opportunity set here. Deploying an application quickly and to a global audience will always provide structural advantages. Low organizational costs and the ability to leverage global talent add to this, but will not win the day independently of other factors. As with the applications we are familiar with today, most will fail.
Data
Notably missing from this equation is a “data” segment. What do we do with all of our data? Web3 offers an easy answer, but the reality is more complex than “own your own data”. Any solution needs to work at a global scale, which requires one to first be proposed, then agreed upon, and then adopted. We have very low confidence that we will actually get there in any realistic time frame. A global standard for data is a political problem and a public sector issue rather than a private sector one, meaning a totally new data regime would need to be architected, a problem that takes years and decades to solve rather than weeks and quarters. If Spotify artists are any indication, we don’t think it is possible for 99.9% of the world’s population to generate much of anything from their data (in isolation), making real monetization challenging. For professional data providers, creators, and curators (the NYTimes or Getty Images, for example), there is a chicken-and-egg issue: data provenance is a real problem. Once models reach 100 billion parameters, who owns what exactly? There is no practical way to search that volume of training data without some commercial overlap. If you instead ask for forgiveness rather than permission after harvesting said data, the deepest pockets, able to withstand the most aggressive lawsuits, will be the winners.
100x Innovation
The middleware segment is the most obvious choice today for innovation. This is the new sandbox; this is where the next trillion dollars in technological progress will come from. It will be fostered in communities like Morpheus, building decentralized AI agents to compete head-to-head with Google. It will be found in the connective tissue of hardware marketplaces like Marlin and Phala, which enable real AI connectivity today. This design space is evolving quickly, but it is exactly where the next 100x in innovation will come from.
The Unanswerable
Anyone who claims to have all the answers when it comes to these topics is either lying or incorrect. In such a frontier subject, all we can do is make highly informed guesses. To that end, the final questions below open up very interesting commercial choices, and potentially new industries, so they are well worth considering.
What latency is required for running the processes we need? Commodity hardware could easily be a solution for AI, but not for low-latency AI; it is a business optimization case in which you trade speed for cost (see the sketch after this list). Where that balance lands as AI continues to permeate is an open question
When do we reach an efficient frontier in terms of models? There is probably some level of diminishing returns once we get to 100 billion or 200 billion parameters; what represents “good enough”?
When does the business case support decentralization versus the ideological case? While this is likely an individual company decision, today there are opportunities that indeed make sense
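As a toy illustration of the speed-for-cost trade raised in the first question above, one can frame it as choosing the cheapest compute tier that still meets a latency budget. Every tier name, price, and latency figure below is a hypothetical assumption:

```python
# Toy speed-for-cost trade-off: pick the cheapest compute tier that
# still satisfies a latency budget. All figures are hypothetical.

TIERS = {
    # name: (latency in ms per request, cost in $ per 1k requests)
    "commodity_gpu_network": (900, 0.10),
    "regional_datacenter":   (250, 0.45),
    "hyperscaler_h100":      (60,  1.20),
}

def cheapest_tier(latency_budget_ms: float) -> str | None:
    """Return the lowest-cost tier meeting the latency budget, if any."""
    feasible = [(cost, name) for name, (lat, cost) in TIERS.items()
                if lat <= latency_budget_ms]
    return min(feasible)[1] if feasible else None

print(cheapest_tier(1000))  # commodity_gpu_network: latency is loose, so cost wins
print(cheapest_tier(100))   # hyperscaler_h100: only the fast tier qualifies
```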
About Florin Digital
Florin Digital is a crypto-asset investment firm deploying capital at the intersection of finance and technology.
We seek out early-stage opportunities in liquid markets which present an asymmetric risk-reward profile. Our thesis-driven and hands-on approach allows us to unlock significant latent value in the teams and protocols that we partner with.
We aim to capture superior risk-adjusted returns across market cycles by investing in opportunities across DeFi, digital infrastructure, and emerging ecosystems. We have a team of globally distributed investment professionals with a proven track record in early-stage markets.
For more information, please visit www.florindigital.io