Nvidia’s Chips and the Internet

Watching the rise of Nvidia’s GPUs has been amazing. I remember that early in the PC era, during the ’90s, GPUs were all the rage for gaming. At the time, personal computers were still the most popular way to connect online. Smartphones didn’t exist and laptops weren’t lightweight yet. In fact, you needed a GPU simply to power video game graphics on your computer. Online streaming wasn’t popular yet, so you couldn’t offload processing power to remote servers.

As cyclical as technology may be, several companies have stood the test of time. Microsoft, IBM and Apple might be the most popular names, but they aren’t the only ones. Nvidia was founded in 1993, right before the first internet cycle began. The internet started gaining traction in 1995, and in 1996 PC gaming caught its first wave with the releases of Quake and Duke Nukem. I was too young at the time, but I remember the popularity as it spilled over into the console era. The Nintendo 64 was the first console to deliver 3D graphics in a real way. Consoles were great for multiplayer, but I, like many other tech hobbyists, preferred the PC.

But PC gaming could get expensive very fast. There were many more moving parts, and new games required constant upgrades. Plus, the ’90s was when the internet was just taking off. Households were only familiar with paying their monthly phone bills. Dial-up was the preferred way of connecting to the internet. When broadband became mainstream, everyone’s monthly internet bill started to climb. I don’t think this cost plateaued until we had 4G/5G networks for mobile phones. It wasn’t until the 2010s that data became more abundant and cheaper than water.

Data Center Energy Problem

As the internet became mainstream, online gaming got much better. I remember video games like Counter-Strike, Half-Life and World of Warcraft climbing the charts. I wasn’t much of a gamer, particularly because of the constant need to upgrade a PC for new games. Nvidia GPUs had become so expensive that I couldn’t even afford to play The Sims. The Intel CPU wasn’t cutting it anymore on my tiny HP PC.

Which is why this recent GPU cycle has been so interesting for me to observe. Today I use Nvidia GPUs 4-5x per day at minimum. They power every single LLM & A.I. tool on my computer. In the early 2000s, I couldn’t have imagined that compute would literally cost pennies. Achieving that much scale is an exceptional task for all these cloud computing companies. That’s why the growth of Nvidia’s Compute & Networking revenue has surpassed its Graphics revenue in a significant way over the past few years.

However, I wouldn’t have suspected that energy consumption would be the limiting factor for artificial intelligence today. You see, we didn’t have this problem with broadband, because the hard cost was installing internet cables. Once the infrastructure was laid down, the fixed costs of getting online became cheaper over time. But that has not been the case with A.I.

The problem with A.I. today is the balance between utility and consumption. Right now we are seeing spikes in compute usage across consumers and enterprises. The future is here, but it is not evenly distributed. So even if A.I. becomes a no-brainer investment for the major players, they can’t deploy more than several billion dollars per year. Nvidia is one of the exceptions, because it supplies the entire data center industry. That’s why Nvidia has committed to investing $500 billion in America to onshore production over the next four years.
But with every stage of A.I., there will be a greater need for energy. Combined with the advent of robots and electric vehicles, a majority of real-world devices will require electricity to operate. Data centers and A.I. will be competing for the same resource. Now, the problem with energy is that it can become expensive quite fast, and it won’t have the same investment returns as building software companies. In fact, energy can become commoditized more rapidly. But both technology and energy depreciate quickly. A.I. has a very short shelf life, given how competitive the race to build better models has become.

At the moment, I think the energy problem will be fine for 2-3 years. Right now the market can handle the demand for more intelligence and balance the load. However, this problem is compounding at a rapid rate. Similar to the energy blackouts we’ve seen in California, we could face the same kind of bottleneck as data center demand rises. I don’t know how this will play out, but I can foresee the cost of compute rising.
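To make "expensive quite fast" concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it is an illustrative assumption, not measured data: a per-GPU draw of roughly 700 watts, a cooling overhead (PUE) of 1.3, and a placeholder industrial electricity rate of $0.08/kWh. The point isn't the exact numbers, it's how linearly the power bill scales with fleet size.

# Back-of-envelope estimate of a GPU fleet's yearly electricity bill.
# All constants below are assumptions for illustration only.

GPU_POWER_WATTS = 700    # assumed draw of one data center GPU under load
PUE = 1.3                # assumed power usage effectiveness (cooling/overhead)
PRICE_PER_KWH = 0.08     # assumed industrial electricity rate, USD
HOURS_PER_YEAR = 24 * 365

def annual_energy_cost(num_gpus: int) -> float:
    """Estimate yearly electricity cost in USD for a fleet of GPUs."""
    kwh = num_gpus * GPU_POWER_WATTS / 1000 * PUE * HOURS_PER_YEAR
    return kwh * PRICE_PER_KWH

for fleet in (1_000, 100_000, 1_000_000):
    print(f"{fleet:>9,} GPUs -> ${annual_energy_cost(fleet):,.0f}/year")

Under these assumed inputs, a thousand GPUs cost roughly $640K a year in electricity alone, and a million GPUs cost roughly $640M, before you buy a single chip. That is the compounding bottleneck I'm describing above.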