Relative Brier Score: 1.726642 | Forecasts: 10 | Upvotes: 0
Forecasting Calendar
| | Past Week | Past Month | Past Year | This Season | All Time |
|---|---|---|---|---|---|
| Forecasts | 1 | 2 | 14 | 14 | 24 |
| Comments | 1 | 1 | 9 | 9 | 12 |
| Questions Forecasted | 1 | 1 | 7 | 7 | 14 |
| Upvotes on Comments By This User | 0 | 0 | 5 | 5 | 5 |
Active Forecaster
I did some digging on the internet regarding the question of whether Nvidia can meet demand. It seems to me they very likely already have the resources to meet even very high demand very quickly, by converting part of their current gaming-GPU fab allocation to datacenter-GPU production, since both use the same TSMC process. They appear to have done the same arbitrage in the past during the crypto-mining boom.
The open question that remains is how much demand there will be for the GPUs, but demand over the coming 12 months looks likely to be quite high, now that governments have also started launching billion-dollar projects and purchase orders for GPUs in the name of cloud sovereignty. Commercially, one would still presume that next year's revenue is lower than the price of the capital (the datacenter GPUs) that generates it: for 1 G$ of revenue next year, one might expect companies to be willing to make investment purchases of maybe at least 10 G$ now. GPU sales should therefore lead meaningful AI revenues by perhaps a year or more, giving a revenue structure in which first Nvidia and the datacenter builders earn revenue, and only later the AI operators earn revenue from customers. So for each dollar of AI revenue next year, Nvidia should likely earn more than a dollar this year.
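The capex-leads-revenue argument above can be sketched numerically. The 10x multiple is my own rough assumption from the text, not a sourced figure:

```python
# Back-of-envelope sketch of the capex-leads-revenue argument.
# Assumption: buyers spend roughly 10x next year's expected AI
# service revenue on GPUs today (a guess, not a sourced number).
capex_multiple = 10            # assumed capex-to-revenue multiple
ai_revenue_next_year = 1e9     # 1 G$ of AI service revenue next year
gpu_capex_now = capex_multiple * ai_revenue_next_year
print(f"GPU purchases now: {gpu_capex_now / 1e9:.0f} G$")  # 10 G$
```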
But on supply: Nvidia apparently already shipped 3.7 million datacenter GPU units in 2023, a higher number than their earlier plan of 2 million H100s for 2024. I would guess many of last year's units were something other than Hoppers.
Given that Hopper uses the same process as the 40-series consumer GPUs, that about 24 million of those were sold in 2022, that Hopper's die appears to be about 4/3 the size of the 4090's, and that Blackwell's is about twice that, it seems plausible that Nvidia could make up to 18 million Hoppers per year, or up to 9 million Blackwells, if they were willing to cannibalize all of their consumer GPU production.
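The capacity estimate works out as follows, using the die-size ratios from the text (which are themselves rough):

```python
# Rough fab-capacity estimate: Hopper shares TSMC's process with
# the 40-series, ~24 M consumer GPUs were sold in 2022, Hopper's
# die is ~4/3 the 4090's, and Blackwell's is ~2x Hopper's.
consumer_units = 24e6               # 40-series-class GPUs per year
hopper_die_ratio = 4 / 3            # Hopper die area vs. 4090
blackwell_die_ratio = 2 * hopper_die_ratio

max_hoppers = consumer_units / hopper_die_ratio        # ~18 M/year
max_blackwells = consumer_units / blackwell_die_ratio  # ~9 M/year
print(f"{max_hoppers / 1e6:.0f} M Hoppers, "
      f"{max_blackwells / 1e6:.0f} M Blackwells per year")
```

This assumes yield per wafer scales simply with die area, which ignores defect-rate effects on larger dies.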
Given that an H100 costs about $30,000, last year's 3.7 M Hopper-class units would be roughly 100 G$, assuming no growth from last year. Earnings of 22.5 G$ for the last quarter would be about in line with that.
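As a sanity check, the shipment figure and the quarterly earnings figure land in the same ballpark when annualized:

```python
# Revenue sanity check from the figures in the text.
units_2023 = 3.7e6        # datacenter GPUs shipped in 2023 (HPCwire)
h100_price = 30_000       # approximate $ per H100
implied_annual = units_2023 * h100_price   # ~111 G$

quarterly_earnings = 22.5e9                # last quarter, per the text
annualized = 4 * quarterly_earnings        # ~90 G$
print(f"implied from shipments: {implied_annual / 1e9:.0f} G$, "
      f"annualized quarter: {annualized / 1e9:.0f} G$")
```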
Sources:
https://www.hpcwire.com/2024/06/10/nvidia-shipped-3-76-million-data-center-gpus-in-2023-according-to-study/ 2024-07-10
https://www.tomshardware.com/news/nvidia-to-reportedly-triple-output-of-compute-gpus-in-2024-up-to-2-million-h100s 2023-08-24
https://www.datacenterdynamics.com/en/news/nvidia-increases-blackwell-orders-from-tsmc-by-25-percent-18m-gb200-nvl36-server-cabinet-expected-to-account-for-bulk-of-deliveries/ 2024-07-16
https://money.udn.com/money/story/5612/8094994?from=edn_maintab_index 2024-07-15
https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889
https://en.wikipedia.org/wiki/Blackwell_(microarchitecture)
https://wccftech.com/nvidia-to-dominate-data-center-share-in-2024-46-billion-usd-revenue-expected/ 2024-02-01
https://www.tomshardware.com/video-games/pc-gaming/steam-survey-suggests-more-people-bought-the-rtx-4090-than-the-steam-deck-along-with-millions-of-other-rtx-40-series-gpus 2024-01-02
https://www.tomshardware.com/news/nvidia-maintains-lead-as-sales-of-graphics-cards-hit-all-time-low-in-2022-jpr 2023-03-04
https://www.trendforce.com/news/2024/07/02/news-nvidias-h200-order-delivered-from-q3-boosting-server-supply-chain-with-strong-demand/ 2024-07-02
https://longportapp.com/en/news/206075733 2024-06-12
Why do you think you're right?
It seems to me datacenter-GPU demand may indeed keep increasing. Barring an AI winter, Nvidia should be able to keep up with demand for at least the next year by cannibalizing its gaming-GPU production. In addition to cloud operators, governments now seem to be waking up to the need for sovereign clouds, and are starting to spend discretionary billions on having someone nearby buy GPUs.
For full disclosure, I have a personal investment stake in the pro-growth outcome, although I hope for computation caps for personal reasons.
Why might you be wrong?
A sudden AI winter might come, or Taiwan might be invaded.
Active Forecaster
Why do you think you're right?
They have been reducing their footprint heavily.
Why might you be wrong?
The reductions are not outright closures, so I am unsure whether the question's resolution criteria will be met.
Keeping my current forecast. A friend guessed from some tea leaves that the share price might be around $180 around judgement day, based on current projections of Blackwell sales. There seems to be little to gain from the options market right now, though, given the general level of optimism. I still expect it will dip a couple of times.
Again, geopolitical risk seems quite relevant. A lot will depend on what policies the new presidential administration decides on, I think.
There are plenty of places to build datacenters; electricity seems to be the big bottleneck. Currently, 1 W of datacenter GPU capacity seems to cost about $10 in GPU investment, if I recall correctly. So 10 GW of new GPU-datacenter construction in a year would already mean 100 G$ of GPU sales. Given how far behind AMD is, it seems likely Nvidia will manage to keep about 90% market share, and other competitors seem so far away that they are unlikely to affect Nvidia's position before the question is judged. Apple is relevant because it competes for the same fab capacity, but Apple does not appear to be building M-series datacenters, so it competes only on the capacity side, not in the actual business area. This means roughly 90% of datacenter-GPU investment will be Nvidia revenue. The basic question is how many gigawatts of GPU-datacenter investment will start before judgement day.
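The power-based sizing above can be put in one place. The dollars-per-watt figure and the 10 GW construction rate are the rough assumptions stated in the paragraph, not sourced data:

```python
# Power-based revenue sizing, using the rough figures above.
capex_per_watt = 10        # assumed $ of GPU investment per W of capacity
new_capacity_gw = 10       # hypothetical GW of new GPU datacenters/year
nvidia_share = 0.90        # assumed Nvidia market share

total_gpu_capex = new_capacity_gw * 1e9 * capex_per_watt  # 100 G$
nvidia_revenue = nvidia_share * total_gpu_capex           # 90 G$
print(f"total GPU capex: {total_gpu_capex / 1e9:.0f} G$, "
      f"Nvidia share: {nvidia_revenue / 1e9:.0f} G$")
```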