火火兔爸
10-30
苹果客户那么分散也没见降低风险,创新能力才是根本
46% of Nvidia's Revenue Came From 4 Mystery Customers Last Quarter
免责声明:上述内容仅代表发帖人个人观点,不构成本平台的任何投资建议。
分享至
微信
复制链接
精彩评论
我们需要你的真知灼见来填补这片空白
打开APP,发表看法
APP内打开
发表看法
{"i18n":{"language":"zh_CN"},"detailType":1,"isChannel":false,"data":{"magic":2,"id":365743577923776,"tweetId":"365743577923776","gmtCreate":1730293425966,"gmtModify":1730293430264,"author":{"id":3546249304196282,"idStr":"3546249304196282","authorId":3546249304196282,"authorIdStr":"3546249304196282","name":"火火兔爸","avatar":"https://static.tigerbbs.com/1914631c23e50a7645217a36cc9d9598","vip":1,"userType":1,"introduction":"","boolIsFan":false,"boolIsHead":false,"crmLevel":9,"crmLevelSwitch":1,"individualDisplayBadges":[],"fanSize":289,"starInvestorFlag":false},"themes":[],"images":[],"coverImages":[],"html":"<html><head></head><body><p>苹果客户那么分散也没见降低风险,创新能力才是根本</p></body></html>","htmlText":"<html><head></head><body><p>苹果客户那么分散也没见降低风险,创新能力才是根本</p></body></html>","text":"苹果客户那么分散也没见降低风险,创新能力才是根本","highlighted":1,"essential":1,"paper":1,"likeSize":0,"commentSize":0,"repostSize":0,"favoriteSize":0,"link":"https://laohu8.com/post/365743577923776","repostId":2479222466,"repostType":2,"repost":{"id":"2479222466","pubTimestamp":1730278620,"share":"https://www.laohu8.com/m/news/2479222466?lang=&edition=full","pubTime":"2024-10-30 16:57","market":"fut","language":"en","title":"46% of Nvidia's Revenue Came From 4 Mystery Customers Last Quarter","url":"https://stock-news.laohu8.com/highlight/detail?id=2479222466","media":"Motley Fool","summary":"Nvidia's incredible growth is increasingly reliant on just a handful of customers.","content":"<html><body><ul>\n<li>\n<div>\n<svg fill=\"none\" height=\"15\" viewbox=\"0 0 14 15\" width=\"14\" xmlns=\"http://www.w3.org/2000/svg\">\n<path d=\"M14 5.58984C14 2.91016 11.8398 0.75 9.16016 0.75C6.50781 0.777344 4.375 2.91016 4.375 5.5625C4.375 6.10938 4.45703 6.60156 4.59375 7.09375L0.191406 11.4961C0.0546875 11.6328 0 11.7969 0 11.9609V14.0938C0 14.4766 0.273438 14.75 0.65625 14.75H3.71875C4.07422 14.75 4.375 14.4766 4.375 14.0938V13H5.46875C5.82422 13 6.125 12.7266 6.125 12.3438V11.25H7.13672C7.30078 11.25 7.51953 11.168 7.62891 11.0312L8.28516 10.293C8.55859 10.3477 8.85938 10.375 9.1875 10.375C11.8398 10.375 14 8.24219 14 5.58984ZM9.1875 4.25C9.1875 3.53906 9.76172 2.9375 10.5 2.9375C11.2109 2.9375 11.8125 3.53906 11.8125 4.25C11.8125 4.98828 11.2109 5.5625 10.5 5.5625C9.76172 5.5625 9.1875 4.98828 9.1875 4.25Z\" fill=\"#FFB81C\"></path>\n</svg>\n</div>\n<div>Nvidia's market cap is up almost tenfold since the start of 2023 thanks to demand for its data center GPUs.</div>\n</li>\n<li>\n<div>\n<svg fill=\"none\" height=\"15\" viewbox=\"0 0 14 15\" width=\"14\" xmlns=\"http://www.w3.org/2000/svg\">\n<path d=\"M14 5.58984C14 2.91016 11.8398 0.75 9.16016 0.75C6.50781 0.777344 4.375 2.91016 4.375 5.5625C4.375 6.10938 4.45703 6.60156 4.59375 7.09375L0.191406 11.4961C0.0546875 11.6328 0 11.7969 0 11.9609V14.0938C0 14.4766 0.273438 14.75 0.65625 14.75H3.71875C4.07422 14.75 4.375 14.4766 4.375 14.0938V13H5.46875C5.82422 13 6.125 12.7266 6.125 12.3438V11.25H7.13672C7.30078 11.25 7.51953 11.168 7.62891 11.0312L8.28516 10.293C8.55859 10.3477 8.85938 10.375 9.1875 10.375C11.8398 10.375 14 8.24219 14 5.58984ZM9.1875 4.25C9.1875 3.53906 9.76172 2.9375 10.5 2.9375C11.2109 2.9375 11.8125 3.53906 11.8125 4.25C11.8125 4.98828 11.2109 5.5625 10.5 5.5625C9.76172 5.5625 9.1875 4.98828 9.1875 4.25Z\" fill=\"#FFB81C\"></path>\n</svg>\n</div>\n<div>Nvidia's GPUs are the primary choice for developing artificial intelligence models. </div>\n</li>\n<li>\n<div>\n<svg fill=\"none\" height=\"15\" viewbox=\"0 0 14 15\" width=\"14\" xmlns=\"http://www.w3.org/2000/svg\">\n<path d=\"M14 5.58984C14 2.91016 11.8398 0.75 9.16016 0.75C6.50781 0.777344 4.375 2.91016 4.375 5.5625C4.375 6.10938 4.45703 6.60156 4.59375 7.09375L0.191406 11.4961C0.0546875 11.6328 0 11.7969 0 11.9609V14.0938C0 14.4766 0.273438 14.75 0.65625 14.75H3.71875C4.07422 14.75 4.375 14.4766 4.375 14.0938V13H5.46875C5.82422 13 6.125 12.7266 6.125 12.3438V11.25H7.13672C7.30078 11.25 7.51953 11.168 7.62891 11.0312L8.28516 10.293C8.55859 10.3477 8.85938 10.375 9.1875 10.375C11.8398 10.375 14 8.24219 14 5.58984ZM9.1875 4.25C9.1875 3.53906 9.76172 2.9375 10.5 2.9375C11.2109 2.9375 11.8125 3.53906 11.8125 4.25C11.8125 4.98828 11.2109 5.5625 10.5 5.5625C9.76172 5.5625 9.1875 4.98828 9.1875 4.25Z\" fill=\"#FFB81C\"></path>\n</svg>\n</div>\n<div>The company's sales are increasingly concentrated, which could pose a big risk to the company in the future. </div>\n</li>\n</ul><div><p><strong>Nvidia</strong> <span>(NVDA<span> 0.52%</span>)</span> had a market capitalization of $360 billion at the start of 2023. Less than two years later, it's now worth over $3.4 trillion. Although the company supplies graphics processing units (GPUs) for personal computers and even cars, the data center segment has been the primary source of its growth over that period.</p><p>Nvidia's data center GPUs are the most powerful in the industry for developing and deploying artificial intelligence (AI) models. The company is struggling to keep up with demand from AI start-ups and the world's largest technology giants. While that's a great thing, there is a potential risk beneath the surface.</p><p>Nvidia's financial results for its fiscal 2025 second quarter (which ended on July 28) showed that the company increasingly relies on a small handful of customers to generate sales. Here's why that could lead to vulnerabilities in the future.</p><div><app :collapse_on_load=\"false\" :instrument_id=\"204770\" :show_benchmark_compare=\"true\" amount_change=\"0.73\" average_volume=\"315,651,278\" company_name=\"Nvidia\" current_price=\"141.25\" daily_high=\"142.26\" daily_low=\"138.90\" default_period=\"FiveYear\" dividend_yield=\"0.02%\" exchange=\"NASDAQ\" fifty_two_week_high=\"144.42\" fifty_two_week_low=\"40.87\" gross_margin=\"75.98\" logo=\"https://g.foolcdn.com/art/companylogos/mark/NVDA.png\" market_cap=\"$3,465B\" pe_ratio=\"66.32\" percent_change=\"0.52\" symbol=\"NVDA\" volume=\"317,756\"></app></div><h2>GPU ownership is a rich company's game</h2><p>According to a study by McKinsey and Company, 72% of organizations worldwide are using AI in at least one business function. That number continues to grow, but most companies don't have the financial resources (or the expertise) to build their own AI infrastructure. After all, one of Nvidia's leading GPUs can cost up to $40,000, and it often takes <em>thousands</em> of them to train an AI model.</p><p>Instead, tech giants like <strong>Microsoft</strong> <span>(MSFT<span> 1.26%</span>)</span>, <strong>Amazon</strong> <span>(AMZN<span> 1.29%</span>)</span>, and <strong>Alphabet</strong> <span>(GOOG<span> 1.66%</span>)</span><span> (GOOGL<span> 1.77%</span>)</span> buy hundreds of thousands of GPUs and cluster them inside centralized data centers. Businesses can rent that computing capacity to deploy AI into their operations for a fraction of the cost of building their own infrastructure.</p><div><div><div></div></div></div><p>Cloud companies like <strong>DigitalOcean</strong> are now making AI accessible to even the smallest businesses using that same strategy. DigitalOcean allows developers to access clusters of between one and eight Nvidia H100 GPUs, enough for very basic AI workloads.</p><p>Affordability is improving. Nvidia's new Blackwell-based GB200 GPU systems are capable of performing AI inference at 30 times the pace of the older H100 systems. Each individual GB200 GPU is expected to sell for between $30,000 and $40,000, which is roughly the same price as the H100 when it was first released, so Blackwell offers an incredible improvement in cost efficiency. </p><p>That means the most advanced, trillion-parameter large language models (LLMs) -- which have previously only been developed by well-resourced tech giants and leading AI start-ups like OpenAI and Anthropic -- will be financially accessible to a broader number of developers. Still, it could be years before GPU prices fall enough that the average business can maintain its own AI infrastructure.</p><h2>The risk for Nvidia</h2><p>Since only a small number of tech giants and top AI start-ups are buying the majority of AI GPUs, Nvidia's sales are extremely concentrated at the moment.</p><div><div><div></div></div></div><p>In the fiscal 2025 second quarter, the company generated $30 billion in total revenue, which was up 122% from the year-ago period. The data center segment was responsible for $26.3 billion of that revenue, and that number grew by a whopping 154%. </p><p>According to Nvidia's 10-Q filing for the second quarter, four customers (who were not identified) accounted for almost half of its $30 billion in revenue:</p><div><table><thead><tr><th><p>Customer</p></th><th><p>Proportion of Nvidia's Q2 Revenue</p></th></tr></thead><tbody><tr><td><p>Customer A</p></td><td><p>14%</p></td></tr><tr><td><p>Customer B</p></td><td><p>11%</p></td></tr><tr><td><p>Customer C</p></td><td><p>11%</p></td></tr><tr><td><p>Customer D</p></td><td><p>10%</p></td></tr></tbody></table></div><p>Data source: Nvidia.</p><p>Nvidia only singles out the customers who account for 10% or more of its revenue, so it's possible there were other material buyers of its GPUs that didn't meet the reporting threshold.</p><div><div><div></div></div></div><p>Customers A and B accounted for a combined 25% of the company's revenue during Q2, which ticked higher from 24% in the fiscal 2025 first quarter just three months earlier. In other words, Nvidia's revenue is becoming more -- not less -- concentrated.</p><p>Here's why that could be a problem. Customer A spent $7.8 billion with Nvidia in the last two quarters alone, and only a tiny number of companies in the entire world can sustain that kind of spending on chips and infrastructure. That means even if one or two of Nvidia's top customers cut back on their spending, the company could suffer a loss in revenue that can't be fully replaced.</p><div><img loading=\"lazy\" src=\"https://g.foolcdn.com/image/?url=https%3A%2F%2Fg.foolcdn.com%2Feditorial%2Fimages%2F795438%2Fnvidias-headquarters-with-a-black-nvidia-sign-in-the-foreground.jpg&op=resize&w=700\" srcset=\"https://g.foolcdn.com/image/?url=https%3A//g.foolcdn.com/editorial/images/795438/nvidias-headquarters-with-a-black-nvidia-sign-in-the-foreground.jpg&w=300&op=resize 300w, https://g.foolcdn.com/image/?url=https%3A//g.foolcdn.com/editorial/images/795438/nvidias-headquarters-with-a-black-nvidia-sign-in-the-foreground.jpg&w=1000&op=resize 1000w, https://g.foolcdn.com/image/?url=https%3A//g.foolcdn.com/editorial/images/795438/nvidias-headquarters-with-a-black-nvidia-sign-in-the-foreground.jpg&w=2000&op=resize 2000w\"/><p>Image source: Nvidia.</p></div><h2>Nvidia's mystery customers</h2><p>Microsoft is a regular buyer of Nvidia's GPUs, but a recent report from one Wall Street analyst suggests the tech giant is the <em>biggest</em> customer of Blackwell hardware (which starts shipping at the end of this year) so far. As a result, I think Microsoft is Customer A.</p><p>Nvidia's other top customers could be some combination of Amazon, Alphabet, <strong>Meta Platforms</strong>, <strong>Oracle</strong>, <strong>Tesla</strong>, and OpenAI. According to public filings, here's how much money some of those companies are spending on AI infrastructure:</p><div><div><div></div></div></div><ul><li aria-level=\"1\">Microsoft allocated $55.7 billion to capital expenditures (capex) during fiscal 2024 (which ended June 30), and most of that went toward GPUs and building data centers. It plans to spend even more in fiscal 2025. </li><li aria-level=\"1\">Amazon's capex is on track to come in at over $60 billion during calendar 2024, which will support the growth it's seeing in AI. </li><li aria-level=\"1\">Meta Platforms plans to spend up to $40 billion on AI infrastructure in 2024 and even more in 2025, in order to build more advanced versions of its Llama AI models.</li><li aria-level=\"1\">Alphabet is on track to allocate around $50 billion to capex this year.</li><li aria-level=\"1\">Oracle allocated $6.9 billion toward AI capex in its fiscal 2024 year (which ended May 31), and it plans to spend <em>double</em> that in fiscal 2025. </li><li aria-level=\"1\">Tesla just told investors its total expenditures on AI infrastructure will top $11 billion this year, as it brings 50,000 Nvidia GPUs online to improve its self-driving software. </li></ul><p>Based on that information, Nvidia's revenue pipeline looks robust for at least the next year. The picture is a little more unclear as we look further into the future because we don't know how long those companies can keep up that level of spending.</p><p>Nvidia CEO Jensen Huang thinks data center operators will spend $1 trillion building AI infrastructure over the next five years. If he's right, the company could continue growing well into the late 2020s. But there is competition coming online that could steal some market share.</p><p><strong>Advanced Micro Devices </strong>released its own AI data center GPUs last year, and it plans to launch a new chip architecture to compete with Blackwell in the second half of 2025. Plus, Microsoft, Amazon, and Alphabet have designed their own data center chips, and although it could take time to erode Nvidia's technological advantage, that hardware will eventually be more cost-effective for them to use.</p><p>None of this is an immediate cause for concern for Nvidia's investors, but they should keep an eye on the company's revenue concentration in the upcoming quarters. If it continues to rise, that might create a higher risk of a steep decline in sales at some point in the future.</p><div></div></div></body></html>","source":"fool_stock","collect":0,"html":"<!DOCTYPE html>\n<html>\n<head>\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\" />\n<meta name=\"viewport\" content=\"width=device-width,initial-scale=1.0,minimum-scale=1.0,maximum-scale=1.0,user-scalable=no\"/>\n<meta name=\"format-detection\" content=\"telephone=no,email=no,address=no\" />\n<title>46% of Nvidia's Revenue Came From 4 Mystery Customers Last Quarter</title>\n<style type=\"text/css\">\na,abbr,acronym,address,applet,article,aside,audio,b,big,blockquote,body,canvas,caption,center,cite,code,dd,del,details,dfn,div,dl,dt,\nem,embed,fieldset,figcaption,figure,footer,form,h1,h2,h3,h4,h5,h6,header,hgroup,html,i,iframe,img,ins,kbd,label,legend,li,mark,menu,nav,\nobject,ol,output,p,pre,q,ruby,s,samp,section,small,span,strike,strong,sub,summary,sup,table,tbody,td,tfoot,th,thead,time,tr,tt,u,ul,var,video{ font:inherit;margin:0;padding:0;vertical-align:baseline;border:0 }\nbody{ font-size:16px; line-height:1.5; color:#999; background:transparent; }\n.wrapper{ overflow:hidden;word-break:break-all;padding:10px; }\nh1,h2{ font-weight:normal; line-height:1.35; margin-bottom:.6em; }\nh3,h4,h5,h6{ line-height:1.35; margin-bottom:1em; }\nh1{ font-size:24px; }\nh2{ font-size:20px; }\nh3{ font-size:18px; }\nh4{ font-size:16px; }\nh5{ font-size:14px; }\nh6{ font-size:12px; }\np,ul,ol,blockquote,dl,table{ margin:1.2em 0; }\nul,ol{ margin-left:2em; }\nul{ list-style:disc; }\nol{ list-style:decimal; }\nli,li p{ margin:10px 0;}\nimg{ max-width:100%;display:block;margin:0 auto 1em; }\nblockquote{ color:#B5B2B1; border-left:3px solid #aaa; padding:1em; }\nstrong,b{font-weight:bold;}\nem,i{font-style:italic;}\ntable{ width:100%;border-collapse:collapse;border-spacing:1px;margin:1em 0;font-size:.9em; }\nth,td{ padding:5px;text-align:left;border:1px solid #aaa; }\nth{ font-weight:bold;background:#5d5d5d; }\n.symbol-link{font-weight:bold;}\n/* header{ border-bottom:1px solid #494756; } */\n.title{ margin:0 0 8px;line-height:1.3;color:#ddd; }\n.meta {color:#5e5c6d;font-size:13px;margin:0 0 .5em; }\na{text-decoration:none; color:#2a4b87;}\n.meta .head { display: inline-block; overflow: hidden}\n.head .h-thumb { width: 30px; height: 30px; margin: 0; padding: 0; border-radius: 50%; float: left;}\n.head .h-content { margin: 0; padding: 0 0 0 9px; float: left;}\n.head .h-name {font-size: 13px; color: #eee; margin: 0;}\n.head .h-time {font-size: 11px; color: #7E829C; margin: 0;line-height: 11px;}\n.small {font-size: 12.5px; display: inline-block; transform: scale(0.9); -webkit-transform: scale(0.9); transform-origin: left; -webkit-transform-origin: left;}\n.smaller {font-size: 12.5px; display: inline-block; transform: scale(0.8); -webkit-transform: scale(0.8); transform-origin: left; -webkit-transform-origin: left;}\n.bt-text {font-size: 12px;margin: 1.5em 0 0 0}\n.bt-text p {margin: 0}\n</style>\n</head>\n<body>\n<div class=\"wrapper\">\n<header>\n<h2 class=\"title\">\n46% of Nvidia's Revenue Came From 4 Mystery Customers Last Quarter\n</h2>\n\n<h4 class=\"meta\">\n\n\n2024-10-30 16:57 GMT+8 <a href=https://www.fool.com/investing/2024/10/30/46-nvidia-revenue-came-from-4-mystery-customers/><strong>Motley Fool</strong></a>\n\n\n</h4>\n\n</header>\n<article>\n<div>\n<p>Nvidia's market cap is up almost tenfold since the start of 2023 thanks to demand for its data center GPUs.\n\n\n\n\n\n\n\nNvidia's GPUs are the primary choice for developing artificial intelligence models. \n...</p>\n\n<a href=\"https://www.fool.com/investing/2024/10/30/46-nvidia-revenue-came-from-4-mystery-customers/\">Web Link</a>\n\n</div>\n\n\n</article>\n</div>\n</body>\n</html>\n","type":0,"thumbnail":"https://g.foolcdn.com/image/?url=https%3A%2F%2Fg.foolcdn.com%2Feditorial%2Fimages%2F795438%2Fnvidias-headquarters-with-a-black-nvidia-sign-in-the-foreground.jpg&op=resize&w=165&h=104","relate_stocks":{"CUBI":"Customers Bancorp Inc.","NVDA":"英伟达","AMZN":"亚马逊","MSFT":"微软","GOOG":"谷歌","META":"Meta Platforms, Inc."},"source_url":"https://www.fool.com/investing/2024/10/30/46-nvidia-revenue-came-from-4-mystery-customers/","is_english":true,"share_image_url":"https://static.laohu8.com/e9f99090a1c2ed51c021029395664489","article_id":"2479222466","content_text":"Nvidia's market cap is up almost tenfold since the start of 2023 thanks to demand for its data center GPUs.\n\n\n\n\n\n\n\nNvidia's GPUs are the primary choice for developing artificial intelligence models. \n\n\n\n\n\n\n\nThe company's sales are increasingly concentrated, which could pose a big risk to the company in the future. \n\nNvidia (NVDA 0.52%) had a market capitalization of $360 billion at the start of 2023. Less than two years later, it's now worth over $3.4 trillion. Although the company supplies graphics processing units (GPUs) for personal computers and even cars, the data center segment has been the primary source of its growth over that period.Nvidia's data center GPUs are the most powerful in the industry for developing and deploying artificial intelligence (AI) models. The company is struggling to keep up with demand from AI start-ups and the world's largest technology giants. While that's a great thing, there is a potential risk beneath the surface.Nvidia's financial results for its fiscal 2025 second quarter (which ended on July 28) showed that the company increasingly relies on a small handful of customers to generate sales. Here's why that could lead to vulnerabilities in the future.GPU ownership is a rich company's gameAccording to a study by McKinsey and Company, 72% of organizations worldwide are using AI in at least one business function. That number continues to grow, but most companies don't have the financial resources (or the expertise) to build their own AI infrastructure. After all, one of Nvidia's leading GPUs can cost up to $40,000, and it often takes thousands of them to train an AI model.Instead, tech giants like Microsoft (MSFT 1.26%), Amazon (AMZN 1.29%), and Alphabet (GOOG 1.66%) (GOOGL 1.77%) buy hundreds of thousands of GPUs and cluster them inside centralized data centers. Businesses can rent that computing capacity to deploy AI into their operations for a fraction of the cost of building their own infrastructure.Cloud companies like DigitalOcean are now making AI accessible to even the smallest businesses using that same strategy. DigitalOcean allows developers to access clusters of between one and eight Nvidia H100 GPUs, enough for very basic AI workloads.Affordability is improving. Nvidia's new Blackwell-based GB200 GPU systems are capable of performing AI inference at 30 times the pace of the older H100 systems. Each individual GB200 GPU is expected to sell for between $30,000 and $40,000, which is roughly the same price as the H100 when it was first released, so Blackwell offers an incredible improvement in cost efficiency. That means the most advanced, trillion-parameter large language models (LLMs) -- which have previously only been developed by well-resourced tech giants and leading AI start-ups like OpenAI and Anthropic -- will be financially accessible to a broader number of developers. Still, it could be years before GPU prices fall enough that the average business can maintain its own AI infrastructure.The risk for NvidiaSince only a small number of tech giants and top AI start-ups are buying the majority of AI GPUs, Nvidia's sales are extremely concentrated at the moment.In the fiscal 2025 second quarter, the company generated $30 billion in total revenue, which was up 122% from the year-ago period. The data center segment was responsible for $26.3 billion of that revenue, and that number grew by a whopping 154%. According to Nvidia's 10-Q filing for the second quarter, four customers (who were not identified) accounted for almost half of its $30 billion in revenue:CustomerProportion of Nvidia's Q2 RevenueCustomer A14%Customer B11%Customer C11%Customer D10%Data source: Nvidia.Nvidia only singles out the customers who account for 10% or more of its revenue, so it's possible there were other material buyers of its GPUs that didn't meet the reporting threshold.Customers A and B accounted for a combined 25% of the company's revenue during Q2, which ticked higher from 24% in the fiscal 2025 first quarter just three months earlier. In other words, Nvidia's revenue is becoming more -- not less -- concentrated.Here's why that could be a problem. Customer A spent $7.8 billion with Nvidia in the last two quarters alone, and only a tiny number of companies in the entire world can sustain that kind of spending on chips and infrastructure. That means even if one or two of Nvidia's top customers cut back on their spending, the company could suffer a loss in revenue that can't be fully replaced.Image source: Nvidia.Nvidia's mystery customersMicrosoft is a regular buyer of Nvidia's GPUs, but a recent report from one Wall Street analyst suggests the tech giant is the biggest customer of Blackwell hardware (which starts shipping at the end of this year) so far. As a result, I think Microsoft is Customer A.Nvidia's other top customers could be some combination of Amazon, Alphabet, Meta Platforms, Oracle, Tesla, and OpenAI. According to public filings, here's how much money some of those companies are spending on AI infrastructure:Microsoft allocated $55.7 billion to capital expenditures (capex) during fiscal 2024 (which ended June 30), and most of that went toward GPUs and building data centers. It plans to spend even more in fiscal 2025. Amazon's capex is on track to come in at over $60 billion during calendar 2024, which will support the growth it's seeing in AI. Meta Platforms plans to spend up to $40 billion on AI infrastructure in 2024 and even more in 2025, in order to build more advanced versions of its Llama AI models.Alphabet is on track to allocate around $50 billion to capex this year.Oracle allocated $6.9 billion toward AI capex in its fiscal 2024 year (which ended May 31), and it plans to spend double that in fiscal 2025. Tesla just told investors its total expenditures on AI infrastructure will top $11 billion this year, as it brings 50,000 Nvidia GPUs online to improve its self-driving software. Based on that information, Nvidia's revenue pipeline looks robust for at least the next year. The picture is a little more unclear as we look further into the future because we don't know how long those companies can keep up that level of spending.Nvidia CEO Jensen Huang thinks data center operators will spend $1 trillion building AI infrastructure over the next five years. If he's right, the company could continue growing well into the late 2020s. But there is competition coming online that could steal some market share.Advanced Micro Devices released its own AI data center GPUs last year, and it plans to launch a new chip architecture to compete with Blackwell in the second half of 2025. Plus, Microsoft, Amazon, and Alphabet have designed their own data center chips, and although it could take time to erode Nvidia's technological advantage, that hardware will eventually be more cost-effective for them to use.None of this is an immediate cause for concern for Nvidia's investors, but they should keep an eye on the company's revenue concentration in the upcoming quarters. If it continues to rise, that might create a higher risk of a steep decline in sales at some point in the future.","news_type":1},"isVote":1,"tweetType":1,"viewCount":195,"commentLimit":10,"likeStatus":false,"favoriteStatus":false,"reportStatus":false,"symbols":[],"verified":2,"subType":0,"readableState":1,"langContent":"CN","currentLanguage":"CN","warmUpFlag":false,"orderFlag":false,"shareable":true,"causeOfNotShareable":"","featuresForAnalytics":[],"commentAndTweetFlag":false,"andRepostAutoSelectedFlag":false,"upFlag":false,"length":47,"xxTargetLangEnum":"ZH_CN"},"commentList":[],"isCommentEnd":true,"isTiger":false,"isWeiXinMini":false,"url":"/m/post/365743577923776"}
精彩评论