          Thematic Investing

          Nvidia’s Innovative Leap: Blackwell Chip and the 2024 GTC Conference Highlights

          2 April 2024

          3 Min Read

          Key Takeaways

          Nvidia's Blackwell chip introduces groundbreaking computational power, significantly enhancing AI model training efficiency and reducing energy consumption.

          The 2024 GTC Conference highlighted Nvidia’s strategic advancements in AI, including the Blackwell platform's capability to drastically improve AI training and inference.

          Nvidia's innovations, including the Nvidia Inference Microservices, not only anticipate the surging demand for AI but also position the company as a leader in the AI revolution.

          In an era where technological innovation is not just advancing but accelerating, Nvidia has taken a significant leap forward, demonstrating its prowess and foresight at the forefront of the AI revolution. March 2024 marked two significant milestones for Nvidia: the announcement of its groundbreaking Blackwell chip and the successful hosting of its first in-person GPU Technology Conference (GTC) since 2019.

          The Revolutionary Blackwell Chip

Nvidia CEO Jensen Huang introduced the Blackwell chip, a marvel of engineering that promises to redefine computational boundaries.1 Delivering roughly 2.5 times the floating-point performance of its predecessor, the Hopper chip, Blackwell represents a paradigm shift in chip design.2 By ingeniously stitching two Blackwell “dies” together within the same package, Nvidia has unlocked substantial performance gains, albeit at increased complexity and cost. This innovative approach enables a Blackwell data centre to train a GPT-4 sized language model using roughly one-fourth the GPUs and energy of a traditional data centre.
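
To make the scale of that claim concrete, the back-of-the-envelope sketch below uses purely hypothetical throughput, utilisation and power figures (not Nvidia’s published specifications) to show how higher effective per-GPU throughput translates into fewer GPUs and less energy for a fixed training-compute budget.

```python
import math

# Hypothetical figures for illustration only -- not Nvidia's published specs.
TOTAL_TRAINING_FLOP = 2e25          # assumed compute budget for a GPT-4 sized model
TRAINING_DAYS = 90                  # assumed wall-clock training window
SECONDS = TRAINING_DAYS * 24 * 3600

def gpus_needed(effective_flops_per_gpu: float, utilisation: float = 0.4) -> int:
    """GPUs required to work through the compute budget within the training window."""
    return math.ceil(TOTAL_TRAINING_FLOP / (effective_flops_per_gpu * utilisation * SECONDS))

# Assume the newer part sustains ~4x the effective training throughput per GPU,
# at somewhat higher power per GPU -- both numbers are assumptions.
old_gpus = gpus_needed(1.0e15)      # ~1 PFLOPS effective per GPU, hypothetical
new_gpus = gpus_needed(4.0e15)      # ~4 PFLOPS effective per GPU, hypothetical

old_energy_mwh = old_gpus * 0.7e-3 * (SECONDS / 3600)   # 700 W per GPU, hypothetical
new_energy_mwh = new_gpus * 1.0e-3 * (SECONDS / 3600)   # 1,000 W per GPU, hypothetical

print(f"GPUs:   {old_gpus:,} -> {new_gpus:,} (~{old_gpus / new_gpus:.1f}x fewer)")
print(f"Energy: {old_energy_mwh:,.0f} MWh -> {new_energy_mwh:,.0f} MWh")
```

With these illustrative inputs, the GPU count falls by roughly 4x and energy use by nearly 3x; the direction of the effect, rather than the exact numbers, is the point.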

Beyond raw power, Blackwell is engineered to maximise data transfer speeds within Nvidia’s vertically integrated server and data centre solutions, incorporating the latest high-bandwidth networking equipment. This design is critical for minimising bottlenecks and leveraging Blackwell’s immense computational capabilities to their fullest.

          Moreover, the chip brings significant advancements in AI inference workloads,3 offering a new 4-bit numeric precision format that boosts inference task performance by approximately 5x over Hopper.4
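
Much of the gain from a 4-bit format comes simply from packing each weight into far fewer bits. The Python sketch below is a generic illustration of symmetric 4-bit integer quantisation with a per-block scale factor, not Nvidia’s FP4 implementation, showing how a block of higher-precision weights can be stored in a quarter of the space for inference.

```python
import numpy as np

def quantise_int4(weights: np.ndarray):
    """Symmetric 4-bit quantisation of a weight block: values map to integers in [-7, 7]."""
    scale = np.abs(weights).max() / 7.0                      # per-block scale factor
    q = np.clip(np.round(weights / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantise_int4(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate weights for use at inference time."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(scale=0.02, size=4096).astype(np.float32)     # one block of layer weights
q, s = quantise_int4(w)
w_hat = dequantise_int4(q, s)

print("bits per weight: 16 -> 4 (4x smaller, plus one scale per block)")
print("max abs rounding error:", np.abs(w - w_hat).max())
```

The trade-off is a small rounding error per weight in exchange for moving and multiplying far less data, which is where the inference speed-up comes from.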

As demand for AI compute escalates, ARK Invest’s research indicates that spending could explode from roughly US$50 billion in 2023 to an estimated US$2 trillion by 2030. Nvidia, propelled by Blackwell, is well positioned to capitalise on these secular tailwinds, although the competitive landscape is expected to become increasingly fierce.
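
For context, that projection implies a compound annual growth rate of roughly 69% over seven years, as the short calculation below shows; the figures are the article’s, and the growth path is ARK’s forecast rather than a certainty.

```python
# Implied compound annual growth rate from US$50bn (2023) to US$2tn (2030).
start, end, years = 50e9, 2e12, 2030 - 2023
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR over {years} years: {cagr:.1%}")   # ~69.4%
```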

          GTC 2024: A Showcase of AI Ambition

The 2024 GTC Conference was a landmark event, not just for Nvidia but for the entire tech industry, drawing an audience of roughly 300,000 people attending in person and virtually.5 The conference showcased Nvidia’s Blackwell platform, highlighting its ability to triple AI training performance and enhance AI inference by an impressive 15x for GPT-4 sized models.6

Nvidia also signalled a strategic pivot in its readiness to meet surging demand: having seen how rapidly Hopper was adopted following the ChatGPT launch, the company is better prepared to fulfil the needs of its largest customers, such as Microsoft and Meta Platforms. Additionally, Nvidia unveiled its Nvidia Inference Microservices (NIMs), prepackaged AI models that promise to bolster its recurring software revenue significantly.
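
Part of the appeal of prepackaged models is that they are consumed through a standard, OpenAI-compatible HTTP interface rather than bespoke integration code. The sketch below assumes a NIM container already serving such an endpoint locally on port 8000 and a Llama-3-style model identifier; both the URL and the model name are illustrative assumptions, not a guaranteed configuration.

```python
# Assumes the `openai` Python package and a locally running NIM container that
# exposes an OpenAI-compatible endpoint -- URL and model name are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",   # hypothetical NIM model identifier
    messages=[{"role": "user",
               "content": "Summarise Nvidia's GTC 2024 keynote in one sentence."}],
    max_tokens=80,
)
print(response.choices[0].message.content)
```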

          Furthermore, the conference spotlighted Nvidia’s expanding ecosystem, with companies like ServiceNow and LinkedIn sharing insights on deploying generative AI solutions.7 These deployments are not just about enhancing efficiency but also about redefining how businesses operate, underscoring Nvidia’s role as a catalyst for widespread AI adoption.

           


          Looking Forward

          As Nvidia strides into the future with its Blackwell chip and a successful GTC 2024 behind it, the implications for the tech industry and beyond are profound. The company’s innovations are setting new benchmarks for computational performance and AI capabilities, promising to drive transformative changes across multiple sectors. With an eye on both immediate impacts and long-term trends, Nvidia is not just participating in the AI revolution—it’s leading it.

          References

1. Nvidia. 2024. “Keynote by NVIDIA CEO Jensen Huang.”

2. Smith, R. 2024. “NVIDIA Blackwell Architecture and B200/B100 Accelerators Announced: Going Bigger with Smaller Data.” AnandTech.

3. “AI inference” refers to the process of running a trained AI model to generate language or image output; “AI training,” by contrast, involves feeding large datasets to a model to teach it how to generate that output.

4. Smith, R. 2024. “NVIDIA Blackwell Architecture and B200/B100 Accelerators Announced: Going Bigger with Smaller Data.” AnandTech.

5. Nvidia. 2024. “See the Future at GTC 2024: NVIDIA’s Jensen Huang to Unveil Latest Breakthroughs in Accelerated Computing, Generative AI and Robotics.”

6. Nvidia. 2024. “NVIDIA HGX AI Supercomputer: The World’s Leading AI Computing Platform.”

7. Nvidia. 2024. “Driving Enterprise Transformation: CIO Insights on Harnessing Generative AI’s Potential.”
