
Customers now have access to Google's custom hardware, including its Axion CPU and the latest Trillium TPU, through Google Cloud. Alongside this, Google offered a sneak peek at Nvidia's Blackwell platform, which is set to arrive on Google Cloud early next year.

Mark Lohmeyer, Vice President and General Manager for Compute and AI Infrastructure at Google Cloud, mentioned in a blog post that the company is excited about the potential of Nvidia’s Blackwell GB200 NVL72 GPUs, and they look forward to providing more updates soon.

Google is already preparing its cloud infrastructure to support Blackwell, and it seems the company is moving away from the previous head-to-head comparisons between its TPUs and Nvidia’s GPUs. Instead, Google is making efforts to integrate Nvidia’s AI hardware more seamlessly into its cloud, with the introduction of a new network adapter designed to connect with Nvidia’s hardware.

Google aims to create a smooth and unified hardware and software experience in its cloud service, ensuring that customers can use different technologies without disruption.

This shift in approach is part of a broader trend in the chip industry, where rivals are putting aside their differences. AMD and Intel recently collaborated to keep x86 relevant in the AI space, and now Google is positioning itself to offer both its hardware and Nvidia’s hardware for inference, recognizing that diversity in cloud services is beneficial for business.

The demand for AI hardware is massive, and Nvidia’s GPUs are in short supply. As a result, customers are increasingly turning to Google’s TPUs.

Google’s new Trillium TPU, the successor to the TPU v5 generation, is now available in preview and offers significant performance gains. Trillium is essentially a TPU v6, introduced just a year after the TPU v5. This rapid progression is impressive, considering the usual three- to four-year gap between previous generations.

Trillium delivers up to 4.7 times more peak compute performance than the TPU v5e when processing BF16 data. This translates to a theoretical peak performance of 925.9 teraflops for Trillium, compared to the 197 teraflops of the TPU v5e. However, real-world performance is always lower than theoretical estimates.
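The arithmetic behind those figures is straightforward; as a quick sanity check (a minimal sketch using only the numbers quoted above):

```python
# Back-of-the-envelope check of the quoted peak-compute figures.
TPU_V5E_PEAK_TFLOPS = 197   # BF16 peak of TPU v5e, per the article
TRILLIUM_SPEEDUP = 4.7      # Google's claimed per-chip gain

trillium_peak = TPU_V5E_PEAK_TFLOPS * TRILLIUM_SPEEDUP
print(f"Trillium theoretical BF16 peak: {trillium_peak:.1f} TFLOPS")  # ~925.9
```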

Google has shared several real-world AI benchmarks to highlight Trillium’s improvements. For example, text-to-image inference with Stable Diffusion XL is 3.1 times faster on Trillium than on the TPU v5e. Training the 27-billion-parameter Gemma 2 model is four times faster, while training the 175-billion-parameter GPT-3 is about three times faster.

Trillium also features numerous enhancements, including double the HBM capacity of the TPU v5e, which had 16GB of HBM2. Google didn’t specify whether Trillium uses HBM3 or HBM3e, the newer memory found in Nvidia’s H200 and Blackwell GPUs; both offer considerably more bandwidth than HBM2.

Additionally, Trillium’s inter-chip communication has been doubled compared to the TPU v5e. Google’s infrastructure allows for the creation of supercomputers using tens of thousands of Trillium chips, with a technology called Multislice that distributes large AI workloads across multiple chips, ensuring high efficiency and uptime.
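Multislice itself is provisioned at the infrastructure level rather than in user code, but the programming pattern it scales up is ordinary sharding across a mesh of accelerator chips. Here is a minimal, generic JAX sketch of that pattern (not Google's Multislice API; device counts and shapes are illustrative):

```python
# Generic JAX sharding sketch -- illustrates the device-mesh pattern that
# Multislice extends across multiple TPU slices. Shapes are illustrative.
import jax
import jax.numpy as jnp
from jax.experimental import mesh_utils
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = jax.devices()  # all accelerator chips visible to this host
mesh = Mesh(mesh_utils.create_device_mesh((len(devices),)), axis_names=("data",))

# Shard a batch along the "data" axis so each chip holds one slice of it.
batch = jnp.ones((len(devices) * 128, 1024))
sharded = jax.device_put(batch, NamedSharding(mesh, P("data", None)))

@jax.jit
def forward(x):
    # Compiled once; the XLA partitioner runs it in parallel across the mesh.
    return jnp.tanh(x @ x.T)

out = forward(sharded)
print(out.shape, out.sharding)
```

Google's pitch for Multislice is that this kind of program can span tens of thousands of chips without the user hand-managing cross-slice communication.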

Trillium also benefits from third-generation SparseCores, a chip positioned near the high-bandwidth memory for more efficient AI processing.

Google’s Axion CPUs, designed to pair with Trillium, are now available for use in virtual machines (VMs) for AI inference. These ARM-based CPUs are offered in Google’s C4A VM instances and promise 65% better price-performance and up to 60% better energy efficiency than similar x86-based instances for tasks like web-serving, analytics, and database management.

However, it’s worth noting that more demanding workloads, such as large-scale databases and ERP applications, may still call for a more powerful x86 chip. Independent benchmark comparisons between Google Cloud Axion and x86 instances are available from Phoronix.
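For readers who want to try Axion, a minimal sketch of provisioning a C4A instance with the google-cloud-compute Python client follows; the project ID, zone, machine shape, and boot image below are placeholder assumptions, so check C4A regional availability and pricing before relying on them.

```python
# Hedged sketch: create an Axion-based C4A VM with the google-cloud-compute client.
# Project ID, zone, machine type, and image are illustrative placeholders.
from google.cloud import compute_v1

PROJECT = "my-project"   # placeholder project ID
ZONE = "us-central1-a"   # assumed zone with C4A capacity
MACHINE_TYPE = f"zones/{ZONE}/machineTypes/c4a-standard-4"  # 4-vCPU Axion shape

boot_disk = compute_v1.AttachedDisk(
    boot=True,
    auto_delete=True,
    initialize_params=compute_v1.AttachedDiskInitializeParams(
        # Arm64 image family to match the Axion (Arm) CPU architecture.
        source_image="projects/debian-cloud/global/images/family/debian-12-arm64",
        disk_size_gb=20,
    ),
)

instance = compute_v1.Instance(
    name="axion-inference-demo",
    machine_type=MACHINE_TYPE,
    disks=[boot_disk],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

client = compute_v1.InstancesClient()
operation = client.insert(project=PROJECT, zone=ZONE, instance_resource=instance)
operation.result()  # block until the VM is created
print(f"Created {instance.name} on {MACHINE_TYPE}")
```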

Nvidia’s H200 GPU is now available in Google Cloud’s A3 Ultra virtual machines, and Google has developed a direct connection between its hardware infrastructure and Nvidia’s hardware via high-speed networking. The core of this system is Titanium, a hardware interface designed to optimize workload, traffic, and security management.

Google has introduced a new Titanium ML network adapter, which builds on Nvidia’s ConnectX-7 hardware to support Virtual Private Clouds (VPCs), traffic encryption, and virtualization. Lohmeyer noted that while Titanium’s capabilities benefit AI infrastructure, the unique performance needs of AI workloads require special consideration for accelerator-to-accelerator communication.

The Titanium ML adapter creates a virtualization layer that allows Google Cloud to run a virtual private cloud environment while leveraging Nvidia’s hardware for AI workloads. However, it’s still unclear whether the Titanium ML interface will enable customers to switch between Google’s Trillium and Nvidia GPUs within the same AI workloads. Lohmeyer previously mentioned that this could be made possible through containers, though Google has yet to provide further details.

Nvidia’s hardware offers a proven blueprint for GPU-optimized offload systems, and Google has its own system for managing GPU workloads in its cloud. The Hypercomputer interface, for instance, includes a “Calendar” consumption model for scheduling tasks, along with a “Flex Start” model that guarantees task completion and delivery times.

Lastly, Google announced the Hypercluster, a system that enables customers to deploy predefined AI and HPC workloads with a single API call. This system automates network, storage, and compute management, which can often be complex to handle manually. Google is also adopting the open-source SLURM (Simple Linux Utility for Resource Management) scheduler to give customers more control over their HPC clusters, though further details on its integration into Hypercluster are yet to be revealed.
