GenAI made a big splash when ChatGPT launched on November 30, 2022. Since then, the drive for more powerful models has changed the way we approach hardware, data centers, and energy usage. The development of foundational models is still moving at a fast pace. One of the challenges in high-performance computing (HPC) and technical computing is figuring out where GenAI fits and, more importantly, what it means for future discoveries.

The major strain on resources so far has come from developing and training large AI models. The inference market, which involves deploying those trained models, will likely require different hardware and is expected to be much larger than the training market.

When it comes to HPC, there are some big questions. For example:

  • How can HPC benefit from GenAI?
  • How does GenAI fit in with traditional HPC tools and applications?
  • Can GenAI write code for HPC applications?
  • Can GenAI understand science and technology?

The answers to these questions are still being worked out, with many organizations, including the Trillion Parameter Consortium (TPC), exploring the role of GenAI in science and engineering.

One challenge with large language models (LLMs) is that they sometimes provide incorrect or misleading answers, often referred to as “hallucinations.” For example, when asked a basic chemistry question like “Will water freeze at 27 degrees F?” the answer given was clearly wrong (the correct answer is yes, since 27 degrees F is below water’s freezing point of 32 degrees F). If GenAI is going to be used in science and technology, the models need to be improved.
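One practical step toward “improving the models” for technical use is measuring them against questions with unambiguous answers. The snippet below is a minimal, hypothetical evaluation harness along those lines; the `ask_model` function, the questions, and the scoring rule are all illustrative assumptions, not any vendor’s API.

```python
# A minimal, hypothetical sketch of a factual-accuracy check for an LLM.
# `ask_model` is a placeholder for whatever model or API is under test.
def ask_model(question: str) -> str:
    raise NotImplementedError("wire this to the model being evaluated")

# Small set of questions with unambiguous ground-truth answers.
GROUND_TRUTH = {
    "Will water freeze at 27 degrees F?": "yes",            # 27 F is below 32 F
    "Is 27 degrees F below the freezing point of water?": "yes",
}

def score(answers: dict[str, str]) -> float:
    """Fraction of questions whose expected answer appears in the reply."""
    correct = sum(
        expected.lower() in answers[question].lower()
        for question, expected in GROUND_TRUTH.items()
    )
    return correct / len(GROUND_TRUTH)

# answers = {q: ask_model(q) for q in GROUND_TRUTH}
# print(f"accuracy: {score(answers):.0%}")
```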

So, what about using more data? The “intelligence” of early LLMs was improved by feeding them more data, which made the models larger and required more resources. Some benchmarks suggest that these models have gotten smarter, but there’s an issue: scaling models means finding more data. The internet has already been scraped for much of the data used, but the success of LLMs has also led to an increase in AI-generated content, such as news articles, summaries, and social media posts. Estimates suggest that about 10–15% of the internet’s textual content is now AI-generated, and by 2030, AI-generated content could make up more than half of it.

However, there’s a risk with this approach. When LLMs are trained on data generated by other AI models, their performance can degrade over time, a phenomenon known as “model collapse.” This could lead to a cycle where AI-generated content is continually used as input for future models, creating a feedback loop of poor-quality data.
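A toy numerical illustration of the idea (not a statement about any particular LLM): repeatedly refitting a simple Gaussian “model” to samples drawn from the previous generation’s fit makes the estimates drift and the variance tend to shrink, a crude analogue of collapse.

```python
import numpy as np

# Toy illustration of recursive training on model-generated data: each
# generation fits a Gaussian to samples drawn from the previous generation's
# fit. With finite samples, the mean drifts and the variance tends to shrink.
# Illustrative only; real model collapse in LLMs is far more subtle.
rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0      # the original "real" data distribution
n = 50                    # samples available per generation

for gen in range(1, 31):
    samples = rng.normal(mu, sigma, n)          # "content" produced by the last model
    mu, sigma = samples.mean(), samples.std()   # refit the next model on that content
    if gen % 5 == 0:
        print(f"generation {gen:2d}: mu={mu:+.3f}, sigma={sigma:.3f}")
```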

Recently, tools like OpenAI’s Deep Research and Google’s Gemini Deep Research have made it easier for researchers to create reports by suggesting topics and conducting research. These tools gather information from the web and generate reports, which then become part of the data used to train future models.

What about the data generated by HPC? HPC is already producing massive amounts of data. Traditional HPC focuses on crunching numbers for mathematical models, and this data is typically unique, clean, and accurate. It’s also highly tunable, meaning it can be shaped to fit different needs, and the possibilities for generating data are almost limitless.
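As a concrete, if tiny, sketch of what “tunable and almost limitless” means in practice, the code below sweeps the parameters of a toy damped-oscillator model and records clean input/output pairs. The model, parameter ranges, and integrator are illustrative choices, not any production workflow.

```python
import numpy as np

# Minimal sketch of simulation-generated training data: sweep the parameters
# of a damped oscillator, integrate it, and record (inputs -> output) pairs.
rng = np.random.default_rng(1)

def displacement_at(t_end, omega, zeta, dt=1e-3):
    """Integrate x'' + 2*zeta*omega*x' + omega**2 * x = 0 with explicit Euler."""
    x, v = 1.0, 0.0
    for _ in range(int(t_end / dt)):
        a = -2.0 * zeta * omega * v - omega**2 * x
        x, v = x + v * dt, v + a * dt
    return x

# Generate as many labeled examples as we care to compute.
params = np.column_stack([rng.uniform(1.0, 5.0, 200),    # natural frequency
                          rng.uniform(0.05, 0.9, 200)])  # damping ratio
labels = np.array([displacement_at(2.0, w, z) for w, z in params])
print(params.shape, labels.shape)   # (200, 2) (200,)
```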

A good example of this is Microsoft’s Aurora, a weather model built on vast amounts of meteorological and climatic data. Aurora was trained on more than a million hours of data and produces forecasts far faster than traditional numerical weather models. Aurora’s research also shows that training AI models on a diverse range of data improves their accuracy, with training datasets ranging from hundreds of terabytes to a petabyte in size.

In the realm of science and engineering, we already work with numbers, vectors, and matrices, so the goal isn’t to predict words, as LLMs do, but to predict numbers using Large Quantitative Models (LQMs). Building LQMs is more complex than building LLMs: it requires a deep understanding of the system being modeled, access to large datasets, and sophisticated computational tools. LQMs can be applied across industries such as life sciences, energy, and finance to simulate scenarios and predict outcomes more quickly than traditional models.
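As a rough sketch of the “predict numbers, not words” interface, the snippet below fits a small least-squares surrogate to numeric parameter/output pairs like those generated in the earlier sketch. The target function is a stand-in for real simulation output; an actual LQM would be vastly larger and physics-informed.

```python
import numpy as np

# Rough sketch of a numeric surrogate: fit a quadratic least-squares model
# to (parameters -> simulated quantity) pairs. The target below is a
# placeholder for real simulation output, chosen only for illustration.
rng = np.random.default_rng(2)

X = rng.uniform([1.0, 0.05], [5.0, 0.9], size=(200, 2))   # (omega, zeta) samples
y = np.exp(-X[:, 0] * X[:, 1]) * np.cos(2.0 * X[:, 0])    # placeholder "simulation"

def features(X):
    """Quadratic feature map: [1, w, z, w*z, w^2, z^2]."""
    w, z = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(w), w, z, w * z, w**2, z**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
rmse = np.sqrt(np.mean((features(X) @ coef - y) ** 2))
print(f"training RMSE of the surrogate: {rmse:.3f}")
```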

Data management remains a challenge. Training a model may sometimes be computationally feasible without GPUs, but it is nearly impossible without proper storage and data management. A large portion of the time spent on data science projects goes into managing and processing data rather than running the models themselves.
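A tiny, hypothetical example of where that time goes: before any model sees the data, simulation outputs typically have to be deduplicated, filtered, and normalized. The file and column names below are invented purely for illustration.

```python
import pandas as pd

# Hypothetical pre-training cleanup for a table of simulation results.
# File and column names are invented for illustration.
df = pd.read_csv("simulation_runs.csv")           # e.g. one row per completed run
df = df.drop_duplicates(subset=["run_id"])        # drop repeated runs
df = df.dropna(subset=["omega", "zeta", "x_t2"])  # drop incomplete records
df = df[df["x_t2"].abs() < 10.0]                  # discard diverged integrations

# Normalize inputs so the downstream model sees comparable scales.
for col in ("omega", "zeta"):
    df[col] = (df[col] - df[col].mean()) / df[col].std()

df.to_parquet("training_table.parquet")
```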

In AI, there’s a concept known as the “Virtuous Cycle,” proposed by Andrew Ng. This cycle explains how AI companies use data generated from user activity to improve their models, which then attracts more users, generating even more data. This creates a self-reinforcing loop that accelerates progress.

A similar cycle exists in scientific and technical computing, where HPC, big data, and AI work together in a feedback loop. Scientific research generates vast amounts of data, which feeds AI models that analyze patterns and make predictions. These predictions can then lead to new research and the cycle continues.
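One way to make the loop concrete is a toy recurrence with made-up rates: more data improves model skill (with diminishing returns), and more skillful models make experiments more productive, which in turn produces more data. Every number below is invented to show the shape of the feedback, nothing more.

```python
# Toy recurrence for the feedback loop described above; all rates and units
# are made up purely to illustrate the self-reinforcing shape of the cycle.
data, skill = 1.0, 0.0            # arbitrary starting "volume" and "skill"
for year in range(1, 11):
    skill = min(1.0, 0.5 * data ** 0.3)   # diminishing returns on data volume
    data += 1.0 + 5.0 * skill             # better models -> more productive experiments
    print(f"year {year:2d}: data={data:6.1f}, skill={skill:.2f}")
```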

As the Virtuous Cycle accelerates, it’s driving advancements in science and technology, but there are challenges to consider. The increasing demand for data and computational power raises questions about resource sustainability. There’s also the risk that the cycle might eventually “eat its own tail,” where the increasing demand for data could overwhelm the system.

The new era of HPC will likely be built on LLMs, LQMs, and other AI tools that use data models derived from both numerical and real-world data. As the cycle accelerates, the role of Big Data and quantum computing will become even more essential for training the next generation of models.

Despite the progress, questions and challenges remain. The growing demand for resources will put pressure on sustainability efforts, and the effectiveness of the Virtuous Cycle will depend on whether it can keep generating value without becoming self-limiting.
