The Great 8-bit Controversy in Artificial Intelligence

Artificial intelligence (AI) is becoming increasingly prevalent in our daily lives, with AI-powered products and services in high demand. This rise in popularity, especially for large language models like ChatGPT and image generation models like Stable Diffusion, has also brought greater attention to the computational and environmental costs associated with AI, particularly in the area of deep learning.

The cost of deep learning is driven primarily by three factors: the size and structure of the model, the processor it runs on, and how the data is represented. Over the years, AI models have grown larger, with their computational requirements doubling every 6-10 months. Processor performance has improved, but not fast enough to keep pace with the rising cost of these models. As a result, researchers are exploring ways to optimize data representation to reduce these costs. The choice of data type has significant implications for power consumption, accuracy, and throughput. However, there is no single best data type for AI, because the needs of the training and inference phases of deep learning differ.
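
To make the stakes concrete, the short sketch below computes how much memory a model's weights occupy at different precisions; the 7-billion-parameter figure is a hypothetical size used purely for illustration.

```python
# Back-of-the-envelope memory footprint for a hypothetical 7-billion-parameter
# model's weights at different precisions (activations and overhead ignored).
PARAMS = 7_000_000_000  # hypothetical size, for illustration only

for name, bits in [("FP32", 32), ("FP16/BF16", 16), ("FP8/INT8", 8)]:
    gigabytes = PARAMS * bits / 8 / 1e9
    print(f"{name:>9}: {gigabytes:.0f} GB")
# FP32: 28 GB, FP16/BF16: 14 GB, FP8/INT8: 7 GB
```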

Finding the Right Balance: Bit by Bit

One of the key methods for improving AI efficiency is data quantization. Quantization reduces the number of bits used to represent a model's weights, which makes the model smaller, speeds up computation, and lowers power consumption.
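
As a minimal sketch of the idea, the NumPy snippet below applies a simple symmetric, per-tensor INT8 scheme; production toolchains add zero-points, per-channel scales, and careful calibration.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric INT8 quantization: map float weights onto [-127, 127]."""
    scale = np.abs(w).max() / 127.0                       # one scale for the whole tensor
    w_q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return w_q, scale

def dequantize(w_q: np.ndarray, scale: float) -> np.ndarray:
    return w_q.astype(np.float32) * scale                 # approximate reconstruction

w = np.random.randn(256, 256).astype(np.float32)          # stand-in for trained weights
w_q, scale = quantize_int8(w)
print("max abs reconstruction error:", np.abs(w - dequantize(w_q, scale)).max())
```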

AI models are typically trained using 32-bit floating point (FP32) data. However, it has been found that not all 32 bits are necessary to maintain accuracy. Early success with 16-bit floating point (FP16) data types prompted a search for the minimum number of bits required to preserve accuracy. Google introduced the 16-bit brain float (BF16), and models being prepared for inference are often quantized to 8-bit floating point (FP8) or integer (INT8) data types. There are two main approaches to quantizing a neural network: Post-Training Quantization (PTQ) and Quantization-Aware Training (QAT). Both reduce numerical precision to improve efficiency, but they differ in when and how quantization is applied, which affects accuracy.
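
One way to see that not all 32 bits carry useful information is to cast weights down and measure the rounding error. The NumPy sketch below does this for FP16; BF16 and FP8 casts would follow the same pattern but require a dtype-extension package, since NumPy has no native support for them.

```python
import numpy as np

w32 = np.random.randn(1_000_000).astype(np.float32)   # stand-in FP32 "weights"
w16 = w32.astype(np.float16)                           # cast down to FP16

rel_err = np.abs(w32 - w16.astype(np.float32)) / np.abs(w32)
print("median relative rounding error:", np.median(rel_err))
# Typically on the order of 1e-4 -- well below the noise most trained networks tolerate.
```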

Post-Training Quantization (PTQ) occurs after a model has been trained with higher-precision data (e.g., FP32 or FP16). The model’s weights and activations are then converted to lower-precision formats like FP8 or INT8. While this method is simple to apply, it can lead to accuracy loss, especially when using very low-precision formats, as the model hasn’t been trained to handle the errors that arise from quantization.
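
A minimal PTQ sketch in NumPy, assuming a single linear layer and a small calibration set (both invented for illustration); real PTQ pipelines perform the same calibrate-then-convert steps with far more machinery.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(128, 64)).astype(np.float32)       # trained FP32 weights
calib = rng.normal(size=(512, 64)).astype(np.float32)   # small calibration batch

# 1. Calibration: pick scales from the ranges observed after training.
w_scale = np.abs(W).max() / 127.0
a_scale = np.abs(calib).max() / 127.0

# 2. Conversion: store weights as INT8; quantize activations at run time.
W_q = np.clip(np.round(W / w_scale), -127, 127).astype(np.int8)

def linear_int8(x):
    x_q = np.clip(np.round(x / a_scale), -127, 127).astype(np.int8)
    acc = x_q.astype(np.int32) @ W_q.astype(np.int32).T   # integer matmul, int32 accumulate
    return acc.astype(np.float32) * (a_scale * w_scale)   # rescale back to float

# The model never saw these rounding errors during training, hence the accuracy risk.
print("mean abs error vs FP32:", np.abs(calib @ W.T - linear_int8(calib)).mean())
```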

Quantization-Aware Training (QAT) incorporates quantization into the training process, allowing the model to adapt to the reduced precision. During training, both the forward and backward passes simulate quantized operations, adjusting the model to handle the reduced precision better. QAT generally results in better accuracy than PTQ, but it requires modifications to the training process, making it more complex to implement.
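
The core QAT trick can be sketched in a few lines of PyTorch: "fake" quantization is applied in the forward pass, while the straight-through estimator lets gradients flow as if no rounding had happened. This is a simplified per-tensor version; frameworks ship ready-made fake-quantization modules for real use.

```python
import torch

def fake_quant(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Simulate INT8 rounding in the forward pass; pass gradients straight through."""
    qmax = 2 ** (num_bits - 1) - 1                      # 127 for 8 bits
    scale = x.detach().abs().max() / qmax
    x_q = torch.clamp(torch.round(x / scale), -qmax, qmax) * scale
    return x + (x_q - x).detach()   # forward uses x_q, backward sees the identity

w = torch.randn(64, 64, requires_grad=True)   # FP32 master weights
x = torch.randn(8, 64)
out = x @ fake_quant(w).t()                   # training step sees the quantization error
out.sum().backward()                          # gradients still flow to the FP32 weights
print(w.grad.shape)
```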

The Ongoing 8-bit Debate

The AI industry has largely settled on two primary candidates for quantized data types: INT8 and FP8, with hardware vendors taking sides. In 2022, a paper from Graphcore and AMD proposed an IEEE standard for FP8, and Intel, Nvidia, and Arm later followed with their own proposals. Other companies, such as Qualcomm and Untether AI, have also explored FP8 and compared it to INT8. The debate over which data type is best for AI is far from settled: the choice between INT8 and FP8 usually depends on the specific hardware and model architecture, as well as performance and accuracy requirements.

Integer vs. Floating Point

The key difference between floating point and integer data types lies in how they represent numbers. Floating point types represent real numbers, including fractional values, in a form similar to scientific notation: a sign, a mantissa (significand), and an exponent.

Integer types, on the other hand, represent only whole numbers. This difference leads to significant variations in precision and range: floating point numbers span a much wider dynamic range, while integers cover a narrower range with fixed, uniform precision.
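
The contrast is easy to demonstrate: integer codes are evenly spaced, whereas floating point spends bits on an exponent, so the gap between neighbouring values grows with magnitude. The snippet below uses FP16 only because NumPy has no FP8 dtype; the pattern is the same for 8-bit floats.

```python
import numpy as np

# INT8 encodes 256 values from -128 to 127, all spaced exactly 1 apart
# (a deployment multiplies them by a scale factor, but the spacing stays uniform).
# Floating point trades that uniformity for range: np.spacing(x) gives the gap
# to the next representable number of the same dtype, and it grows with magnitude.
for x in [0.001, 1.0, 1000.0]:
    print(f"gap to the next FP16 value near {x:>7}: {np.spacing(np.float16(x))}")
```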

For Training: During training, the primary goal is to update the model’s parameters through optimization, which requires a higher dynamic range to accurately propagate gradients and ensure the model converges. For this reason, floating point representations like FP32, FP16, and even FP8 are typically used during training to maintain sufficient dynamic range.
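
A concrete illustration of why that dynamic range matters: small gradient values simply vanish in a narrow format, which is why FP16 training typically relies on tricks such as loss scaling, and why BF16, which keeps FP32's exponent range, is popular for training.

```python
import numpy as np

grad = np.float32(1e-8)          # a small but meaningful gradient value
print(np.float16(grad))          # 0.0 -- underflows: FP16 cannot represent it
print(np.float16(grad * 1024))   # ~1e-5 -- loss scaling pushes it back into range
# BF16 keeps FP32's 8 exponent bits, so the same value would survive that cast.
```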

For Inference: Inference focuses on efficiently evaluating the trained model on new data, with priorities on minimizing computational complexity, memory usage, and energy consumption. In this phase, lower-precision representations like INT8 or FP8 are more suitable, as they reduce the computational burden while still maintaining enough accuracy for real-time performance.

Which Data Type for Inference?

The best data type for inference depends on the application and the hardware used. For real-time and mobile applications, smaller 8-bit data types are often preferred because they reduce memory usage, processing time, and power consumption, while still providing enough accuracy.

FP8 is gaining popularity, with major hardware vendors and cloud service providers integrating it into their deep learning solutions. FP8 comes in several variants, defined by how its bits are split between exponent and mantissa, each trading dynamic range against precision. FP8 E3M4, with 3 exponent bits and 4 mantissa bits, has a smaller dynamic range but greater precision. FP8 E4M3 adds an exponent bit to extend the range, while FP8 E5M2 offers the widest dynamic range, making it well suited to training.
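
The tradeoff can be quantified directly from the bit layouts. The sketch below assumes a plain IEEE-style convention (bias = 2^(e-1) - 1, with the all-ones exponent reserved); deployed variants differ in detail, for example Nvidia's E4M3 reclaims some reserved encodings to reach a maximum of 448 rather than 240.

```python
# Largest normal value and relative precision for the FP8 layouts discussed above,
# assuming a plain IEEE-style convention (bias = 2**(e-1) - 1, top exponent reserved).
def fp8_properties(exp_bits: int, man_bits: int):
    bias = 2 ** (exp_bits - 1) - 1
    max_normal = (2 - 2 ** -man_bits) * 2 ** bias   # largest representable normal value
    rel_step = 2 ** -man_bits                       # spacing relative to magnitude
    return max_normal, rel_step

for name, e, m in [("E3M4", 3, 4), ("E4M3", 4, 3), ("E5M2", 5, 2)]:
    max_normal, rel_step = fp8_properties(e, m)
    print(f"FP8 {name}: max ≈ {max_normal:g}, relative step ≈ {rel_step:g}")
# E3M4: max ≈ 15.5    E4M3: max ≈ 240    E5M2: max ≈ 57344
```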

INT8, by comparison, has no exponent bits at all, which limits its dynamic range but devotes every bit to precision within that range. Whether FP8 or INT8 delivers better accuracy depends on the AI model, and power efficiency depends on the specific hardware: research from Untether AI suggests that FP8 offers better accuracy and performance, while Qualcomm found that INT8 can be more efficient on its hardware. Ultimately, the choice between FP8 and INT8 comes down to the hardware's capabilities and the specific needs of the model.
