
The Future of AI and Infrastructure: Insights from Dr. Ayesha Khanna, CEO of Addo AI

Blog | May 27, 2025

Where is AI heading, and how can businesses adapt to this technological transformation? This was one of the critical questions answered in the first webinar of our Built to be Bold series, featuring Dr. Ayesha Khanna, CEO and Co-founder of Addo AI, and Ethan Banks, Founder of Packet Pushers. The webinar explored how innovations in artificial intelligence are reshaping everything from enterprise operations to the data center landscape and global network infrastructure, even as the technology becomes increasingly accessible and affordable.

Keep reading for the key takeaways from the webinar and practical insights into the future of AI and infrastructure, including how AI adoption will impact your business and how to prepare for the AI-first economy.

AI Costs Are Decreasing and Adoption Is Expanding

One of the most exciting trends highlighted by Dr. Khanna is the rapid decrease in AI compute costs over the past 18 months. Thanks to competition among tech giants like NVIDIA, Google, and OpenAI, as well as the development of smarter algorithms and specialized hardware, AI has become significantly more affordable.

Lower costs allow more businesses (not just tech giants) to tap into the power of AI. Whether it’s a customer service chatbot or predictive analytics for inventory management, AI tools are becoming indispensable in helping businesses stay competitive.

Key Factors Driving Cost Reductions

  • Hardware Innovation: Specialized chips designed for tasks like inference computing are reducing computational overhead.
  • Smarter Mathematical Models: Models like DeepSeek are achieving complex outcomes at a fraction of the previous cost.
  • More Competition: Companies such as Anthropic, Microsoft, and Google are accelerating advancements to gain a competitive edge.

If you’ve been sitting on the sidelines, now might be the perfect time to explore AI-powered solutions for your organization.

The Critical Role of Network Infrastructure

AI doesn’t operate in a vacuum. Training and deploying modern AI models require robust network infrastructure to handle large volumes of data and ensure low latency. Dr. Khanna emphasized that businesses relying on AI often need to fine-tune these models with proprietary data stored in the cloud or across hybrid environments.

Why This Matters for Businesses:

  • Connectivity is key: Reliable, low-latency data transfer is essential for real-time AI applications, such as autonomous logistics or healthcare diagnostics.
  • Scalability: Legacy infrastructure may not support the needs of advanced AI tools, making network upgrades crucial.
  • Avoiding bottlenecks: Idle AI clusters can waste valuable compute resources and delay project timelines due to insufficient bandwidth.
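To see why bandwidth can idle an AI cluster, consider a rough back-of-the-envelope calculation. This is an illustrative sketch with hypothetical figures, not numbers from the webinar:

```python
# Illustrative estimate: time to move a training dataset over links of
# different speeds, assuming the link is fully utilized (no protocol
# overhead). Figures are hypothetical, for intuition only.

def transfer_hours(dataset_tb: float, link_gbps: float) -> float:
    """Hours to move dataset_tb terabytes over a link_gbps link."""
    bits = dataset_tb * 1e12 * 8          # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)    # bits / (bits per second)
    return seconds / 3600

for gbps in (1, 10, 100):
    print(f"10 TB over {gbps:>3} Gbps: {transfer_hours(10, gbps):.1f} hours")
```

At 1 Gbps a 10 TB dataset takes roughly a day to move; at 100 Gbps it takes minutes. That gap is exactly the difference between a busy GPU cluster and an idle one.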

Ethan Banks said it best, “If you’re assuming that you can just map AI as a new business application on top of your existing infrastructure, the best I can say is maybe, and it depends. You need to understand deeply what’s going on with AI and then understand what your network is capable of delivering to know that it has the characteristics required for effective delivery of artificial intelligence as a service for your business.”

For enterprises looking to fully harness AI’s potential, investing in robust, modern network infrastructure is no longer optional.

Democratizing AI Through Open-Source Models

If the idea of training massive AI models seems daunting, don’t worry. Open-source models and smaller language models are leveling the playing field.

Dr. Khanna explains two advantages of open-source models: “One, you’re not paying a per-token price to access it. The second advantage of something like DeepSeek is they’re using a different kind of architecture called Mixture of Experts (MoE), and they’re using some kind of gating phenomenon, which means they reduce the computational overhead and therefore the cost of using these models.”

Consider These Tools:

  • Llama from Meta: Open-source large language models designed with efficiency in mind.
  • DeepSeek: A cost-effective alternative that uses innovative architecture to lower resource consumption.
  • Smaller Models for Specific Use Cases: Businesses that only need domain-specific capabilities (like HR chatbots) can deploy smaller, pre-trained models with ease.

By leveraging open-source tools, organizations can skip the monumental effort of building AI systems from scratch. Instead, they can focus on fine-tuning these models to meet specific needs.
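The Mixture of Experts idea Dr. Khanna describes can be sketched in a few lines. This is a toy illustration, not DeepSeek's actual implementation: a gate scores the experts and only the top-scoring ones run on a given input, so compute scales with the number of selected experts rather than the total parameter count.

```python
import numpy as np

# Toy Mixture-of-Experts layer: a gate picks the top-k experts per
# input, so only k of N expert matrices actually execute.
# Purely illustrative; real MoE architectures are far more sophisticated.

rng = np.random.default_rng(0)
N_EXPERTS, DIM, TOP_K = 8, 4, 2

# Each "expert" is just a small weight matrix here.
experts = [rng.standard_normal((DIM, DIM)) for _ in range(N_EXPERTS)]
gate_w = rng.standard_normal((DIM, N_EXPERTS))

def moe_forward(x: np.ndarray) -> np.ndarray:
    scores = x @ gate_w                    # one gate score per expert
    top = np.argsort(scores)[-TOP_K:]      # indices of the top-k experts
    weights = np.exp(scores[top])
    weights /= weights.sum()               # softmax over the selected experts
    # Only TOP_K of N_EXPERTS experts run; the rest stay idle:
    return sum(w * (x @ experts[i]) for i, w in zip(top, weights))

y = moe_forward(rng.standard_normal(DIM))
print(y.shape)
```

The gating is what cuts the computational overhead: the model keeps a large pool of parameters, but any single request touches only a fraction of them.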

Government Subsidies and Chip Diplomacy

While businesses are taking the lead in adopting AI, governments also play a crucial role in ensuring accessibility. Subsidies and international agreements surrounding AI hardware (often referred to as chip diplomacy) are helping developing nations access cutting-edge technologies.

Strategic Data Storage and Edge Computing

Efficient storage solutions are vital when dealing with the vast datasets required for AI systems. Edge computing, where data is processed closer to its source, in conjunction with cold storage for less critical information, can significantly lower costs and reduce latency.

Storage Strategies for AI Success:

  • Cold Storage: Ideal for archiving vast amounts of data that aren’t in constant use.
  • Edge Computing: Reduces the distance data must travel, improving response times for critical applications.
  • Hybrid Cloud Architectures: Combine on-site and cloud solutions for better cost management and scalability.
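One simple way to reason about the tiers above is a rule that routes data by latency sensitivity and access recency. This is a hypothetical policy sketch with made-up thresholds, not a product recommendation:

```python
from datetime import datetime, timedelta

# Hypothetical tiering rule for the storage strategies above:
# latency-critical data is processed at the edge, recently used data
# stays in hot cloud storage, and everything else is archived cold.
# The 30-day threshold is an illustrative assumption.

def storage_tier(last_access, latency_critical, now=None):
    now = now or datetime.now()
    if latency_critical:
        return "edge"                 # process close to the source
    if now - last_access < timedelta(days=30):
        return "hot-cloud"            # frequently used working set
    return "cold"                     # cheap archival storage

now = datetime(2025, 5, 27)
print(storage_tier(now - timedelta(days=2), False, now))    # hot-cloud
print(storage_tier(now - timedelta(days=400), False, now))  # cold
print(storage_tier(now - timedelta(days=400), True, now))   # edge
```

A real policy would also weigh retrieval cost and compliance requirements, but even a crude rule like this keeps expensive hot storage reserved for the data that earns it.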

Strategic storage planning is crucial for businesses looking to balance performance and expense.

Consistent AI Model Performance

AI models aren’t static. They need regular updates, whether it’s incorporating new data or adapting to updated base frameworks. Without retraining or fine-tuning, older models can become less accurate over time.

How to Maintain Model Performance:

  1. Update Base Models: Adopt the newest versions of models while preserving key features of older ones.
  2. Consistency Checks: Validate newer models against your business requirements to ensure accuracy.
  3. Iterative Training: Regularly integrate updated datasets to keep models robust and relevant.
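The consistency-check step above can be as simple as a regression suite of business-critical prompts that a candidate model must pass before replacing the current one. Here is a minimal sketch, with plain functions standing in for real model inference and hypothetical test cases:

```python
# Minimal pre-rollout consistency check: promote a candidate model only
# if it matches expected answers on business-critical cases at least as
# well as the incumbent. Models are stubbed as plain functions here; in
# practice these would be real inference calls.

test_cases = [
    ("What is our refund window?", "30 days"),
    ("Which plan includes support?", "premium"),
]

def accuracy(model, cases):
    hits = sum(expected.lower() in model(q).lower() for q, expected in cases)
    return hits / len(cases)

def approve_rollout(candidate, incumbent, cases):
    """Promote the candidate only if it does at least as well."""
    return accuracy(candidate, cases) >= accuracy(incumbent, cases)

# Hypothetical stub models, for illustration only:
incumbent = lambda q: "Refunds within 30 days" if "refund" in q else "n/a"
candidate = lambda q: ("Refunds within 30 days" if "refund" in q
                       else "Support is in the premium plan")

print(approve_rollout(candidate, incumbent, test_cases))  # True
```

Running this gate on every base-model update turns "consistency checks" from a vague aspiration into an automatic, repeatable step in the deployment pipeline.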

Consistency is the key to leveraging long-term value from AI systems in a constantly evolving landscape.

Preparing for the Future of AI Infrastructure

The demand for data centers and AI-specific infrastructure will skyrocket as adoption grows. Building superclusters and mega data centers, along with adopting renewable energy sources like modular nuclear reactors, will be crucial for meeting these needs.

What Lies Ahead:

  1. Data Center Expansion: More facilities are required to accommodate increased computational power.
  2. Sustainable Power: Data centers are expected to consume significant energy. Nuclear power may address future demands.
  3. Advanced Network Capabilities: Engineers must adopt ultra-fast, low-latency networks to support distributed AI systems.

Building AI-First Enterprises

AI has moved beyond the realm of possibility into tangible applications, reshaping industries globally. Whether you’re a small startup leveraging open-source tools or a multinational corporation fine-tuning proprietary models, now is the time to evaluate your network infrastructure and invest in the future.

For more insights and actionable strategies, access our full webinar here.

Curious about how AI, innovation, and bandwidth are reshaping business?

Register for our Built to be Bold webinar series.