Tricentis unveiled its vision for the future of AI-powered quality engineering: a unified AI workspace and agentic ecosystem that brings together the company’s portfolio of AI agents, Model Context Protocol (MCP) servers, and AI platform services into a centralized hub for managing quality at the speed and scale of modern innovation.
Mirantis announced the availability of Mirantis OpenStack for Kubernetes (MOSK) 25.2, which simplifies cloud operations and strengthens support for GPU-intensive AI workloads as well as traditional enterprise applications.
MOSK 25.2 provides support for disconnected operations, simplified scale-out networking, and updates tailored for GPU-intensive and hybrid deployments.
"AI workloads mean big changes to general-purpose compute infrastructure," said Artem Andreev, Senior Engineering Manager, Mirantis. "With the latest MOSK, organizations can scale GPU-powered workloads, along with the ability to support secure, disconnected operations that don’t sacrifice openness or flexibility."
MOSK 25.2 makes it possible to run OpenStack clouds entirely offline in environments where Internet access is prohibited. This supports sectors such as finance, government, and defense, where every artifact must be scanned and approved before entering the datacenter. Disconnected operations let organizations keep pace with upstream innovation over time while preserving complete control of sensitive data, a key requirement for AI model training and data sovereignty.
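As a rough illustration of the kind of gate such air-gapped environments depend on, the minimal Python sketch below checks that every configured artifact source resolves to an approved internal mirror rather than the public Internet. This is not a MOSK API; the hostnames, URLs, and function names are hypothetical and shown only to make the "nothing enters the datacenter unvetted" workflow concrete.

```python
from urllib.parse import urlparse

# Hypothetical allow-list of internal mirrors for an air-gapped deployment.
ALLOWED_MIRRORS = {"mirror.datacenter.local", "registry.datacenter.local"}


def check_offline_sources(artifact_urls):
    """Return the URLs that point outside the approved internal mirrors."""
    violations = []
    for url in artifact_urls:
        host = urlparse(url).hostname or ""
        if host not in ALLOWED_MIRRORS:
            violations.append(url)
    return violations


if __name__ == "__main__":
    sources = [
        "https://mirror.datacenter.local/openstack/epoxy/images/cinder.tar",
        "https://registry.k8s.io/pause:3.9",  # external registry, would be flagged
    ]
    for bad in check_offline_sources(sources):
        print(f"blocked: {bad} is not served from an approved internal mirror")
```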
The release also delivers significant advances in networking, core platform components, observability, and user features, including:
- OpenStack 2025.1 “Epoxy” is supported for new deployments and upgrades from 2024.1 “Caracal”.
- Open Virtual Network (OVN) 24.03 delivers performance improvements and the latest security patches, plus a clear, validated path to move off Open vSwitch (OvS)—the long-used networking backend in OpenStack—toward a more modern and scalable model. As an alternative, OpenSDN 24.1 is available with a modernized codebase and expanded IPv6 capabilities.
- Scale-out networking & proactive network health — full L3 networking on bare metal to scale across racks without VLAN stretch, plus network infrastructure monitoring with connectivity checks and alerts to catch switch/routing issues early.
- Hybrid AI infrastructure (VMs + Bare Metal) — features that make AI clouds easier to run, including the ability to recover bare-metal GPU servers even if networking breaks and to attach them to the right project networks alongside VMs for high-performance training (a minimal provisioning sketch follows this list).
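To make the hybrid point concrete, here is a minimal sketch using the upstream openstacksdk Python library to place a GPU VM and a bare-metal GPU node on the same project network. The cloud entry, network, image, and flavor names are assumptions; the flavors actually exposed for virtual and bare-metal GPU instances depend on how a given cloud is configured.

```python
import openstack

# Connect using a named cloud from clouds.yaml; "mosk" is an assumed entry.
conn = openstack.connect(cloud="mosk")

# Shared project network for both virtual and bare-metal instances (assumed name).
net = conn.network.find_network("training-net")
image = conn.image.find_image("ubuntu-22.04")

# Flavor names are illustrative: one virtualized GPU flavor, one Ironic bare-metal flavor.
vm_flavor = conn.compute.find_flavor("gpu.medium")
bm_flavor = conn.compute.find_flavor("baremetal.gpu")

# GPU VM for experimentation and supporting services.
vm = conn.compute.create_server(
    name="gpu-vm-01",
    image_id=image.id,
    flavor_id=vm_flavor.id,
    networks=[{"uuid": net.id}],
)

# Bare-metal GPU node for high-performance training, attached to the same project network.
bm = conn.compute.create_server(
    name="gpu-bm-01",
    image_id=image.id,
    flavor_id=bm_flavor.id,
    networks=[{"uuid": net.id}],
)

conn.compute.wait_for_server(vm)
conn.compute.wait_for_server(bm)
print(f"VM {vm.name} and bare-metal node {bm.name} share network {net.name}")
```

Because both instances land on the same project network, a training job on the bare-metal node can reach supporting services running in the VM without extra gateway hops.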
MOSK continues to enable enterprises to run on-premises private clouds for both cloud-native and traditional workloads with reliability, automation, and complete control over application data. The platform manages the full lifecycle of infrastructure—from bare-metal provisioning to software configuration—while providing centralized logging, monitoring, and alerting.
Industry News
Kong announced new support to help enterprises adopt and scale MCP and agentic AI development.
Copado unveiled new updates to its Intelligent DevOps Platform for Salesforce, bringing AI-powered automation, Org Intelligence™, and a new Model Context Protocol (MCP) integration framework that connects enterprise systems and grounds AI agents in live context without silos or duplication.
Xray announced the launch of AI-powered testing capabilities, a new suite of human-in-the-loop intelligence features powered by the Sembi IQ platform.
Redis announced the acquisition of Featureform, a framework for managing, defining, and orchestrating structured data signals.
CleanStart announced the expansion of its Docker Hub community of free, vulnerability-free container images, which now surpasses 50 images, each refreshed daily to give developers access to current container builds.
The Cloud Native Computing Foundation® (CNCF®), which builds sustainable ecosystems for cloud native software, announced the graduation of Knative, a serverless, event-driven application layer on top of Kubernetes.
Sonatype announced the launch of Nexus Repository in the cloud, a fully managed SaaS version of its artifact repository manager.
Spacelift announced Spacelift Intent, a new agentic, open source deployment model that enables the provisioning of cloud infrastructure through natural language without needing to write or maintain HCL.
IBM announced a strategic partnership with Anthropic to accelerate the development of enterprise-ready AI by infusing Anthropic’s Claude, one of the world’s most powerful families of large language models (LLMs), into IBM’s software portfolio, delivering measurable productivity gains while building security, governance, and cost controls directly into the software development lifecycle.
The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced its intent to launch the React Foundation.
Appvance announced a new feature in its AIQ platform: automatic generation of API test data and scripts directly from OpenAPI specifications using generative AI.
Cycloid released a new Model Context Protocol (MCP)-compliant server that can undertake a range of platform actions, allowing users to interact with the platform in natural language via an LLM (large language model).
The Adaptavist Group announced the acquisition of D|OPS Digital, a DevSecOps consultancy that increases the efficiency and speed of software delivery.