Researchers embrace Arm's celebrated big.LITTLE paradigm for a universal generative AI processor: the MEGA.mini core architecture
Mega and Mini cores in perfect harmony for generative AI

- New dual-core MEGA.mini architecture boosts performance while saving energy
- Dynamic core allocation optimizes workloads
- Mega cores for complex tasks and Mini cores for routine processing
At February 2025's International Solid-State Circuits Conference (ISSCC), researchers unveiled a new MEGA.mini architecture.
Inspired by Arm's celebrated "big.LITTLE" paradigm, this universal generative AI processor, detailed in the conference paper 'MEGA.mini: A Universal Generative AI Processor with a New Big/Little Core Architecture for NPU', promises a new approach to neural processing unit (NPU) design.
Arm's big.LITTLE architecture has long been a staple of efficient mobile and embedded systems, balancing high-performance cores with energy-efficient ones to optimize power usage. The MEGA.mini project seeks to bring a similar dual-core philosophy to NPUs, which are essential for running AI models efficiently.
MEGA.mini: A game-changing NPU design
This approach will likely involve pairing high-capacity "Mega" cores for demanding tasks with lightweight "Mini" cores for routine processing. The primary goal of this design is to optimize power consumption while maximizing processing capabilities for various generative artificial intelligence (AI) tasks, ranging from natural language generation to complex reasoning.
Generative AI workloads, like those powering large language models or image synthesis systems, are notoriously resource-intensive. MEGA.mini's architecture aims to delegate complex tasks to Mega cores while offloading simpler operations to Mini cores, balancing speed and power efficiency.
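The paper does not publish the details of how work is split between the two core classes, but the delegation idea can be sketched with a toy dispatcher. The task names, cost estimates, and threshold below are illustrative assumptions, not MEGA.mini's actual scheduling policy:

```python
# Hypothetical sketch of MEGA.mini-style task delegation.
# Assumption: a simple cost threshold separates "Mega" work
# (heavy compute) from "Mini" work (lightweight processing).

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    flops: float  # estimated floating-point operations for the task

MEGA_THRESHOLD = 1e9  # assumed cutoff: heavier tasks go to Mega cores

def dispatch(task: Task) -> str:
    """Route a task to a core class based on its estimated cost."""
    return "mega" if task.flops >= MEGA_THRESHOLD else "mini"

tasks = [
    Task("attention_matmul", 4e10),  # large matrix multiply -> Mega
    Task("tokenize_prompt", 2e5),    # lightweight pre-processing -> Mini
]
routing = {t.name: dispatch(t) for t in tasks}
```

A real scheduler would also weigh queue depth, data locality, and power state, but the core idea is the same: match each operation's cost to the cheapest core class that can handle it.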
MEGA.mini also functions as a universal processor for generative AI. Unlike traditional processors that require customization for specific AI tasks, MEGA.mini is being developed so that developers can leverage the architecture for different use cases, including natural language processing (NLP) and multimodal AI systems that integrate text, image, and audio processing.
It also optimizes workloads, whether running massive cloud-based AI models or compact edge AI applications, assisted by its support for multiple data types and formats, from traditional floating-point operations to emerging sparsity-aware computations.
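To illustrate what "sparsity-aware computation" buys, here is a conceptual sketch of a dot product that skips zero weights entirely, the kind of saving sparsity-aware NPU datapaths target. This is a software analogy for the hardware idea, not the MEGA.mini implementation:

```python
# Illustrative sparsity-aware multiply-accumulate: zero weights are
# skipped, so pruned (mostly-zero) models need fewer operations.

def sparse_dot(weights, activations):
    """Dot product that skips zero weights; returns (value, MACs performed)."""
    total = 0.0
    macs = 0  # multiply-accumulates actually executed
    for w, a in zip(weights, activations):
        if w != 0.0:
            total += w * a
            macs += 1
    return total, macs

w = [0.0, 2.0, 0.0, 0.0, 1.5]  # 60% zeros, common in pruned models
x = [1.0, 3.0, 4.0, 5.0, 2.0]
value, macs_done = sparse_dot(w, x)
# a dense datapath would perform 5 MACs here; only 2 are needed
```

In hardware, the same principle lets a sparsity-aware unit gate off multipliers for zero operands, cutting both latency and energy on pruned models.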
This universal approach could simplify AI development pipelines and improve deployment efficiency across platforms, from mobile devices to high-performance data centers.
The introduction of a dual-core architecture to NPUs is a significant departure from conventional designs — traditional NPUs often rely on a monolithic structure, which can lead to inefficiencies when processing varied AI tasks.
MEGA.mini's design addresses this limitation by creating cores specialized for specific types of operations. Mega cores are engineered for high-performance tasks like matrix multiplications and large-scale computations, essential for training and running sophisticated large language models (LLMs), while Mini cores are optimized for low-power operations such as data pre-processing and inference tasks.
Efosa has been writing about technology for over 7 years, initially driven by curiosity but now fueled by a strong passion for the field. He holds both a Master's and a PhD in sciences, which provided him with a solid foundation in analytical thinking. Efosa developed a keen interest in technology policy, specifically exploring the intersection of privacy, security, and politics. His research delves into how technological advancements influence regulatory frameworks and societal norms, particularly concerning data protection and cybersecurity. Upon joining TechRadar Pro, in addition to privacy and technology policy, he is also focused on B2B security products. Efosa can be contacted at this email: udinmwenefosa@gmail.com