3.5 GB of RAM “Locked” for Gemini
According to a report by Android Authority, about 3.5 GB of the Pixel 10’s total RAM is partitioned off and permanently allocated to the “AI Core” service and the AI processor (TPU) inside the Tensor G5 chip.
The rationale behind this decision is straightforward: AI models, even “lightweight” ones like Gemini Nano, are bulky and slow to load into memory. By keeping the model permanently pre-loaded, Google ensures that all its AI features (like live translation, “Magic Cue”, etc.) launch almost instantaneously, providing a very smooth user experience.
More Responsiveness for AI, Less RAM for Apps
This strategic choice comes with a clear trade-off. Out of the advertised 12 GB of RAM, only 8.5 GB are actually available for the operating system and user applications.
For everyday use, this is still more than adequate. However, for “power users” who frequently run multiple demanding apps or games, this limitation might be noticeable. This is a significant shift from the Pixel 9, where only the Pro version (with 16 GB of RAM) had this kind of partitioning.
If you’re not interested in the AI capabilities on your Pixel 10, there is still a way to unlock the RAM. To disable the AI Core service on your phone, go to Settings, then Apps, and select All Apps. In the top right corner, tap the three dots and select Show System. Then find AI Core, open it, and disable the service.
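For those comfortable with a terminal, the same toggle can also be applied over adb. This is a sketch under two assumptions: USB debugging is enabled on the phone, and the service ships under the package name `com.google.android.aicore` (confirm the exact name on your device before running anything). The commands are built as strings first so they can be reviewed; uncomment the last line to actually execute the change.

```shell
#!/bin/sh
# Assumption: the AI Core service's package name is com.google.android.aicore.
# Verify first with:  adb shell pm list packages aicore
PKG="com.google.android.aicore"

# Build the commands so they can be reviewed before running.
DISABLE_CMD="adb shell pm disable-user --user 0 $PKG"  # disables for the current user only (reversible)
ENABLE_CMD="adb shell pm enable $PKG"                  # restores the service later

echo "$DISABLE_CMD"
# Uncomment once reviewed:
# $DISABLE_CMD
```

The `pm disable-user` route is reversible and does not require root, which makes it a safer experiment than uninstalling anything, and `pm enable` undoes it.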
What’s the Takeaway?
This technical decision highlights Google’s vision. By sacrificing some of the “raw” RAM to enhance the responsiveness of its services, Google asserts that the Pixel is now an “AI phone” before it’s a “smartphone”.
This approach to RAM management partly illustrates a philosophical difference with Apple. In Cupertino, tight hardware–software integration lets iOS operate efficiently with less memory: to run Apple Intelligence, every iPhone 16 model ships with 8 GB of RAM, deemed sufficient without “locking” any of it, thanks to unified, dynamic memory management. Rumors about the iPhone 17 suggest a bump to 12 GB, not to reserve a portion of it, but to handle even more complex AI models alongside other tasks. If confirmed, that would be the more versatile approach. What’s your preference: more RAM for multitasking, or faster AI responsiveness?