Intel and Nvidia show off how textures -- which take up a large chunk of PC game install sizes -- could be compressed to save you money ...
Intel is developing a new technology that can significantly reduce the size of game textures, helping save storage space and VRAM. This system works ...
NVIDIA researchers have proposed a neural compression method for material textures that enables random-access lookups and ...
NVIDIA showcases Neural Texture Compression at GTC 2026, cutting VRAM usage by up to 85% with real-time AI reconstruction.
A new compression technique from Google Research threatens to shrink the memory footprint of large AI models so dramatically ...
The artificial intelligence (AI) boom has been a powerful engine for the stock market, rewarding investors who targeted the companies building its foundation. Yet a sudden, sharp selloff recently hit ...
The biggest memory burden for LLMs is the key-value cache, which stores conversational context as users interact with AI chatbots. The cache grows as conversations lengthen, ...
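The KV-cache growth described above follows a standard sizing formula: the cache scales linearly with sequence length and with the model's layer and head counts. A minimal sketch, using illustrative Llama-7B-like dimensions that are assumptions, not figures from the source:

```python
def kv_cache_bytes(num_layers, num_kv_heads, head_dim, seq_len,
                   bytes_per_elem=2, batch=1):
    """Estimate KV-cache size: keys and values (hence the factor of 2)
    are stored per layer, per head, per token position."""
    return 2 * num_layers * num_kv_heads * head_dim * seq_len * bytes_per_elem * batch

# Hypothetical 7B-class config: 32 layers, 32 KV heads, head_dim 128, fp16.
size_4k = kv_cache_bytes(num_layers=32, num_kv_heads=32, head_dim=128, seq_len=4096)
print(size_4k / 2**30)  # 2.0 (GiB) at a 4K-token context

# Doubling the conversation length doubles the cache -- the growth the snippet describes.
size_8k = kv_cache_bytes(num_layers=32, num_kv_heads=32, head_dim=128, seq_len=8192)
print(size_8k / size_4k)  # 2.0
```

At long contexts this cache can exceed the model weights themselves, which is why a 20x reduction in conversation-history memory is significant.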
Intel TSNC brings neural texture compression with up to 18x reduction, faster decoding, and flexible SDK support for modern ...
Neural Texture Compression (NTC) could be a game-changer on par with DLSS if it can reduce the VRAM requirement for textures ...
Nvidia researchers have introduced a new technique that dramatically reduces how much memory large language models need to track conversation history — by as much as 20x — without modifying the model ...
In its "Tuscan Wheels" demo, the company showed VRAM usage dropping from roughly 6.5GB with traditional BCN-compressed ...
A more efficient method for using memory in AI systems could increase overall memory demand, especially in the long term.