Intel's introduction of its Sandy Bridge chip technology in January was a breakthrough for the computer industry. Suddenly, transferring HD video between devices wasn't an arduous affair, and the chip came with security protocols strong enough to impress even jaded studios, which agreed to begin offering HD versions of films online the same day they hit DVD and Blu-ray – a first for the entertainment industry.

To quote Bachman Turner Overdrive, though, you ain't seen nothin' yet.

Both Intel and IBM are working on new chip technologies that could once again shake up the chipset business in the coming years.

Now, look, I realize chipset technologies aren't exactly sexy – but they're the guts that make everything else move forward. And what IBM and Intel have in the oven could let SPFX houses and other members of the filmmaking community take a quantum leap in their field.

Let's start with Intel, which is currently hard at work on its next chip, codenamed "Cloverview". Planned to launch in conjunction with Windows 8 in the back half of 2012, this isn't a leap forward for desktop machines like Sandy Bridge was. Instead, the company is targeting tablets, netbooks and other smaller portable devices.

Right now, those systems tend to lack processing beef. But it's hoped that Cloverview will put them more on par with mid-level desktop systems.

Meanwhile, development work is already underway on "Ivy Bridge," the successor to the current second-generation Core processor. Those chips, reportedly, will support DirectX 11 graphics (the new high point for PCs), USB 3.0 and several other bleeding-edge technologies. They could be out as early as the first half of 2012.

At IBM, the advancements are perhaps even more seismic. Engineers there have announced the invention of a new type of memory that reads and writes 100 times faster than flash. On top of that, its lifespan is significantly longer than flash. That could lead to radical advancements in everything from media players to cell phones to DVRs to enterprise storage systems.

What does it mean for the average computer user? How about systems that boot up instantly? The ability to transfer data at speeds that make even the fastest chips today look downright pokey?

And, for Hollywood, it might even speed up the CGI process, cutting the development time of animated and FX-heavy films.

The tech, called phase change memory, is unfortunately a bit further out. It'll likely be another five years before it sees the light of day. But it's always kind of exciting to know when a paradigm shift is looming.
