Brajeshwar

Ever wanted to take a Supercomputer for a test drive?

If you’ve ever wanted to take a supercomputer for a test drive, now is your chance. Run that probabilistic analysis. Try some brute-force code breaking. Conduct 3D nuclear-testing simulations. Or, if you’re more cosmopolitan, do some molecular dynamics simulations. No matter what your supercomputing needs are, Cycle Computing will get you there.

Check this out, it’s brilliant: Cycle Computing is a 20-employee company leveraging the cloud computing movement by timesharing virtual supercomputers out to small companies and individuals who would never otherwise have had access (read: funds) to such technology. Their own software builds virtual clusters in the cloud by lashing together as many as 50,000 processor cores from Amazon Web Services.
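To make that idea concrete, here is a minimal sketch of the underlying pattern: ask AWS for a large batch of instances and treat them as one cluster. This is not Cycle Computing’s actual software, and the AMI, instance type, and node count below are made-up placeholders.

```python
# Toy illustration of "lashing together" cloud processors into a cluster:
# request a batch of EC2 instances, then hand the node list to a scheduler.
# The AMI ID, instance type, and counts are placeholders, not real settings.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical cluster-node image
    InstanceType="c5.24xlarge",       # example many-core instance type
    MinCount=100,                     # ask for 100 nodes in one request
    MaxCount=100,
)

node_ids = [inst["InstanceId"] for inst in response["Instances"]]
print(f"Provisioned {len(node_ids)} nodes; a scheduler would now fan work out to them.")
```

The hard part, and the part Cycle Computing’s software actually handles, is scheduling work across those nodes and tearing the whole thing down the moment the job finishes, so you only pay for the hours you use.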

Cycle Computing’s timeshared virtual-supercomputer software was recently used by two small drug companies, Schrodinger and Nimbus Discovery, to run a series of simulations across more than 21 million chemical compounds in an attempt to find compounds that bind to a target protein. By using Cycle Computing’s software on timeshared Amazon data-center processors, the two companies were able to run twelve and a half years’ worth of calculations in under three hours.

All for less than $5,000 an hour. “This enables small companies and any researcher that has a grant to do science that they could never do before,” says Cycle Computing CEO Jason Stowe.

None of this is new, of course – just impressive – as Amazon Web Services’ cluster comes in 42nd in the TOP500 world ranking of supercomputers. What IS phenomenal about this is the speed and power now available at such a low price, a class of technology generally reserved for governments and rich Saudi oil sheiks. Even big-name pharmaceutical companies haven’t been able to process that quantity of information at those breakneck speeds.

“Today, any biotech startup can access something that would have cost $10-15 million to build yourself just a few years ago,” says Stowe, demonstrating the power of the product. “Last summer, we spun up a 30,000-core supercomputer cluster for a pharmaceutical client. We ran it for eight hours, using it for drug discovery, and then shut it down. The bill to the client was $8,500.”
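For a sense of scale, here is the back-of-the-envelope arithmetic implied by those quoted figures (assuming the numbers in the article are exact and ignoring any per-run overhead):

```python
# Rough arithmetic using only the figures quoted in this article.
cores = 30_000          # cluster size from the pharma example
hours = 8               # how long it ran
bill = 8_500            # dollars billed to the client

core_hours = cores * hours
print(f"{core_hours:,} core-hours for ${bill:,}")        # 240,000 core-hours
print(f"about ${bill / core_hours:.3f} per core-hour")   # about $0.035

# The drug-discovery run: ~12.5 years of serial work in under 3 hours.
serial_hours = 12.5 * 365 * 24      # roughly 109,500 single-core hours
wall_clock = 3
print(f"effective parallelism: about {serial_hours / wall_clock:,.0f}x")
```

In other words, a few cents per core-hour buys the equivalent of tens of thousands of machines working in parallel, which is the whole pitch.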

This marks a major shift in supercomputing, as ‘supercomputers’ are assembled virtually across multiple data centers, leveraging tens of thousands of processors to give start-ups the technology that puts them in the big leagues. This democratization of so-called ‘Big Data’ is both a playing-field leveler and a game changer.
