
OpenStack's no science project, but does 'need to be glued together'

Steven Condon

Ten years ago, Howard said, most researchers worked with a high-end PC and hoped there was enough disk space to run their applications.

Now, Howard says, researchers run their compute jobs against data stored in a national facility, get access to far more compute than is available in their local environment, and then pull their results back across the network.

Howard's history stretches back to Australia's early Internet and includes involvement in early OpenFlow development; three and a half years after the NCI first investigated OpenStack, he has an insider's view of how effective it is today.

That's a big thing for the NCI, because its scientific HPC is characterised by very big but relatively short-lived data flows, a large number of users, and facilities all over the country.

The drive towards commodity networking equipment, with vendors offering low-cost switches combined with an SDN control plane, offers significant savings, he said, but it's the sophistication of SDN that has demonstrated its importance to the NCI.

Rather than being a painfully difficult environment to use, Howard said, the NCI's attention with OpenStack has been on "trying to balance which features we make available at what time, how we take advantage of those features, and how to train the researchers to take the best advantage of that environment".
