2011

Engineering in the Cloud: From Simulation to Storage

“Minimal management” means that cloud providers take on much of the management burden: maintaining and upgrading the hardware, backing up the system, and providing APIs for programming, provisioning, and running applications. Management and maintenance become easier for users and programmers because once the provider updates the platform, every user inherits the update, and depending on the layer of cloud technology in use, the provider absorbs many of the IT, redundancy, uptime, and management issues. In addition, ongoing innovation is allowing the cloud to manage itself in many cases through automated routines and procedures.

Using the Cloud

Now that we understand a little about what the cloud is in the academic sense, let’s explore its uses in engineering. Given its mainframe heritage, one would be correct in assuming that cloud computing can tackle computationally complex problems by exploiting multiprocessing techniques. A high-fidelity simulation that might tie up a single PC for days, for instance, can finish fairly quickly in a correctly configured cloud setup.
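The divide-and-conquer idea behind that speedup can be sketched on a single machine. The snippet below, a minimal illustration rather than any particular cloud API, splits a Monte Carlo estimate of pi across worker processes with Python's standard multiprocessing module; a cloud setup applies the same pattern across many machines instead of many cores. All names here are illustrative.

```python
# Illustrative sketch: split a Monte Carlo simulation across worker
# processes -- the same pattern a cloud setup applies across machines.
import random
from multiprocessing import Pool

def sample_batch(n):
    """Count random points in the unit square that land inside the quarter-circle."""
    rng = random.Random()
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def estimate_pi(total_samples, workers=4):
    """Farm out equal batches of samples and combine the partial counts."""
    batch = total_samples // workers
    with Pool(workers) as pool:
        hits = sum(pool.map(sample_batch, [batch] * workers))
    return 4.0 * hits / (batch * workers)

if __name__ == "__main__":
    print(estimate_pi(1_000_000))  # roughly 3.14
```

Because each batch is independent, the work scales with the number of workers; the only serial steps are splitting the job and summing the results.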

On-demand storage is another popular use of the cloud. If you have virtually unlimited data storage and extreme computational power, why not run entire applications in the cloud and access them from anywhere in the world? In some cases, security concerns and lingering technical details remain limiting factors, but they are also sources of innovation.

The promise of cloud computing is the virtually unlimited scalability of the processor cycles at one’s disposal. Of course, not all intensive tasks can be parallelized, but in theory the cloud represents infinite open lanes of instruction crunching.
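The caveat about parallelization is quantified by Amdahl’s law (not stated in the original, but the standard result here): if a fraction p of a task can be parallelized, the speedup on n processors is at most 1 / ((1 − p) + p/n). A quick calculation shows how the serial portion dominates no matter how many cloud processors are available:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: best-case speedup when a fraction p of the work
    parallelizes perfectly across n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with a million processors, a 5% serial portion caps speedup near 20x:
print(amdahl_speedup(0.95, 1_000_000))  # just under 20
```

This is why “infinite open lanes” holds only in theory: the serial fraction of a task, not the processor count, sets the ceiling.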

In practice, companies are already seeing benefits. National Instruments is putting field-programmable gate array (FPGA) compilations in the cloud (Figure 1). In fact, the current beta program already allows users to simply select the NI Hosted Services compilation server and then use the LabVIEW FPGA tools in the same familiar way (Figure 2). The only difference is that compilation happens on an optimized, high-RAM dedicated computer in the cloud rather than on locally maintained servers or, worse, on the development computer itself. File transfers and status updates are handled in the background over high-security Web service connections to a set of other cloud machines that handle authentication, license checking, scheduling, and, of course, the specialized and computationally intensive work of an FPGA compile.
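The background workflow described above follows a common submit-and-poll pattern: upload a job, then check its status until the result is ready. The sketch below is a local stand-in for that pattern only; the class, method names, and fields are invented for illustration and are not NI’s actual Web service API.

```python
# Hypothetical sketch of a submit-and-poll cloud compile workflow.
# CompileService is a local stand-in, not a real remote API.
import itertools

class CompileService:
    """Stand-in for a remote compile server reached over a Web service."""
    def __init__(self):
        self._jobs = {}
        self._ids = itertools.count(1)

    def submit(self, source_files, token):
        # A real service would authenticate the token, check the license,
        # and queue the job on a dedicated high-RAM compile machine.
        job_id = next(self._ids)
        self._jobs[job_id] = {"status": "queued", "polls": 0}
        return job_id

    def status(self, job_id):
        # Simulate a job that moves from queued to compiling to done.
        job = self._jobs[job_id]
        job["polls"] += 1
        job["status"] = "done" if job["polls"] >= 3 else "compiling"
        return job["status"]

def wait_for_compile(service, files, token, max_polls=10):
    """Submit a compile job, then poll until it completes or times out."""
    job_id = service.submit(files, token)
    for _ in range(max_polls):
        if service.status(job_id) == "done":
            return job_id
    raise TimeoutError("compile did not finish in time")
```

From the developer’s perspective this loop runs in the background, which is why the tools feel unchanged: only the location of the heavy work moves.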

The cloud represents both something special and something familiar to engineers. They have been using server banks and mainframes since before Web 2.0 made “the cloud” a buzzword; even so, they should acknowledge, and indeed embrace, the never-before-seen levels of dynamic, on-demand, reconfigurable, ubiquitous computing resources the cloud provides.

This article was written by Rick Kuhlman, NI LabVIEW FPGA Product Manager at National Instruments, Austin, TX. For more information, visit http://info.hotims.com/34453-121.

Reference
1. NIST Computer Security Division, Computer Security Resource Center, Cloud Computing (http://csrc.nist.gov/groups/SNS/cloud-computing/)
