Hard drives, and SA-SCSI drives especially, face growing competition from a new breed of storage device: the solid-state drive (SSD).
An SSD stores data in solid-state memory — that is, nonvolatile flash memory chips — rather than on conventional hard disk platters. Today’s SSDs are large enough to be useful, and although not exactly economical, they have come down enough in price that they can enter the conversation when it comes to outfitting a new workstation.
The advantages of SSDs? There are several, including less noise and better reliability in the face of environmental issues like vibration. Unlike the HDD, the SSD has no moving parts. But the real motivation to choose an SSD is performance. More specifically, it’s about much lower latency: the time that elapses between asking the drive for data and receiving it. The SSD doesn’t necessarily offer a big benefit over hard drives in bandwidth — how quickly the data comes once it starts coming — but it eliminates the seek time of the hard drive’s head, delivering an indisputable advantage in access time. The downside is a glaring one: price.
Given the pluses and minuses, CAD users who have a slightly higher but not unlimited budget can entertain the option of SSDs in one of two ways. A combination of HDDs and SSDs in multiple drive bays — in particular, a smaller SSD with your OS installed paired with a large conventional disk drive for data — is very practical. Or choose a hybrid drive that combines the best of both worlds. This emerging technology is effectively a two-tiered memory device that implements its bulk storage on the cost-effective hard disk while implementing a much smaller, but much lower-latency cache on SSD. For frequently accessing reasonably sized chunks of data, you get the speed benefit of SSD without breaking the bank. Whereas an SSD currently commands ten times the price (or more) per gigabyte of a conventional 7,200-RPM HDD, the hybrid drive is a relative bargain at approximately twice the price (although the premium and the performance boost will vary by model).
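The hybrid-drive trade-off above can be reasoned about with simple expected-value arithmetic. The sketch below uses assumed round-number latencies (not measured figures for any particular product) to show why even a modest SSD cache hit rate slashes average access time:

```python
# Illustrative sketch, not a benchmark: a hybrid drive modeled as a
# small low-latency SSD cache in front of a high-latency HDD.
# Both latency figures are assumed round numbers for illustration.

SSD_LATENCY_MS = 0.1   # assumed flash random-access latency
HDD_LATENCY_MS = 12.0  # assumed seek + rotational latency, 7,200-RPM disk

def avg_access_ms(cache_hit_rate: float) -> float:
    """Expected access time when a fraction of reads hit the SSD cache."""
    return cache_hit_rate * SSD_LATENCY_MS + (1 - cache_hit_rate) * HDD_LATENCY_MS

for hit_rate in (0.0, 0.5, 0.9, 0.99):
    print(f"hit rate {hit_rate:>4.0%}: {avg_access_ms(hit_rate):6.2f} ms")
```

At a 90% hit rate — plausible when the cache holds the OS and frequently used files — the average access time drops by an order of magnitude, which is the whole appeal of the two-tier design.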
The bottom line on selecting storage: Buy a lot more than you think you need, especially if you’ve chosen a system that limits you to one or two drive bays.
I recently read an article by an Intel product manager on the need for “ECC” (error correction code) memory in CAD workstations. From the article: “Corrupted data can impact every aspect of your business, and worse yet you may not even realize your data has become corrupted. Error-correcting code (ECC) memory detects and corrects the more common kinds of internal data corruption.”
For some reason this triggered my memory of the sudden-acceleration Toyota Prius incident from 2010. The popular press latched on to the idea that cosmic rays were screwing with the electronics in the Prius. While theoretically possible, the probability of this was astronomically low. It did, however, make for a great story, and the FUD (fear, uncertainty, and doubt) caused Prius prices to plummet temporarily and sales to slow to a crawl.
Back to ECC memory and CAD systems. Is there really a need for ECC memory in CAD or is it just FUD marketing to upsell hardware and make products sound more valuable than they really are? I decided to do a little research.
Who needs ECC memory and what is its role in professional & CAD workstation computing?
Naturally occurring cosmic rays can and do cause problems for computers down here on planet Earth. Certain types of subatomic particles (primarily neutrons) can pierce through buildings and computer components and physically alter the electrical state of electronic components. When one of these particles interacts with a block of system memory, GPU memory, or other binary electronics inside your computer, it can cause a single bit to spontaneously flip to the opposite state. This can lead to an instantaneous error, the potential for incorrect application output, and sometimes even a total system crash. However, a single-bit error caused by a cosmic ray strike on your PC or workstation’s memory is theoretically fairly rare — only about once every 9 years per 8GB of RAM, according to recent data.
ECC technology — used both in system RAM and in devices such as high-end GPUs — can reliably detect and correct these errors, reducing the odds of memory corruption due to single-bit errors to about once every 45 years per 8GB of RAM. Of course, just like everything else in life, there are always tradeoffs: ECC memory is typically up to 10% slower and significantly more expensive than standard non-ECC memory.
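The detect-and-correct trick behind ECC is classic error-correcting coding. As a minimal illustration — not how DRAM ECC is implemented at the chip level, where SECDED codes typically protect 64-bit words — here is a Hamming(7,4) code in Python that locates and repairs any single flipped bit:

```python
# Minimal Hamming(7,4) sketch of the principle behind ECC memory:
# 4 data bits are protected by 3 parity bits, so any single flipped
# bit in the 7-bit codeword can be located and corrected.

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Return the codeword with any single-bit error fixed."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3       # 1-based index of flipped bit; 0 = clean
    if pos:
        c[pos - 1] ^= 1
    return c

word = encode([1, 0, 1, 1])
hit = list(word)
hit[4] ^= 1                  # simulate a cosmic-ray bit flip
assert correct(hit) == word  # the flip is located and reversed
```

The extra parity bits are the source of ECC’s cost and speed overhead: more bits must be stored and checked on every access.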
Because the odds of a cosmic ray strike increase in direct proportion to the physical amount of memory (and related components) inside a computer, this is a real concern for large-scale, clustered supercomputing and other environments where computing tasks often include high-precision calculation sets that can take days or even weeks to complete. In the case of supercomputer clusters, which often contain hundreds or even thousands of connected compute nodes and terabytes of memory, cosmic ray strikes on the system are much more likely — and much more costly. Restarting a week-long calculation on a supercomputer can cost a facility many tens of thousands of dollars in lost time, electricity, and manpower — not to mention lost productivity.
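Using the article’s own figure of roughly one flip per 9 years per 8GB of RAM, the scaling argument is easy to check with back-of-envelope arithmetic:

```python
# Back-of-envelope scaling, using the article's figure of roughly one
# bit flip per 9 years per 8 GB of RAM. The expected interval shrinks
# in proportion to memory size: why clusters care and desktops rarely do.

YEARS_PER_FLIP_PER_8GB = 9.0

def years_between_flips(ram_gb: float) -> float:
    """Expected years between single-bit flips for a given amount of RAM."""
    return YEARS_PER_FLIP_PER_8GB * 8.0 / ram_gb

for ram_gb in (8, 32, 8 * 1024):   # desktop, beefy workstation, small cluster
    days = years_between_flips(ram_gb) * 365
    print(f"{ram_gb:>6} GB: one expected flip every {days:,.1f} days")
```

At 8TB of aggregate cluster memory, the expected interval falls to a few days — short enough to threaten a week-long run, which is exactly the supercomputing scenario described above.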
But even for very beefy PC CAD workstation configurations with loads of RAM on board, you are probably not at imminent risk from problems caused by cosmic ray strikes and the resulting single-bit errors. Over the course of your work, you are much more likely to endure system crashes or application hangs due to failing components, power fluctuations, and software bugs than due to cosmic ray strikes. Additionally, many applications in the desktop design and engineering space can actually endure a single-bit error without negatively impacting the computing process or product. For example, if the color or brightness of a single pixel on a display monitor is changed due to this type of memory corruption on the system’s GPU, nobody will ever see or notice it. There are many such examples of this type of error not really impacting one’s everyday work.
That said, many leading technology manufacturers are equipping their high-end products with ECC memory for compute-heavy (especially clustered supercomputing) applications where the benefits of error-correcting memory outweigh the comparative speed and cost drawbacks. AMD, for example, has engineered its new FirePro W9000 and FirePro S9000 ultra-high-end GPU cards to include ECC memory, which can be selectively enabled by the end user for advanced computing purposes where rock-solid stability and protection from space rays is crucial.
Author: Tony DeYoung
Data-sensitive environments are no longer limited to defense agencies, power generation firms and related contractors, but include any enterprise that places a high value on its intellectual property. These environments require specific security precautions to ensure the security and integrity of sensitive and confidential information.
Security breaches can occur when documents are not disposed of properly. This is not only limited to hard copy documents but also electronic data. Unfortunately, simply deleting files isn’t enough. Threats to your data and secure information linger long after you delete a file.
The disk drive that is part of a large format printing device’s controller is used as a temporary repository for spooling and processing data (e.g. copy, scan and print jobs). These drives are susceptible to data remanence – the residual representation of data that remains even after the data is deleted – which can inadvertently make sensitive data available to unauthorized users.
The Importance of Electronic Shredding
Proper disposal of electronic data stored on a large format printer’s disk drive is imperative to preventing inadvertent disclosure of sensitive or confidential information. To prevent this from happening, specific security precautions should be integrated into all network devices.
Purchasing a large format device that is equipped with electronic data shredding (or e-shredding) functionality can help prevent recovery of previously printed, scanned and copied documents. With e-shredding technology, data is overwritten in such a way that makes it impossible to retrieve or reconstruct it. Print, copy and scan jobs sent to a large format print system enabled with this technology are completely overwritten and erased upon completion of the job. This can be particularly useful in decentralized walk-up environments where many different users have unregulated access to the system.
Electronic Shredding Options
Most systems that offer e-shredding functionality allow the administrator to select from a number of overwrite algorithms. Common algorithms used in the United States include:
- Gutmann: All jobs on the system are erased in 35 overwrite passes. An overwrite session consists of a lead-in of four random write patterns, followed by 27 specific patterns executed in a random order, and a lead-out of four more random patterns.
- US Department of Defense 5220.22-M: A widely adopted standard for sanitization to counter data remanence; it meets U.S. Department of Defense requirements for erasure of disk media.
- Custom: The system administrator defines the number of overwrite passes manually.
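To make the Custom option concrete, here is a deliberately simplified Python sketch of multi-pass overwriting. Real e-shredding runs inside the printer controller’s firmware; a file-level overwrite on a modern drive (with wear leveling and remapped sectors) is not a substitute for device-level sanitization, so treat this strictly as an illustration of the pass concept:

```python
# Simplified illustration of multi-pass overwriting ("Custom" style:
# the administrator picks the pass count). This is NOT a secure shredder
# for modern drives; it only demonstrates the overwrite-then-delete idea.

import os

def shred(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents `passes` times, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # one pass of random data
            f.flush()
            os.fsync(f.fileno())        # push the pass out to the disk
    os.remove(path)
```

The Gutmann algorithm described above follows the same loop structure, but with 35 passes and specific bit patterns rather than purely random data.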
Implementing e-shredding in your large format workflow is an easy way to protect your confidential data. Don’t wait until it is too late and your security has been compromised. To learn more about secure data image overwrite technology on large format devices, read the whitepaper “Safeguard your Business’ Sensitive Data.”
Author: Bob Honn, Director of Marketing Services, Wide Format Printing Systems Division, Océ North America
- Enable planners, engineers, and designers to model existing infrastructure and import detailed models in order to create realistic 3D models of the environment;
- Sketch early-stage designs directly into 3D models;
- Create and manage multiple alternatives;
- Communicate visually rich infrastructure proposals; and
- Generate preliminary design models which can be used to create submittal documentation in civil engineering software, such as AutoCAD Civil 3D.
In the following post we’ll describe how to use existing information to create compelling 3D design visualizations with MAP-21 (Moving Ahead for Progress in the 21st Century Act) requirements in mind.
If you are installing Autodesk Infrastructure Modeler for the first time, review the hardware requirements to ensure your hardware will run the software efficiently. (For more advice on the best hardware configuration for Autodesk software, review our series on AutoCAD 2013. Much of the same advice applies to other Autodesk products.)
Once installed, to create a realistic 3D model using Autodesk Infrastructure Modeler:
- Start Autodesk Infrastructure Modeler and click New on the start page.
- Choose a directory and name for your project. If you know the extents of your project, you can also enter them here.
- With the project started, data is imported and used as the basis for your 3D model. Autodesk Infrastructure Modeler allows you to combine 3D and 2D data in order to create a full 3D scene. For this post, we will use a terrain model (DEM) as our base 3D layer; all of the other contextual data, like imagery, roads, and buildings, comes in 2D formats. Click ‘Data Sources’ on the ribbon; in the ‘add file data sources’ dropdown, select ‘Raster’. After import, this data source shows up in the ‘Data Sources’ panel. Double-clicking the data source allows you to modify its viewing properties. Click the ‘Close & Refresh’ button at the bottom of the configuration window to generate a 3D visualization in Autodesk Infrastructure Modeler.
- Add imagery using the same procedure.
- Use the same process to add roads, but use SHP as the Source Type. In this example, roads are stored in a 2D Shapefile. After import, double-click on the newly imported data source to configure it. Select ‘Roads’ as the ‘Type’ in the dropdown list. With ‘Roads’ selected, you can now configure the road style and other properties based on the metadata that comes with the Shapefile. For instance, you can choose a style rule to match the 3D road style (striping, sidewalks, median, number of lanes, etc.) based on existing metadata. Click the ‘Close & Refresh’ button again to generate the 3D visualization.
- Lastly, we’ll add buildings to our scenes using the same procedure outlined in step 5. Select ‘Buildings’ as the ‘Type’ in the dropdown list. Since the buildings in this case are 2D footprints, we’ll select an attribute with a Z-value (elevation or height) from the ‘roof height’ dropdown. Once again click the ‘Close & Refresh’ button.
Voila! You have just created a 3D model using Autodesk Infrastructure Modeler. You can use this model to sketch preliminary designs of new infrastructure, including roads, railways, city furniture, water areas, and even buildings. You can also exchange information with Civil 3D — using the IMX file type — to maintain consistent data and context as the project is further developed. This 3D model-based approach enables you to deliver on MAP-21 requirements for 3D modeling and visualization on infrastructure projects of varying scales.
Author: Justin Lokitz, Senior Product Manager, Autodesk.
The longtime, tried-and-true hard drive remains the backbone of a workstation’s storage subsystem, but a new breed of solid-state technology is pushing its limits. Although they share the same basic technology as their ancestors, today’s drives are much bigger, faster, and cheaper. Traditional workstation hard-disk drives (HDDs) primarily come in a 3.5″ form factor, supporting SATA or SA-SCSI standards.
Essentially the same models that ship in corporate and consumer branded PCs, SATA drives are less expensive, sometimes dramatically so. (A terabyte for $50, anyone?) Pricing increases with drive capacity and RPM, an indication of how quickly the mechanical platter can spin within the drive and therefore how fast the drive can read and write data. The least-expensive SATA drives support 7,200-RPM speeds, while the highest-performance options jump to 10,000 RPM.
The second HDD option, the SA-SCSI drive, requires a motherboard interface that is also compatible with SATA drives (whereas a SATA interface will not support an SA-SCSI drive). With SA-SCSI, you’ll get the option to move up to 15,000 RPM, but you’ll sacrifice capacity and cash.
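The RPM figures translate directly into one component of latency: on average, the head must wait half a revolution for the right sector to rotate under it. The arithmetic is pure geometry (seek time comes on top of this):

```python
# Why RPM matters: average rotational latency is half a revolution.
# These figures are geometry, not vendor specs; seek time is extra.

def avg_rotational_latency_ms(rpm: int) -> float:
    """Average time waiting for the target sector, in milliseconds."""
    ms_per_rev = 60_000 / rpm        # one full revolution, in milliseconds
    return ms_per_rev / 2            # on average the head waits half a turn

for rpm in (7_200, 10_000, 15_000):
    print(f"{rpm:>6} RPM: {avg_rotational_latency_ms(rpm):.2f} ms")
```

Going from 7,200 RPM to 15,000 RPM roughly halves the rotational wait (about 4.2 ms down to 2.0 ms), which is exactly the premium that SA-SCSI buyers are paying for.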
The Choice Between Speed and Capacity
Whether you choose a SATA or SA-SCSI drive, you will generally face a trade-off between paying for more RPMs or paying for more capacity, because buying both can be costly. Most CAD professionals would opt for capacity and cost-effectiveness, because running out of space or money is usually a more glaring roadblock than facing modest shortages of access speed and disk bandwidth. Many of us are paranoid about running out of disk space — and we all should be to some degree, because data piles up faster than we think it will. If this describes you, consider purchasing extra drive bays that bring more room to add drive capacity later — although you can always fall back on external drives to shore up capacity down the road.