The first post in this series discussed why you want OpenCL. The second post described how it works. This post discusses how OpenCL will affect your workflow.
Below are some “compute” examples of where OpenCL will impact the CAD workflow:
- Linear algebra
- Signal, audio, image, and video processing
- Finite-difference method applications
- Finite-element analysis and direct solvers
- Finite particle method (FPM) and airbag simulation
- Constraint solving
- Contact search and contact analysis for nonlinear simulation
- CAD modeling engines
- Boolean operations, interference, and clearance calculations
- Model tessellation
- Hidden-line removal
- Graphics visualization and rendering
- Injection-molding flow simulation
- Cloth simulation
- NC tool positioning and material-removal simulation
- Robotics and plant automation with robot tool-path planning
- Data sorting and database operations (see PostgreSQL with OpenCL)
The greatest impact on CAD and designer productivity will be in workflows with a tight coupling between compute (or optimization) and visualization. Examples are simulation-based optimizations and design studies on full vehicles, from automobiles to aircraft to yachts.
The Holy Grail of Rendering: Real-Time Ray Tracing
I’m a visual guy attracted to shiny spherical balls that reflect the environment off their surfaces, i.e., ray tracing. OpenCL is a formidable tool for accelerating any ray tracing application by at least an order of magnitude. To me, perhaps the most interesting development right now is Caustic Graphics and OpenRL (Open Ray Tracing Library), their standard for writing ray tracing applications that execute across heterogeneous compute platforms. OpenRL uses OpenCL to take advantage of any GPU in the system (add-in board or APU) to accelerate ray tracing.
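OpenRL hides the details, but the core operation every ray tracer runs millions of times per frame (and which GPUs parallelize so well) is the ray-object intersection test. Here is a minimal sketch of a ray-sphere test in plain Python; the function name and interface are illustrative, not part of the OpenRL API:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along a normalized ray direction to the
    nearest intersection with a sphere, or None if the ray misses."""
    # Vector from the sphere center back to the ray origin
    oc = [o - c for o, c in zip(origin, center)]
    # Quadratic coefficients for |origin + t*direction - center|^2 = r^2
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 for a normalized direction
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t >= 0 else None

# A ray shot down the +z axis hits a unit sphere centered 5 units away
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Because every ray is independent, a GPU can evaluate thousands of these tests at once, which is exactly the kind of data parallelism OpenCL exposes.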
As a note: Apple developed OpenCL (before submitting it to the Khronos Group as an open standard). Apple is already a major investor in Imagination Technologies, which recently bought Caustic Graphics. My conclusion: it is only a matter of time before you see the benefits of OpenRL/OpenCL on iOS devices.
Author: Tony DeYoung
So let’s face it, we’re all somewhat lazy. In fact, there are arguments out there that, as a species, we are lazier now than at any point in human history. And, as long as we are being scientific, there is even a physiological explanation, having to do with the “lizard brain,” for why we are lazy. Whether any of that falls more on the side of science or fiction, I don’t know. What I do know is that CAD is not really what you would call a full-contact sport.
At least, not yet.
Mobile Technology Transformation
The past few years have been an amazing time in the worlds of electronics and communication technologies. It seems like just yesterday people were amazed when they got the “guess where I am” phone call from the driveway. Now phone calls are the least of what we do with our portable phones and laptops. Really, it is debatable whether you can even call a phone a “phone” anymore, and laptops are being quickly replaced with tablets and items somewhere in-between. But what effect is this technology having on our busy world?
We are getting off our butts!
More and more you see people working on the go. Dads typing away, answering that important email between little league innings. Moms out and about, looking up a favorite band to stream some music or tweet a fun fact. Now you can add “people working on CAD drawings” to the list!
Recent advances in virtualization technology from developers like Citrix and even specialized “mobile” CAD platforms from Autodesk and Bentley are liberating our drawing files! Our CAD files are no longer relegated to the desk or “mobile” workstation. Now your files, and thus your work, are just as far away as the nearest computer, tablet or even smartphone!
CAD professionals are finally getting out of the office because we’re not tied to the desk anymore. Now the work is mobile and so are we. Commuting costs can be cut down. Drawings can be edited in the field and drafters can finally start to get a tan and lose that pallid office-dweller complexion!
Free to Roam with CAD
So what does it all really translate into? Nothing less than a world where CAD escapes the four cubicle walls and goes on a fantastic adventure. Suddenly I see a world where CAD drafters are free to roam. Fears of not having the power of a workstation to do your work are almost a non-issue. Oh sure, you will still return to the super-powerful workstation to do the heavy lifting. Architectural renderings and large point clouds will, for the time being, remain the province of the king of desktops. But, the light lifting? Oh, we can do that anywhere!
France today, Milan tomorrow and the Orient after that! Wherever we go, our CAD work is there with us! Gone are the days of the 9-5 and the overworked coffee pot! I see a wide-open pasture, filled with wandering CAD professionals in natural habitats. They’re grazing on fresh air, meeting and helping one another and getting some skin color. The mobile CAD movement will free the drafter from his bonds and leave only the IT people behind to toil.
Hey, it can’t be roses for everyone.
Author: Curt Moreno
The first post in this series discussed why you want OpenCL. This post will describe how it works.
The GPUs in present-day graphics cards like the AMD FirePro/Radeon and NVIDIA Quadro/GeForce lines are massively parallel, multithreaded, multicore processors with enormous computational power and high bandwidth. Traditionally these multicore processors have been used for graphics processing, leaving the CPU to do everything else.
More Computing Power Using Massive Parallelism
The paradigm shift with OpenCL is a non-proprietary, standardized (and familiar) language for dividing general-purpose computational code into parallel threads, so that the GPU and CPU can work in tandem to deliver new functionality or tackle large processing tasks.
One of the key features of OpenCL is its ability to allocate work to the GPU or the multicore CPU depending on how much power is needed and how data-intensive a given task is. An OpenCL CPU+GPU solution means you can get high performance simultaneously for a design as well as for its analysis and simulation.
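The decomposition idea behind this is simple: split a large computation into independent chunks, run the chunks concurrently, then combine the partial results. The sketch below illustrates that pattern in plain Python with worker threads; it is not OpenCL code (a real OpenCL program would express each chunk as a kernel dispatched to work-items on the GPU or CPU), and the function names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_dot(args):
    """One 'work-item': dot product over a slice of the two vectors."""
    a, b, lo, hi = args
    return sum(a[i] * b[i] for i in range(lo, hi))

def parallel_dot(a, b, chunks=4):
    """Split a dot product into independent chunks, dispatch them to
    worker threads, then combine (reduce) the partial results."""
    n = len(a)
    bounds = [(a, b, i * n // chunks, (i + 1) * n // chunks)
              for i in range(chunks)]
    with ThreadPoolExecutor(max_workers=chunks) as pool:
        return sum(pool.map(partial_dot, bounds))

a = list(range(1000))
b = [2.0] * 1000
print(parallel_dot(a, b))  # 999000.0
```

On a GPU with thousands of cores, the same split-and-reduce structure is what turns one long serial loop into thousands of tiny concurrent ones.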
In business terms, OpenCL means that responsiveness and speed, from existing servers to handheld devices, will improve dramatically. When algorithms are redesigned to use OpenCL, speed-ups of 10x are common, and speed-ups of 30x are not unusual. (See, for example, the EDEM Simulation Engine.)
Next I’ll discuss how OpenCL will affect your workflow.
Author: Tony DeYoung
Most CAD users don’t have any reason to be familiar with how graphics languages like OpenGL 4 and DirectX 11 actually work. All that 99% of us care about is that our CAD applications and video cards support the latest versions so we can benefit from high-performance 2D/3D rendering and visualization.
In some ways the new OpenCL compute language isn’t any different. You don’t need to know anything about the inner workings to use it. You just know you want your hardware and software to support it.
On the other hand, OpenCL is a disruptive technology that will jostle market leaders and significantly alter hardware price/performance ratios. So it is worth learning what it does, where it will have the biggest impact and how you can benefit.
Why Do We Need a Compute Language for the CAD World?
Answer: Increasing model complexity
- Nowadays automotive models can contain up to 50,000 parts with 10 to 20 GB of data, and a single model can reach 40,000,000 triangles.
- In the mid-1970s a typical model of an automobile chassis had 5,752 node points, 2,108 finite elements and 28,924 degrees of freedom. Today, a typical model of an automobile chassis has 12 million node points, 7.2 million elements and 35 million degrees of freedom.
- In 2009 a computational fluid dynamics simulation of a racing yacht design required a mesh of over 1 billion cells.
Simply put, model complexity is growing exponentially, and faster than the ability of our desktop or laptop machines to easily crunch the data (without running as hot as the core of a supernova).
Author: Tony DeYoung
Let’s play a simple mental game. I promise, it won’t take long at all.
I want you to pretend that I gave you two objects. One is a gallon container of milk. The other is a 3-ounce Dixie cup. Next I ask you a question: “Is it possible for that cup to hold the entire contents of that container of milk?” Naturally you would look at me like I was an idiot. “No, of course not” would be the answer.
See, that didn’t take very long, and it was a pretty simple premise. It’s just common sense that a 3-ounce cup cannot hold a gallon of anything. Along those lines, it is also common sense that you cannot fit an elephant on a dime, land a passenger jet on a stamp or fit an eggroll into a keyhole. We all know that these things are nuts. Or do we?
New Software is Not the Only Cost of Doing Business
Every year in offices all over the world design professionals rejoice when the newest release of their chosen program arrives. There are new bells and even some new whistles. The glossy box is covered in slick graphics and promises of “faster rendering,” “improved materials editors” and whatever else it is that excites you. Of course the only reason this new software release is even in the office is because the boss wrote a check.
Why shouldn’t he? New software is pricey, but it’s the cost of doing business. Software improves every year (in some cases) and those improvements will help employees be more productive. The new bell will improve rendering and the new whistle promises to edit materials faster than last year’s release. Obviously it is a smart move to invest in the tools that a company provides its employees so they may be more productive. This, hopefully, will result in the company being more efficient and thus profitable.
And that is all fine and dandy.
However, too often it seems that those same bosses, those members of management that believe that annual software upgrades are a MUST, believe hardware lasts forever. Hmm … how is it that we have come to such an impasse? Why is it commonly accepted that yearly software upgrades are “the cost of doing business” but hardware should be permanent?
Better Software Requires Better Hardware
Every year developers work feverishly to release software that outperforms, outshines, and pushes out the previous releases. The claims that the features are greater in number and efficiency are normally true. These new releases are bigger, badder and make the last release seem slow and clunky. But there is a price to pay for those improvements in the form of greater demands on hardware. What does that mean to the end user? Basically it means you need better hardware to get the most out of better software.
Well that makes sense … in theory. But in practice the boss is griping that the company just bought ALL that expensive software. Thousands of dollars were spent, now we need new hardware? Last year the new software upgrades worked on the current machines. Unfortunately those machines were already a year or two old. Now they are even older! And now the software demands even MORE hardware power. Why are you trying to fit a gallon of milk into a 3 ounce Dixie cup?!?!
Hardware does not last into perpetuity. Upgrades in hardware must accompany upgrades in software in order to get the MOST benefit out of the investment in the new software. More capable software hobbled by inadequate hardware isn’t a case of “Oh well, it’s slow.” It is wasted company capital! That is money that is just thrown to the winds! In an age of difficult economic times, this is not acceptable.
Like anyone, companies have budgets, and there may only be X dollars for upgrades this year. Well, if X is all you have and the software on the current workstations is already not working to its fullest potential, DON’T invest the budget into more software that will not perform to its limits! Use that money and upgrade the hardware!
Minimum Hardware Requirements Can Minimize Your Return on Investment
I proffer that an application that is one or two years old will provide better efficiency and performance on new “overpowered” hardware than the opposite scenario. Installing new software on old, underpowered or “minimum system requirements” hardware is a disservice all around. The user is disappointed because there are frustrating slowdowns and crashes. The boss is disappointed because the invested dollars do not show the returns expected. But, maybe most of all, it is the developer that is disappointed because their product, their hard work, is misrepresented. Good software on old, underpowered hardware gets a bad name. It’s that simple. Even if that hardware does meet the “minimum system requirements.”
The only thing “minimum system requirements” will get you is “minimum system performance.” How does it make sense to spend literally thousands of dollars per workstation on new software but not hundreds to get the most out of that investment?
I know that we all get excited about new software releases. Those sexy animations and ad campaigns beckon to us. Meanwhile our workstations sit under our desk, hidden in the dark, working away quietly. Our workstations, in most cases, are not dazzling or attention grabbing. But without them, no amount of new software features will ever improve performance.
Author: Curt Moreno