Special Coverage

Designing Flexible Printed Circuit Boards

General-purpose flexible circuit boards carry electronic components on a substrate that can be bent to fit into tight spaces. Most are just one or two layers thick and are meant for "flex to install" applications, since they tolerate only a limited number of flex cycles. Circuit boards like this are found in a variety of medical and consumer products.
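A rough way to quantify the "flex to install" constraint is the commonly cited rule of thumb that minimum bend radius scales with the circuit's thickness (on the order of 6x thickness for a single-layer flex, more for additional layers, and far more for dynamic flexing). The sketch below applies that rule; the multipliers and the stack-up thickness are illustrative assumptions, not design limits.

```python
# Rough minimum-bend-radius check for a flex circuit.
# Multipliers follow the commonly cited rule of thumb; treat all numbers
# here as illustrative assumptions, not design limits.

BEND_RADIUS_MULTIPLIER = {
    "single_layer": 6,    # flex-to-install, one conductive layer
    "double_layer": 12,   # flex-to-install, two conductive layers
    "dynamic_flex": 100,  # repeated flex cycles
}

def min_bend_radius(thickness_mm: float, construction: str) -> float:
    """Return the approximate minimum bend radius in millimeters."""
    return BEND_RADIUS_MULTIPLIER[construction] * thickness_mm

if __name__ == "__main__":
    stack_thickness_mm = 0.15  # hypothetical two-layer flex stack-up
    radius = min_bend_radius(stack_thickness_mm, "double_layer")
    print(f"Keep bend radius above ~{radius:.1f} mm for this stack-up")
```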

Posted in: Articles

Read More >>

Patient Monitoring Using Capacitive Sensors

Capacitive sensors can be used to measure many different physical parameters that are important when monitoring the health of patients. In particular, the technical advances made by sensor manufacturers in using microelectromechanical systems (MEMS), fabricated with silicon microchip manufacturing techniques, have opened up new possibilities for integration that improve the ease and adaptability of their use.
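As a minimal sketch of how a capacitive sensor turns a physical change into an electrical one, the code below uses the ideal parallel-plate relation C = ε0·εr·A/d: when a diaphragm deflects under pressure, the gap d shrinks and the capacitance rises. The geometry values are hypothetical, and real MEMS devices add readout electronics and non-ideal effects that this ignores.

```python
# Ideal parallel-plate capacitance: C = eps0 * eps_r * A / d.
# A diaphragm deflecting under pressure reduces the gap d and raises C.
# Geometry values below are hypothetical, for illustration only.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2: float, gap_m: float, eps_r: float = 1.0) -> float:
    """Capacitance of an ideal parallel-plate structure, in farads."""
    return EPS0 * eps_r * area_m2 / gap_m

if __name__ == "__main__":
    area = 1e-6             # 1 mm^2 electrode, expressed in m^2
    nominal_gap = 2e-6      # 2 um nominal gap
    deflected_gap = 1.8e-6  # gap after the diaphragm deflects under load

    c0 = plate_capacitance(area, nominal_gap)
    c1 = plate_capacitance(area, deflected_gap)
    print(f"Capacitance change: {(c1 - c0) * 1e15:.0f} fF")
```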

Posted in: Articles

Read More >>

Bringing Open Source Advantages to Engineering

By Peter Schroer, President and Founder, Aras Corporation, Andover, MA

As a career engineer, I have found the need to coordinate large-scale design activity and all of the associated information to be an ongoing challenge. In my opinion, one of the most exciting developments over the past decade has been the mainstream adoption of open source techniques. In the R&D and engineering world, the open source trend is transforming how we work and is enabling new forms of collaboration.

In the beginning, open source meant Linux. Not anymore. These days, open source has been widely embraced, creating high-quality software solutions for most purposes. "Enterprise open source" typically means that a solution is sufficiently robust to compete directly with conventional proprietary offerings on functionality and capabilities, and that there is a company standing behind the solution to provide enterprise-class support and value-added services.

For example, several years ago, Aras released our flagship engineering data management PLM software as enterprise open source, making the solution freely available for collaborative development. This meant that document management, online change control workflows, FMEAs, program management, and other important engineering processes all became accessible for continuous enhancement. Now, organizations all over the world are using the system and contributing innovative improvements. Last year, a Lockheed Martin contribution embedded corporate security protocols into the software, while others added Earned Value Management and Resource Management. The open source format has led to internationalization, with translations into a wide range of languages including German, Japanese, and Hebrew. Additional add-ons now include integrations to all of the major CAD and EDA systems, such as CATIA, NX, Pro/E, SolidWorks, AutoCAD, OrCad, PADS, and others.

The 21st Century's Internet enablement has changed what is possible, and now the global recession is driving these new approaches, like open source, throughout industry and government alike. The advantages of the open approach in engineering span both the technical and business spheres. From a technical perspective, collaborative development introduces new innovations with greater quality at a faster pace than previously possible. From a business standpoint, enterprise open source removes the license fees, thereby eliminating the capital expense and significantly reducing the total cost.

The flexibility and control inherent in the open model represent a fundamental shift away from the restrictions imposed by the conventional software mentality, which forces complexity, cost, and risk onto the customer. With enterprise open source, packaged software solutions with commercial-off-the-shelf (COTS) features are freely available for use "as is" or for modification, integration, and extension at the user's discretion. In fact, the ability to leverage and incorporate systems already in place makes secure access to existing data a compelling proposition. In other words, an organization can modernize without a rip-and-replace effort.

As engineers around the world solve the technical challenges of the new millennium, the techniques and tools that make this possible will be different. Together, our embrace of open approaches will be one of the factors that determines our success, and our collective innovation will be the engine that drives our future progress.
For more information on Aras Corp.’s open source PLM software, contact Peter Schroer at pschroer@aras.com, or click here.

Posted in: Articles

Read More >>

Mechatronic System Integration and Design

While today's multi-discipline mechatronic systems significantly outperform legacy systems, they are also much more complex by nature, requiring close cooperation between multiple design disciplines in order to have a chance of meeting schedule requirements and achieving first-pass success. Mechatronic system designs must fluently integrate analog and digital hardware — along with the software that controls it — presenting daunting challenges for design teams and requiring design processes to evolve to accommodate them.

What is Mechatronic Design?

The growing trend toward mechatronic system design is driven by the same things that drive all technological advances: the demand for higher performance and lower costs. The word itself is a portmanteau of "mechanics" and "electronics." As Figure 1 shows, mechatronic design includes a combination of (1) mechanical design elements (e.g., plant, actuators, thermal characteristics, hydraulics/fluids, and magnetics); (2) analog, digital, and mixed-signal electronics; (3) control systems; and (4) embedded software. The intersections in Figure 1 — (a) electromechanical sensors and actuators, (b) control circuits, and (c) digital microcontrollers — reveal the most common areas for interdisciplinary cooperation among mechanical, electrical, and software engineers.

Best Mechatronic Design Practices

Boston-based technology think tank Aberdeen Group Inc. provided pivotal insight into the importance of incorporating the right design process and tools for mechatronic system design. In a seminal study, Aberdeen researchers used five key product development performance criteria to distinguish "Best in Class" companies as related to mechatronic design. The results were revealing (see table) and should be of significant interest within the extended design community.

In the study, Best in Class companies proved to be twice as likely as "Laggards" (worst-in-class companies) to achieve revenue targets, twice as likely to hit product cost (manufacturing) targets, three times as likely to hit product launch dates, twice as likely to attain quality objectives, and twice as likely to control their development costs (R&D) (Ref. 1). The fact that the Best in Class companies performed better isn't as noteworthy as the degree to which they performed better. Two to three times better on every variable invites the question, "How were they able to achieve these far superior results?" Aberdeen's research revealed that Best in Class companies were:

- 2.8 times more likely than Laggards to carefully communicate design changes across disciplines;
- 3.2 times more likely than Laggards to allocate design requirements to specific systems, subsystems, and components;
- 7.2 times more likely than Laggards to digitally validate system behavior with simulation of integrated mechanical, electrical, and software components.

The remainder of this article explores these best-in-class practices in further detail.

Communicating and Allocating Design Requirements

A mechanical engineer may be interested in damping vibration by adding a stiffener. This, of course, adds mass and, as a result, may affect how fast the control system can ramp up motor speed, which in turn affects the size and power requirements of the motor (a simple sizing sketch follows below). Immediate, formal documentation of such a design change enables concurrent, cross-discipline design.
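To make that ripple effect concrete, the sketch below sizes the motor torque needed to hit the same speed ramp before and after the stiffener's inertia is added, using the constant-acceleration relation T = J·ω/t. All numbers are hypothetical and chosen only to show the cross-discipline coupling.

```python
# Why a mechanical change ripples into motor sizing: for a constant-acceleration
# ramp to speed omega in time t, the required torque is T = J * omega / t.
# Numbers below are hypothetical, purely to illustrate the coupling.

def required_torque(inertia_kg_m2: float, target_speed_rad_s: float,
                    ramp_time_s: float) -> float:
    """Torque needed to reach target speed in ramp_time at constant acceleration."""
    return inertia_kg_m2 * target_speed_rad_s / ramp_time_s

if __name__ == "__main__":
    base_inertia = 2.0e-3       # kg*m^2, original load reflected to the motor
    stiffener_inertia = 0.4e-3  # kg*m^2 added by the vibration stiffener
    target_speed = 300.0        # rad/s
    ramp_time = 0.5             # s, fixed by the control requirement

    t_before = required_torque(base_inertia, target_speed, ramp_time)
    t_after = required_torque(base_inertia + stiffener_inertia, target_speed, ramp_time)
    print(f"Torque before: {t_before:.2f} N*m, after: {t_after:.2f} N*m "
          f"({100 * (t_after / t_before - 1):.0f}% increase)")
```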
Effective partitioning of the multiple technologies present in a mechatronic system is another significant predictor of project success. Subsystem partitioning begins with a common-sense breakdown of the design, using Figure 1 as a high-level framework. To the degree possible, separate mechanical subsystems from electrical subsystems, and do the same with controls and software. From there, subsystems can be broken down further into subcategories beneath the high-level distinctions, including, for example, digital, analog, and mixed-signal electronics; divisions within mechanical subsystems; and overlapping areas (e.g., sensors and actuators) treated as additional subsystems. Next, subsystems can be assigned to specific job functions and design groups, and input/output requirements can begin to be defined at the boundary crossings between subsystems (Ref. 2). Figure 2 shows the partitioning process, moving from functional design through implementation. With this framework in place, design and analysis can begin for each subsystem, later to be combined and analyzed as a complete system.

Simulation and Virtual Prototyping

In contrast to physical prototyping, virtual prototyping and system simulation allow a system to be tested as it is being designed and provide access to its innermost workings at every phase of the design process, which is difficult or impossible with physical prototypes. Moreover, simulation allows analysis of the impact of component tolerances on overall system performance, which is out of the question with physical prototypes. When employed early in the design process, simulation provides an environment in which a system can be tuned and optimized, and critical insights can be gained, even before components are available and before hardware can be built. After the basic design is locked down, simulation can again be employed to verify intended system operation, varying parameters statistically in ways that would otherwise be impossible with physical prototypes.

Subsystem and Component Modeling

In order to create a model of a system, each subsystem and component in the real system needs a corresponding model. These models are then stitched together (as their physical counterparts would be) to create the overall system model. Using the Department of Defense-initiated VHDL-AMS modeling standard (IEEE 1076.1), system integration can begin before physical hardware is available, including embedded software or any other domain that can be described using algebraic or differential equations. Specifically, VHDL-AMS allows expression of simultaneous, nonlinear differential and algebraic equations in any model; the model creator need only express the equations and let the simulator solve them in the time or frequency domain. Domain knowledge from any engineering discipline can be encapsulated in reusable libraries that are accessible by any member of the design team.

The art of creating these models, and knowing exactly what to model and why, is key to successful simulation. Some modeling considerations include: Which system-performance characteristics are critical, and which can be ignored without affecting results? Does a model already exist? Can an existing model be modified? What component data is available?

Several software simulators exist for simulating mechatronic designs (such as SystemVision from Mentor Graphics). These simulators support VHDL-AMS, SPICE, and embedded C code, providing an environment in which mechanical, electrical, software, and systems engineers can collaborate using common models and a common modeling environment (Ref. 3).
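For readers without a VHDL-AMS tool at hand, the following plain-Python sketch shows the same idea on a tiny scale: the coupled electrical and mechanical differential equations of a DC motor driving an inertial load, handed to a general-purpose ODE solver. The parameter values are hypothetical, and a mixed-domain simulator of the kind described above would let you state these equations declaratively rather than coding the solver call yourself.

```python
# A plain-Python analogue of what a mixed-domain simulator does: solve the
# coupled electrical and mechanical ODEs of a DC motor driving an inertial load.
# All parameter values are hypothetical.
from scipy.integrate import solve_ivp

R, L = 1.0, 0.5e-3        # winding resistance (ohm) and inductance (H)
KT = KE = 0.05            # torque constant (N*m/A) and back-EMF constant (V*s/rad)
J, B = 1.0e-4, 1.0e-5     # load inertia (kg*m^2) and viscous friction (N*m*s/rad)
V_SUPPLY = 12.0           # applied voltage (V)

def motor_load(t, state):
    """Right-hand side of the coupled electrical/mechanical equations."""
    current, speed = state
    d_current = (V_SUPPLY - R * current - KE * speed) / L   # electrical loop (KVL)
    d_speed = (KT * current - B * speed) / J                # mechanical (Newton)
    return [d_current, d_speed]

sol = solve_ivp(motor_load, (0.0, 0.2), [0.0, 0.0], max_step=1e-4)
print(f"Speed after 0.2 s: {sol.y[1, -1]:.1f} rad/s")
```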
In conjunction with proper mechatronic system-design training, careful interdiscipline communication, and deliberate system partitioning, simulation technology can play a key role in mechatronic project success.

This article was written by Bill Hargin, Director of Product Marketing, System-Level Engineering Division, Mentor Graphics Corporation, Wilsonville, OR. For more information, click here.

References

1. Aberdeen Group, "System Design: New Product Development for Mechatronics," Boston, MA, January 2008. (www.aberdeen.com)
2. Scott Cooper, Mentor Graphics Corp., "Design Team Collaboration within a System Modeling and Analysis Environment," 2004. (www.mentor.com/systemvision)
3. P. Ashenden, G. Peterson, and D. Teegarden, "The System Designer's Guide to VHDL-AMS: Analog, Mixed-Signal and Mixed-Technology Modeling," Morgan Kaufmann Publishers, San Francisco, September 2002. (www.mkp.com/vhdl-ams)

Posted in: Articles

Read More >>

Slip Clutches Solve Diverse Design Problems

Slip clutches are commonly used to protect against overloads, but they can solve many other problems as well. Their applications include increasing machine speeds, applying constant tension to webs or wires, indexing a mechanism, holding a hinged object in position, controlling torque on capping or assembly operations, and providing soft starts or cushioned stops.
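The constant-tension application reduces to a simple relationship: the slip torque at the unwind or rewind shaft must equal the desired web tension multiplied by the current roll radius, so the setting must track the roll as it builds. The sketch below illustrates this with hypothetical numbers.

```python
# Constant web tension with a slip clutch: slip torque = tension * roll radius,
# so the torque setting must grow as the wound roll diameter grows.
# Values are hypothetical, for illustration only.

def slip_torque_for_tension(tension_n: float, roll_radius_m: float) -> float:
    """Slip torque (N*m) that holds the requested web tension at a given roll radius."""
    return tension_n * roll_radius_m

if __name__ == "__main__":
    desired_tension = 20.0  # N of web tension to maintain
    for radius in (0.05, 0.10, 0.15):  # roll radius as the roll builds, m
        torque = slip_torque_for_tension(desired_tension, radius)
        print(f"radius {radius:.2f} m -> set slip torque to {torque:.1f} N*m")
```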

Posted in: Features, Articles

Read More >>

Taming Residual Bulk Image in CCDs

Residual bulk image (RBI) is a phenomenon observed in certain types of front-side-illuminated charge-coupled devices (CCDs). A CCD is an electronic light sensor used in digital cameras. In simplest terms, the sensor exhibits a memory of prior exposures, resulting in ghost images appearing in subsequent frames. This deferred charge can cause a number of problems in cooled, long-exposure scientific applications. At a minimum, the ghost images can create the illusion of a non-existent object (Figure 1, left). Equally serious, they can lead to significant errors in the quantitative measurements required for photometric applications.
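To make the "memory of prior exposures" idea concrete, the toy model below carries a small fraction of each frame's signal forward as trapped charge that bleeds into later frames. The carry-over and decay fractions are invented purely for illustration and do not describe any particular CCD.

```python
# Toy illustration of residual bulk image: a small fraction of each exposure's
# signal is trapped and released into later frames, decaying exposure by exposure.
# The carry-over fraction and decay rate are invented for illustration only.
import numpy as np

CARRYOVER = 0.02   # fraction of a frame's signal trapped for later release
DECAY = 0.5        # fraction of remaining trapped charge released each frame

def add_rbi(frames: np.ndarray) -> np.ndarray:
    """Return the exposure stack with a simple residual-image term added."""
    trapped = np.zeros_like(frames[0], dtype=float)
    out = []
    for frame in frames.astype(float):
        released = DECAY * trapped           # ghost of earlier exposures
        out.append(frame + released)
        trapped = trapped - released + CARRYOVER * frame
    return np.array(out)

if __name__ == "__main__":
    bright = np.zeros((8, 8))
    bright[2:5, 2:5] = 1000.0                # bright target in the first frame
    stack = np.zeros((3, 8, 8))
    stack[0] = bright                        # followed by two blank exposures
    with_ghosts = add_rbi(stack)
    print("Ghost signal in later frames:", with_ghosts[1].max(), with_ghosts[2].max())
```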

Posted in: Articles, Features, Photonics

Read More >>

Product Simplification: Rediscover a Whole New Game

By John Gilligan, President, Boothroyd Dewhurst, Inc., Wakefield, RI

Product simplification is the discipline of merging the greatest performance functionality into the fewest parts using the most suitable and cost-effective materials and manufacturing processes. It is an engineering board game, in a way: answering questions about a design and seeing a Design for Manufacture and Assembly (DFMA) database respond with quantitative costs and reports. There is truth and mystery in confronting an analysis that says there are too many parts, shows the team where, and then launches everyone into the intimacy of trial-and-error engineering, collaboration, and fresh creation. It's a game that companies would ideally play regularly, but tend to play most vigorously when innovation and efficiency are both in crisis.

Cross-functional product development teams have discovered and rediscovered the phenomenon of product simplification in meeting do-or-die cost targets for their companies. Along the way, manufacturers learned that it is through the rigorous combination of design and process innovation that market desirability and engineering elegance are achieved in tandem. Yesterday's innovative design ideas and process choices are today's competitive standards. Snap-fit and living-hinge techniques became great tactics for innovation at Dell, HP, and Motorola. Beyond plastics, engineers made other daring moves from the DFMA game board. Medical companies embedded hydraulics and printed circuits into structural supports to avoid individual part costs, potential part failures, and added assembly labor. Dell and HP continued their design assault on unnecessary cables, harnesses, and separate electronic components, building new functionality onto circuit boards.

Product simplification was helped, of course, by creative supply chains. Finally seeing an opportunity to advance new technology, suppliers showed their OEM partners how to use process breakthroughs to put answers on a sometimes blank worksheet. A designer's habit, for example, of creating molded ribs for purely visual symmetry can add 30-40% to the manufacturing cost of a component. The expertise of both parties, working in transparent collaboration with a cost analysis tool, has unlocked significant savings.

There are other catalysts for innovation as well. Motorola University in Asia teaches the integration of lean, Six Sigma, and DFMA to internal design teams, suppliers, and customers. They recognize the impact of product simplification on quality, performance, and profitability in electronic products. Recent benchmarks for cost reduction are impressive. Knowing that their approach is a business strategy, not just a technology strategy, engineers sit in redesign sessions with unit heads — even with presidents — and use a business scorecard to measure progress and institutionalize best practices.

The benefits of product simplification are spread through every "touch phase" of a product's travel — from the napkin-sketch idea through CAD, production, shipping, administration, service, and end-of-life disposal. Innovation — brought about through analytical costing and simplification of the complete product, from initial design to final disposal — is the future. Wonderfully, the best industry innovators have already embraced a full understanding of the dynamic beauty of simplicity, but everyone can, and should, play this game.

For more information on DFMA, click here.
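As a rough numerical companion to the part-count discipline described above, the sketch below computes a DFA-style design-efficiency index in the spirit of the Boothroyd Dewhurst method: an idealized assembly time per theoretically necessary part divided by the estimated total assembly time. The 3-second ideal time is the commonly cited baseline; the part counts and assembly times are hypothetical.

```python
# DFA design-efficiency index in the Boothroyd Dewhurst style:
#   efficiency = (ideal seconds per part * theoretical minimum parts) / total assembly time
# The 3-second ideal time is the commonly cited baseline; all other numbers
# below are hypothetical, for illustration only.

IDEAL_SECONDS_PER_PART = 3.0

def dfa_efficiency(theoretical_min_parts: int, total_assembly_time_s: float) -> float:
    """Design efficiency: 1.0 means every part is necessary and ideally easy to assemble."""
    return IDEAL_SECONDS_PER_PART * theoretical_min_parts / total_assembly_time_s

if __name__ == "__main__":
    # Hypothetical "before" and "after simplification" assemblies.
    before = dfa_efficiency(theoretical_min_parts=12, total_assembly_time_s=310.0)
    after = dfa_efficiency(theoretical_min_parts=12, total_assembly_time_s=140.0)
    print(f"DFA efficiency before: {before:.0%}, after redesign: {after:.0%}")
```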

Posted in: Articles

Read More >>