PLM Connection

Companies are turning to PLM to improve global, cross-functional collaboration by establishing a 'single source of truth' for product data. To create a single source of truth, PLM champions typically face these challenges throughout the initiative:

This App, PLM Connection, is designed to provide quick access to valuable resources for PLM project leaders to improve their project preparation. Success factors like organizational communication, interaction with ERP, encouraging business involvement, and implementation methodology are covered. Take advantage of content within each section of this tool. It is organized as follows:

The earlier you are in the PLM project, the more valuable this information will be for your team.

Copyright and Trademark Notice

This application and its contents are the property of Mercury Marine, a division of Brunswick Corporation, and are protected by copyright, trademark and other United States and international intellectual property laws. You may not use, reproduce, download, store, post, broadcast, transmit, modify, sell or make available to the public content from this application without the prior written approval of Mercury Marine.

Best Practices

Organizational Alignment - Tier 1

We recommend taking several steps back, before considering progress forward, to focus on "process" before technology.

Once the need for a Product Lifecycle Management system has been identified, the project champion must consider the total project landscape, including its impact on upstream/downstream business processes required for supply chain, manufacturing, quality, and marketing, as well as technical publications, portfolio planning, and service. The organization must commit capable resources to define these business processes ahead of any system development in the chosen technology.

"From my perspective, having cross-functional participation to define business processes and an understanding of the critical touch points brings significant rewards when trying to deliver a new product to market," said John Bayless, Director of Program Management and PLM at Mercury Marine. "Without this process clarity, contributors from across the business are not aligned."

An enterprise-wide "vision" for the PLM project must be established by the primary stakeholders from each functional area, in coordination with senior management. This vision secures the high-level support required to fund the project, provides organizational leverage to drive the implementation, and addresses the cultural-change needs.

Once the "vision" is established, upper management should start communicating to the organization through regular company channels, and by establishing a cross-functional project governance board to help manage resistance and provide clear cross-functional participation and priorities to the initiative.

After the PLM Vision has been formed and communicated, a disciplined project on-boarding approach is used to establish end-to-end business processes through a series of workshops before unveiling technology. This approach is flexible and scalable and can be applied to projects of any size, including integration with Enterprise Resource Planning (ERP) initiatives, change management, and other system deployments.

"It's much easier to deliver PLM projects where the requirements are clearly identified, processes are well documented, and end-user expectations are managed," said Andy Miller, PLM Implementation Lead at Mercury Marine.

Tier One workshops provide a forum to gather executive expectations in terms of shortcomings, areas for improvement, and expectations for the future state. Understanding the goals and expectations of upper management early in the program establishes the project as a business initiative the company can support.

Organizational Alignment - Tier 2

Tier Two uses information gathered during Tier One as input, then brings together cross-functional subject matter experts (SMEs) to establish an agreed-upon future state with measurable metrics. This provides an opportunity for the business to focus on processes and customer needs instead of technology "features" and perceived "benefits."

There are other impactful considerations besides processes that must be defined during Tier 2, such as:

"Knowing the answers to these questions is critical to scoping and configuring a PLM system," said Bala Shetty, PLM Implementation Lead at Mercury Marine. "From my perspective, an implementation should not move forward without cross-functional alignment on these issues. Answers to these questions, in addition to process development, will help shape the future PLM system."

Once cross-functional business processes are established, the company is positioned to detail the necessary software modules and licenses. In addition, the process definition will help guide how many author and consumer licenses are required to enable the chosen processes. This information is critical because many leading PLM providers have functionality tied to tier-based pricing/licensing. Having a process-based PLM Roadmap also provides an opportunity to break the PLM licensing and system-configuration costs into manageable, forecasted pieces.

Organizational Alignment - Tier 3

Tier Three activities map the cross-functional processes defined during Tier Two against the chosen enabling PLM technology. Data sources, reporting requirements and system architecture needs are reviewed and a technical plan is developed to support the technology implementation.

For example, the PLM Vision established during Tier 1, and the "To Be" process defined during Tier 2 may lead to implementing core PLM functionality such as product configuration and change management. But once the core functionality is implemented, the Roadmap may specify that visualization, document management, and project support functionality will eventually follow. Rather than purchasing licenses and budgeting configuration resources for the entire project, a company can take a more affordable approach by only budgeting funds for what is necessary for each step.

"The ability to scale the deployment is available only if the organization first establishes its long-term PLM Roadmap and defines cross-functional processes before investigating technology," emphasized Andy Miller, PLM Implementation Lead at Mercury Marine. "Remember, processes drive business solutions, not software."

At this point, software/hardware purchasing and installation occurs, and the application(s) are configured according to the agreed-upon business processes. It is also recommended to define the internal and/or external ongoing administrative support required by the implemented processes, which will help establish a plan for growing a PLM support team.

SharePoint Project Sites

Many years ago, Mercury Marine implemented SharePoint as a platform to help provide a single source of truth for unstructured data such as test pictures, project team documents, test analysis reports, document templates, etc.

This platform has become even more critical with the globalization of engineering resources, suppliers, and partners.

From the perspective of supporting Mercury Marine's Product Lifecycle Management initiative, SharePoint is used in three capacities:

"SharePoint provides an ideal platform for managing project-related documentation and processes because it is flexible, highly configurable, and pretty simple for people to learn," said Carl Wendtland, manager, strategy & systems architecture at Mercury Marine.

Within product development process and support, several SharePoint sites are used to support Mercury Marine product development needs. These include:

One of the most impactful implementations of SharePoint has been within Mercury's product development process. This is largely because these product development programs touch almost every functional area within Mercury Marine.

The product development sites provide a consistent, common environment that enables collaboration within global, cross-functional product development teams. For example, one advantage the Parts Planning Application provides program managers is the ability to watch activity within their program's site via an audit trail that records not only who is changing attributes, but also bill of material changes. Without having to call a meeting, the program manager can identify potential areas of concern and contact those team members directly.

"By providing a global environment for collaboration, our product development teams are able to work more efficiently to deliver programs to market that provide value to our customers and shareholders," emphasized Wendtland. "SharePoint has become our product development community."

The final category of SharePoint sites within the Mercury Marine PLM fold is related to project management, including Agile-SCRUM processes and system testing/validation. These sites are owned by the Engineering Information Systems team responsible for maintaining the global PLM implementation. Sites include:

These sites allow EIS team members working from remote locations to remain connected. For example, by allowing each contributor to update their own SCRUM board goals and story points, the weekly kick-off and wrap-up meetings can be held with remote participants providing input. Dashboards updated weekly on the SCRUM home page keep everyone informed of team performance KPIs such as goal attainment, failure reasons, and workload. They also help catch resource bottlenecks earlier so project teams can make the necessary adjustments.

"Adopting the Agile project delivery philosophy has helped drive communication within our PLM team so everyone knows what is going on, who is required to deliver a result, and how effectively we are executing to plan," stated Wendtland. "Using SharePoint to run our SCRUM board saves time and improves communication, especially for remote team members."

Business Process Ownership

"By embracing a process-centric approach, cross-functional owners define the business processes upfront," said Emil Kacan, Global Product Development Process and Systems Leader. "Bringing people together to describe how they work and what they deliver before discussing software will likely result in streamlined work with reduced complexities."

Another result of empowering the business to own their processes should be a movement away from organizational fire-fighting and toward becoming proactive knowledge workers. This should be a natural outcome of establishing flexible, robust processes.

Putting the business in charge of processes empowers them and increases their involvement with the PLM project, which also helps ensure that the deployment will succeed. Giving process ownership to the business user should yield continuous system improvements over time because users enthusiastically control their destiny.

"For this to work, there must be a fundamental understanding by all levels of the organization that these cross-functional processes are 'owned' by the business users, not IT or the technical group responsible for implementing PLM," emphasized Lenny Grosh, PLM Program Manager.

Once "To Be" processes and requirements are defined by the business team, budgets for purchasing licenses and configuration resources for the implementation can be established. At this point, the business should understand which modules will enable the processes, as well as how many author or consumer licenses are necessary. With this information and phased requirements, the deployment becomes scalable because licensing and configuration spend can be budgeted in stages.

After processes are defined, the business can create a cross-functional 'core team' comprising one critical user from each functional area for each phase of the PLM implementation. This core team is ultimately responsible for final process definition/refinement, application scope, and managing customer requirements.

"Each core team member uses inputs from their functional organization to make final tweaks to the process," said John Bayless, Director of Program Management and PLM at Mercury Marine. "The core team must be empowered to make decisions for the entire organization within the bounds of their PLM project phase."

Organizational Communication

"Organizational communication must be a priority for success," said John Bayless, PLM Practice Director. "Keeping end users informed pays huge dividends in the long run."

Throughout the business transformation journey, communicating with the business as often, and in as many media formats, as possible is absolutely critical. Face-to-face discussions are always best. Use large group sessions and functional staff meetings whenever possible.

Travel to all the plants/facilities affected by the transformation. In our experience rolling out PLM throughout our organization and for clients, satellite facilities then feel included in the deployment and are more than willing to participate in process definition activities. This alone goes a long way toward successful user adoption of the system.


To complete the project lifecycle, the business should be heavily involved with shaping the end-user training experience. Ideally, the business representative from each function will attend the training sessions held for their constituents along with the configuration team. This accomplishes two goals:

  1. The individual who shaped the solution is answering process-related questions from their functional user base instead of an implementation person.

  2. Core team members make excellent change communication agents within their functions and across all levels of the organization; this is a key to implementation success.

Finally, trainers should consider travel to all remote sites for face-to-face interaction. If this is not feasible, make remote sessions as interactive as possible.

System Deployment

As part of the process-definition approach, the business team should gather requirements for the PLM system while developing the processes and cross-functional touch points. It is generally recommended to involve a business analyst accustomed to managing conflicting priorities and needs, as well as requirements documentation techniques to facilitate these sessions.

In conjunction with the process owners, the analyst should use priority matrices to manage requirements and establish development phases, as well as obtain requirement approval from stakeholders to proceed. It is recommended to use tools such as 5 Whys, fishbone diagrams, RACI matrices, use cases, and traceability techniques during requirement gathering to make the activity as objective as possible.

Requirements should be not only traceable but also measurable for all levels of the business:

Finally, the requirements document should capture the proposed process and requirements in detail. It is critically important to provide clear, unambiguous communication to the implementation team and potential off-shore resources involved with system configuration and testing.

After the processes are defined, bring in the software configuration experts to review what the business has created and provide any technical feedback. Generally speaking, most experienced configuration experts can make the processes work within the PLM tool of choice.

However, there are some exceptions where customization may be required, or the PLM system simply cannot perform what has been requested. If this occurs, the business should weigh whether to adjust its process or embark on the customization. Be aware that customization may make it difficult to install patches or upgrades for the core system.

Core team business involvement must continue during final PLM system configuration and testing.

"Ideally, the implementation team will embrace an Agile or Rapid Application Development (RAD) methodology," said Carl Wendtland, Technical Specialist - Global PD&E Systems Architecture. "These iterative approaches tend to provide a faster deployment through time-boxed work packages and increased business involvement during system configuration."

Quick team decisions and frequent prototype reviews are also characteristics of these methodologies.

During solution development, it is recommended that the Core Team hold organization-wide Town Hall communication sessions. Updates at functional staff meetings are also useful, along with SharePoint/intranet sites as communication media.

It is also suggested to involve key business users from each facility/function impacted by the project in system-testing activities to ensure that future-state processes have been configured correctly in the chosen PLM technology. These events, called User Acceptance Tests (UAT), are a critical milestone in system development and should be treated with the same importance as customer-driven quality sign-offs.

"At each major step along the way, upper-level stakeholders should be kept informed on project progress, road blocks, and other relevant issues through Project Governance Meetings," cautioned John Bayless, PLM Director at Mercury Marine. "The PLM implementation team should also establish project progress metrics early on and use these key indicators throughout the implementation."

Following are some general guidelines for a successful PLM deployment:

Once the PLM system is implemented, business ownership must continue. All future system continuous improvement activities should involve business users. A sure sign of success is when the business initiates and leads those activities.

Master Data Management

In virtually every business transformation, critical decisions define the success, or failure, of the project. If the transformation includes the marriage of a PLM/PDM system with an ERP system, there is no greater, or more heated, debate within the project team than where items (parts) are mastered and how manufacturing BOMs are managed.

Our organization experienced such a storm during the formative years of our transformation not once, but twice. Round One decided item ownership. Several years later Round Two hammered out the manufacturing BOM/route/quality data strategy.

There are several reasons why these decisions are so traumatic for an organization. One reason is that business transformation provides an opportunity to completely rethink how an organization operates and, for most team participants, it is a "once in a career" event and a rare chance to be involved with such an endeavor. Everyone wants to "do what is best for the business," and understands the ramifications of making bad choices (it is likely the current-state system is a sub-optimized solution).

Another reason is that the consultants/software vendors involved apply a significant amount of pressure to the business team. In some cases the software vendors are vying for a functional land grab: the more business processes/functions that are "owned" within their system, the more upfront licensing and services fees, in addition to lucrative long-term maintenance costs, will materialize. In other cases the software vendors or consulting team may not want to tackle the integration challenges, or may not recognize that the PLM system has enterprise benefits. After all, PLM only holds engineering stuff, right?

To help shape the decision making process, we drew a philosophical line in the sand. If the data was transactional in nature, it was to be owned by ERP. Anything related to product design, process definition, or configuration was to be mastered in PLM. With that said, there are some areas of overlap, such as quality, shop floor data, and resource management.
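
The ownership rule described above can be sketched as a simple decision function. This is an illustrative sketch, not Mercury Marine's actual implementation; the category names are hypothetical labels chosen for the example.

```python
# Sketch of the data-ownership rule: transactional data is mastered in ERP,
# while design, process-definition, and configuration data is mastered in PLM.
# Category names below are hypothetical, chosen only to illustrate the rule.

TRANSACTIONAL = {"purchase_order", "inventory_level", "invoice", "shipment"}
DESIGN_RELATED = {"cad_model", "item_master", "mbop", "product_configuration"}

def system_of_record(data_category: str) -> str:
    """Return which enterprise system masters a given category of data."""
    if data_category in TRANSACTIONAL:
        return "ERP"
    if data_category in DESIGN_RELATED:
        return "PLM"
    # Overlap areas (quality, shop floor data, resource management) need a
    # case-by-case ruling, as the text notes.
    return "REVIEW"

print(system_of_record("item_master"))     # PLM
print(system_of_record("purchase_order"))  # ERP
```

Writing the rule down this explicitly, even informally, is what gives the workshop teams an objective tie-breaker when ownership debates flare up.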

Our business transformation roll-out strategy was also shaped by this approach: we implemented PLM before ERP because, from our perspective, item ownership and creation form the core information upon which all other systems and associated data are based.

"It was essential that the business understood where item information and associated meta data would be mastered before embarking on the journey," said Lenny Grosh, PLM Program Manager.

From our experience, if a signed-off document such as this is not produced prior to engaging the business transformation implementation teams, the potential for rework and business discontent increases significantly. Our transformation team referred back to this signed document countless times during the crucial Tier 2 process discussions as a way to level-set the workshop participants and establish objectivity. The amount of emotion and angst around establishing these criteria cannot be overstated. Any guide posts provided by upper management will only help the process-definition teams succeed.

These guide-post decisions also provide the system configuration teams with clear direction as they start setting up the processes and business rules within the chosen PLM and ERP tools. They also help with business-communication sessions that explain how the upcoming systems will affect end-users.

Content in PLM or ERP?

These discussions ultimately led to the decision to use PLM as the item master and push the item, core attributes, and ECN information to ERP. Significant discussion and thought went into this choice: eighteen discussion points required definition and sign-off while the PLM/ERP ownership debate took place. Some considerations that affected the outcome included:

The implementation team established cross-functional workshops and guided the outcome using our 3-Tier Approach (discussed in a previous column) as a common framework. To help level-set workshop sessions, discussions focused on attribute definition and a common understanding of the following terms:

Once the team separated attributes using these definitions, information was further categorized using the following questions:

The deliverable from the Tier 1 workshops was a cross-functional sign-off of these decisions at the VP/C-levels so the downstream business owners had guide posts to work from during process definition workshops (Tier 2).

When the time came to plan the manufacturing portion of our business transformation, we took a similar philosophical approach where, if the data was transactional in nature, it was to be owned by ERP. Anything related to product design, process definition, or configuration was to be mastered in PLM.

The planning team decided that the manufacturing bill of process (MBOP) would be mastered in PLM and integrated with ERP to establish the route. That said, there are critical attributes required from ERP to correctly populate the MBOP for consumption in the transactional system, such as org structure data (plant, cell, etc.). Our planning team agreed that this information must be provided from the ERP system as "read only" into PLM so the attributes required by ERP are automatically populated while the manufacturing engineer structures the MBOP.
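
The read-only integration pattern above can be sketched as follows. This is a minimal illustration under stated assumptions: the field names, plant codes, and lookup function are hypothetical, and a real Teamcenter/ERP integration would go through vendor interfaces rather than in-process dictionaries.

```python
# Sketch of the pattern: ERP-owned org-structure attributes (plant, cell, ...)
# are queried from ERP and attached, read-only, to a PLM MBOP operation while
# the manufacturing engineer structures it. All names here are hypothetical.

from types import MappingProxyType

def erp_org_lookup(plant_code: str) -> MappingProxyType:
    """Stand-in for a read-only query against the ERP system."""
    erp_data = {"FDL": {"plant": "Fond du Lac", "cell": "Machining-3"}}
    # An immutable view enforces "read only in PLM": mutation raises TypeError.
    return MappingProxyType(erp_data[plant_code])

def populate_mbop(operation: dict, plant_code: str) -> dict:
    """Attach ERP-owned attributes to a PLM MBOP operation without letting
    the PLM side modify them; they remain mastered in ERP."""
    operation["erp_attributes"] = erp_org_lookup(plant_code)
    return operation

op = populate_mbop({"op_no": 10, "description": "Mill deck face"}, "FDL")
print(op["erp_attributes"]["plant"])  # Fond du Lac
```

The point of the immutable view is the governance rule itself: PLM consumes the attributes automatically but can never become an accidental second master for them.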

Many organizations have endured this PLM/ERP journey with mixed results - sometimes resulting in complete system redesigns after the initial implementation. Keeping things in perspective, establishing upper-level guideposts, and regular organization-wide communication will significantly improve the probability of success and reduce costly rework and consulting overruns.

Product Change Management

What makes for an effective change management process? A critical first step, which is often overlooked, is a Change Authorization. Basically, it is an approval to begin a change. Too often, changes are implemented with no 'authority'. This leads to several, and usually expensive, problems.

Without a controlled methodology to collect the product change request documentation and assess it for cost and implementation impacts, the result for the manufacturer could be wasted engineering and manufacturing planning time, costly tool changes, and scrap or quality spills.

A manufacturer should establish an electronic workflow process to manage the Change Authority, which will help organize the efforts of the cross-functional workforce. This is even more critical if the change affects many diverse locations or plants such as machining, painting, assembly, etc.

"Industry best practices suggest establishing a change board to assess the change information to grant an 'approval' to proceed within the change request before any work is performed," emphasized Dave Heap, Change Management System Lead at Mercury Marine. "The source of change could be engineering, manufacturing, quality, or supply chain."

For design-to-build or design-to-manufacture businesses, it is recommended to include the product program manager on the change board. For build-to-print businesses, include the primary customer liaison on the change board. These representatives should have significant input into the change approval.

In addition, the product change management process helps keep people inside and outside of engineering informed as part adjustments occur. This helps people in quality, purchasing, and manufacturing work better with engineering and product managers to make the best possible product decisions for Mercury Marine.

"As a program manager, I have led my change board through some controversial changes," emphasized John Bayless. "For example, I had an engineer recommend a change on a cowl seal for an outboard engine without regard to the capital budget. Luckily, the issue was resolved at the change board when the chief engineer suggested an alternative solution and the manufacturing engineer supported it with a lower-cost manufacturing option. In the end, we were able to meet the functional requirement and stay within the budget. Without the change board, we would not necessarily have had the opportunity to collaborate or the time within the program schedule to find an alternative solution."

To help manage changes, there are three change classes: A, B, and C. Class A changes apply only to new product development programs during the CV and DV phases. Once a program reaches the Production Validation (PV) phase, changes are handled like production-released parts as Class B, which affects fit, form, function, and product cost. In the case of a Class B change, the change initiator is asked to present their change proposal to the program-led Change Board, which decides how to proceed based upon the information provided by the requestor. A change board typically invites the cross-functional and cross-plant representatives affected by the change request to meet weekly.
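
The phase-to-class rule above can be captured in a few lines. A hedged sketch follows: the phase labels come from the article's milestone names, but the function itself is illustrative, and Class C is mentioned in the text without a stated definition, so it is left out.

```python
# Sketch of the change-classification rule described above: Class A applies to
# new-product programs in the CV and DV phases; from Production Validation (PV)
# onward, changes are treated like production-released parts (Class B).
# Class C is named in the text but not defined there, so it is omitted here.

NEW_PRODUCT_PHASES = {"CV", "DV1", "DV2"}                 # concept/design validation
PRODUCTION_PHASES = {"PV1", "PV2", "Production Release"}  # PV onward

def change_class(program_phase: str) -> str:
    """Map a program milestone phase to its change class."""
    if program_phase in NEW_PRODUCT_PHASES:
        return "A"  # handled within the new-product program
    if program_phase in PRODUCTION_PHASES:
        return "B"  # fit/form/function/cost impact; goes to the Change Board
    raise ValueError(f"unknown phase: {program_phase}")

print(change_class("DV2"))  # A
print(change_class("PV1"))  # B
```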

"Having a smooth change process for the new programs really helps keep the program under control during the design validation phase," said Bayless. "In that regard, the change process has helped us maintain our cost, quality, and timing targets."

Release Process and Configuration Control

In addition to managing the product structure, the release process and configuration control are also important. Release status becomes increasingly important as the product design matures. For example, once the CV design has been established, it should be locked down and managed separately from the latest working design. The build for each design milestone (CV, DV1, DV2, PV1, PV2, and Production Release) should be locked down in order to manage and track data better.

One common use case for this design management philosophy occurs during build execution. For example, when the procurement group purchases the parts to build the DV1 prototype, a locked version of the DV1 design must be created independently of the latest working, or evolving, design. This ensures that procurement is buying the correct version of each part for the DV1 build.
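
The baselining idea can be sketched minimally: a milestone lock is a frozen snapshot of part revisions, independent of the evolving working design. The data structures here are hypothetical stand-ins for what a PDM system manages internally.

```python
# Sketch of milestone configuration control: locking the DV1 baseline captures
# a snapshot of part revisions that later edits to the working design cannot
# change, so procurement buys against the locked DV1 revisions.
# The part numbers and dict-based structures are illustrative only.

import copy

def lock_baseline(working_design: dict, milestone: str, baselines: dict) -> None:
    """Deep-copy the current working design into a frozen milestone baseline."""
    baselines[milestone] = copy.deepcopy(working_design)

working = {"8M0012345": "A", "8M0067890": "B"}  # part number -> revision
baselines = {}
lock_baseline(working, "DV1", baselines)

working["8M0067890"] = "C"            # the design keeps evolving after the build
print(baselines["DV1"]["8M0067890"])  # B -- procurement still sees the DV1 rev
```

Without the snapshot, procurement would be reading the latest working design, where a part may already have been revised, replaced, or deleted, which is exactly the chaos the quote below describes.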

"Failure to follow this philosophy results in total chaos because when procurement checks the only design (latest working) while it manages the supplier relationship, it is likely that the part has been deleted from the design, incurred a revision change, or been replaced with another part," said Bala Shetty, PLM System Lead at Mercury Marine. "The resulting confusion between engineering and downstream consumers such as procurement or manufacturing is likely to significantly slow down the continual design evolution of the product, which causes the development budget to bloat with cost and time overruns."

It also increases the likelihood of a costly mistake, such as cutting tooling against the wrong design revision or ordering the wrong part revision, causing expensive build delays.

Product Change Workflows

While the change is being implemented by the organization, approvals and work acknowledgements must be gathered at each step. To best handle this information, a change workflow is used to capture the change authority and all associated revisions and approvals. As a best practice, deliverables such as final part design, manufacturing plans, tooling designs, etc. should be locked before providing them to supply chain or the shop floor.

Examples of this include workflows to manage process planning, tooling, programming, and supply base changes. This information must be associated to the overall change request to ensure traceability. Once each cross-functional participant has completed their portion of work, an acknowledgement should be provided to the workflow so all participants know how the change implementation is progressing.

To help ensure the change implementation is complete, a final verification step should be included within the overall change process. This final verifier makes sure the planning, programming, tooling, etc. have all been production released before hitting the shop floor. This approach reduces downstream issues and prevents costly quality escapements.
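
The acknowledgement-and-verification pattern described in the last two paragraphs can be sketched as a small state object. This is an illustrative model, not the Teamcenter workflow itself; all names are hypothetical.

```python
# Sketch of the change-workflow pattern: every cross-functional task attached
# to a change request posts an acknowledgement when done, and a final
# verification gate confirms all work is complete before release to the floor.
# Class, task, and ID names are illustrative, not Teamcenter objects.

from dataclasses import dataclass, field

@dataclass
class ChangeRequest:
    change_id: str
    tasks: set = field(default_factory=set)         # e.g. {"planning", "tooling"}
    acknowledged: set = field(default_factory=set)  # tasks reported complete

    def acknowledge(self, task: str) -> None:
        """Record a participant's completion acknowledgement, with traceability
        back to the overall change request."""
        if task not in self.tasks:
            raise ValueError(f"{task} is not part of change {self.change_id}")
        self.acknowledged.add(task)

    def final_verification(self) -> bool:
        """Gate: releasable only when every task has been acknowledged."""
        return self.acknowledged == self.tasks

cr = ChangeRequest("ECN-1042", tasks={"planning", "tooling", "programming"})
cr.acknowledge("planning")
cr.acknowledge("tooling")
print(cr.final_verification())  # False -- programming not yet acknowledged
cr.acknowledge("programming")
print(cr.final_verification())  # True
```

The final-verifier role in the text is this gate made human: someone checks that planning, programming, and tooling are all production-released before the change hits the shop floor.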

This change management framework is also effective in design, design-and-build, and build-to-order environments, and for all cross-functional processes including engineering, quality, cost collection, marketing, publications, and service.

Before Mercury Marine implemented its change management workflow in Teamcenter, product change information was managed using three disparate systems with different process flows at each plant or business unit. Today there is one change process with a change board approval for all product-related change activity. Following is the change flow process currently deployed within Teamcenter:

Changes can be initiated by any person or function within the company, with review/sign-off participants at any global location. By consolidating product change into one system with a single, consistent process, the average time required for an approval has been reduced from 56 to 22 days. In addition, all documentation and product parts affected by a change are managed within the change authority. This makes change traceability available to any Teamcenter user.

"The business formed a team to design their optimum change process and created an interface mock-up," stated Dave Heap, Change Management System Lead at Mercury Marine. "The development team was able to take that information and configure it within Teamcenter so the business could evaluate their design within the system. After a few iterations, the business team signed off and the change system went live."

Cost Engineering

Starting with core elements of a new target setting process, Mercury focused on incremental improvements in each area of every program until a high-performing, repeatable process was established. The Target Setting Process elements include:

Once the elements are working together, product teams are provided early, accurate data in an easily understood format that enables better decision making throughout the product development process.

This is where PLM helps drive cost management. Implementing a standard Large Assembly Management (LAM) methodology with established, consistent product structure across all development programs will drive BOM accuracy in addition to cost validation and reporting.

"Having a common BOM management technique allows for consistent downstream consumption of the product bill to enable standard program management dashboards and foster an environment for cross-functional collaboration," stated John Bayless, PLM Director at Mercury Marine. "Cross-functional participation in weekly product reviews, along with easily accessed data management by all program team stakeholders are keys to our success."

For example, Mercury Marine uses a phase-gate review process for managing new product programs headed by a Product Acceptance Committee (PAC). Program managers are expected to present their program to the PAC at a pre-set frequency, and requirements vary to pass each phase.

The Gate 0 review requires a written program charter based on market factors without actual cost data to launch a product idea feasibility study.

"Program team members work together to establish a Target cost for the product using BDI (DFM/DFA), supplier estimates, current prices, and best estimates based upon documented assumptions," said Gordon Flores, Cost Engineering Manager at Mercury Marine. "Estimates are also derived from actual BOM and plant costs."

As the product gains definition and all parties agree to Target costs, they are locked in at the system level. When the program manager returns to the PAC with more concrete details, the Target cost is written into the program contract at Gate 2.

From Gate 2 forward, the costs are closely monitored using Visual Management. Cost data is captured weekly from the program cost tracking tool, broken out by functional system.

"Having the design managers accountable for their system maintains their focus on cost targets throughout the program," emphasized Flores. "This would not be possible without the LAM and cross-functional access to information."

Cost engineers working closely with cross-functional contributors from design engineering, manufacturing, and supply chain are able to collaboratively solve issues that might prevent the targets from being achieved.

By working together, and using the Target Setting Process elements, program cost management is delivering results for Mercury Marine and its shareholders.


The interaction between program management and cost engineering, and how both work within a PLM deployment, has been a popular blog topic lately. Like many manufacturers, Mercury Marine has worked through challenges such as reconciling CAD or PDM system design content with ERP transactional data using Excel spreadsheets to manage new product programs.

While cost management is important to a program's success, it is just one facet. What often gets missed in all the fire-fighting, meetings, and organizational chaos is why the cost information isn't more easily available, or why it isn't populated sooner during a program. Plus, all time devoted to manual or semi-automatic data reconciliation takes away from the opportunity for cross-functional innovation.

Using the Large Assembly Management methodology as a foundation for managing the engineering bill of material allows other systems outside of Teamcenter to consistently access product structure and improve collaboration.

Effective product data management provides Mercury program teams with the foundation for improvements with product costing, change management, part reuse, and configuration control.

For example, the SharePoint-based Parts Planning Application (PPA) provides product teams with the ability to receive BOM structure changes automatically, along with attributes from Teamcenter, ProE, and ERP. Cross-functional team members can populate all program-related attributes using the Mercury Marine-developed .NET application while having read-only visibility to all necessary design and ERP attributes.

The PPA was developed to make product cost and cross-functional attribute management easy. The Mercury Marine-developed .NET application is hosted within SharePoint. The application automatically transfers the product structure and relevant engineering attributes from Teamcenter, as well as ERP attributes, into a spreadsheet-like interface within SharePoint. This provides global cross-functional contributors a robust environment to manage their role-specific attributes while having read-only visibility to information mastered in other systems. This application also enables faster program decisions and information accuracy across all product development phases.
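The merge the PPA performs can be sketched as follows. This is only an illustration of the pattern, not the PPA's actual schema or code: the field names, prefixes, and the `build_ppa_row` helper are all hypothetical.

```python
# Illustrative sketch of a PPA-style row: read-only attributes mastered in
# Teamcenter and ERP are merged with team-editable program attributes.
# All field names and sources here are assumptions, not the real PPA schema.

def build_ppa_row(part_number, teamcenter, erp, program_edits):
    row = {"part_number": part_number}
    row.update({f"tc_{k}": v for k, v in teamcenter.items()})   # read-only, mastered in Teamcenter
    row.update({f"erp_{k}": v for k, v in erp.items()})         # read-only, mastered in ERP
    row.update(program_edits)                                   # editable by the program team
    return row

row = build_ppa_row(
    "8M0012345",
    teamcenter={"revision": "B", "release_date": "2011-04-01"},
    erp={"std_cost": 12.40},
    program_edits={"make_buy": "buy", "long_lead": True},
)
assert row["tc_revision"] == "B" and row["make_buy"] == "buy"
```

Prefixing the mastered attributes keeps them visually distinct from the attributes the team owns, mirroring the read-only versus editable split described above.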

"By having the bill of material managed within ProE, then saved into Teamcenter and flowed into the PPA for cross-functional use and reporting, the product teams avoid all the manual reconciliation and data population issues associated with an Excel-based solution," commented John Bayless, director of program management for Mercury Marine. "Users can easily pick which functional level of the bill to report against, interrogate, and simultaneously contribute to from any location, at any time globally."

Information is provided by all cross-functional contributors, giving everyone on the project team instant visibility to the latest available information. This includes critical program metrics like product cost or weight roll-ups, or whether an important attribute like a drawing release date has been provided so purchasing can begin long-lead part procurement.

By doing this so early within the design concept stage, cross-functional teams can immediately start determining make/buy parts, establishing target costs, and identifying long lead time parts. Parts requiring tooling or special manufacturing processing are also identified and tracked early.

Program managers can open a single application (PPA) in SharePoint and know exactly how their program is performing. It also allows for more efficient team sessions because the meetings become more about providing status updates on progress, rather than resolving issues because the team members have proactively collaborated ahead of time.

This has the following benefits:

"This cost management strategy involves relying on information from ERP for current part costs, then using an aggressive quoting strategy for new parts," stated Gordon Flores, cost engineering manager. "As quotes come in, the team populates the Quoted Production Price attribute within the PPA, which is tracked by design revision. This allows the team to know which revision of the part was quoted, the cost provided at that time, and to rank the risk dollars of that quote."

For example, the Best Information Material Cost attribute, or the final product cost, is a calculated value within the PPA based on the Quoted Production Price plus all applicable burdens stored in the PPA. The program team works toward having a firmed-up Best Information Material Cost by the end of the program Design Validation (DV) phase.
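The calculation described above is a simple roll-up and can be sketched as follows. The burden names and rates are hypothetical; only the formula (quoted price plus applicable burdens) comes from the text.

```python
# Minimal sketch of the Best Information Material Cost roll-up described above.
# Burden categories and values are hypothetical examples.

def best_information_material_cost(quoted_price, burdens):
    """Quoted Production Price plus all applicable burdens stored in the PPA."""
    return quoted_price + sum(burdens.values())

bimc = best_information_material_cost(
    quoted_price=100.00,
    burdens={"freight": 4.50, "duty": 2.00, "packaging": 1.25},
)
assert bimc == 107.75
```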

Finally, visual management provides consistent presentation of product cost information over time via weekly feedback of cost run charts for each functional partition using visual management powered by the PPA.

When combined, these cost management techniques provide product teams with early, accurate data in an easily understood format that enables better decision making throughout the product development process. Having access to information to manage issues at the earliest opportunity is critical.

Large Assembly Management

Even with well-established CAD design standards, bill of material structure is often overlooked. Structure can vary across projects, products, and platforms, which frustrates downstream users (outside of engineering) trying to locate common components between products. Implementing a common large assembly management structure across similar products will solve these problems. Benefits of a well-organized large assembly management strategy include the following:

Large Assembly Management Objectives

Goals of Large Assembly Management

"To help achieve these goals, one recommended methodology is to have pre-defined Engineering Bill of Material levels within the CAD assembly product structure," suggested Bala Shetty, Solutions Architect: CAD/CAM/PLM Systems at Mercury Marine. For example, Level 1 may be the product top-level item that contains the entire end-model product structure. Level 2 generally contains placeholders (phantoms) for major functional systems such as cooling, base engine, electrical, drive system, etc. Level 3 will typically contain phantom items representing major subsystems under each Level 2 item. Physical sub-assemblies or items that are ordered or produced typically reside at Level 4 and below.
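The level scheme described above can be pictured as a nested structure. The product, system, and part names below are invented examples; only the level roles (top-level item, phantom systems, phantom subsystems, physical items) come from the text.

```python
# Sketch of the pre-defined EBOM levels described above; product, system,
# and part names are examples only.

product_structure = {
    "V6-Outboard": {                          # Level 1: product top-level item
        "COOLING (phantom)": {                # Level 2: major functional system
            "Water Pump (phantom)": [         # Level 3: major subsystem
                "pump-housing", "impeller-asm"  # Level 4+: physical items
            ],
        },
        "ELECTRICAL (phantom)": {
            "Harness (phantom)": ["main-harness"],
        },
    },
}

def count_physical_items(node):
    """Walk the tree and count the physical (ordered or produced) items."""
    if isinstance(node, list):
        return len(node)
    return sum(count_physical_items(child) for child in node.values())

assert count_physical_items(product_structure) == 3
```

Because the phantom levels are the same across programs, a downstream consumer can navigate any product's bill the same way.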

"Implementing a well thought-out Large Assembly Management initiative will provide the organization with a common engineering bill structure for product groups, making it easier for downstream consumers to understand and reuse the design information," emphasized Shetty. "It also provides a consistent process for engineering to interact between functional design departments and roles because project participants understand how their work contributes toward increased efficiency for the entire project."

Reuse of modular designs on future new products or iterations is another benefit.

PLM Enables Collaboration

In addition to managing information to deliver the product to market, an organization also needs to innovate for its long-term viability. To innovate, people need to communicate; communication enables the collaboration that sparks innovation.

The gap is in how to connect the people, processes, and systems to most effectively enable innovation through business intelligence.

As often happens, a galvanizing event occurred within the business to help provide the catalyst for the change necessary to push these connected solutions from prototype to global user acceptance.

In this case, a clean-sheet engine design was required to remain competitive in certain horsepower ranges. Recognizing that a potential downturn was on the horizon, Mercury's management agreed this development program must result in a modern, cost-competitive engine available for sale in the near future.

"During the engine's development lifecycle, the company was forced to reduce its workforce due to the downturn, placing even more pressure on the organization to get the program ready for market," noted John Bayless, director of program management for Mercury Marine. "To help manage these challenges, the organization turned to the PLM team to provide a path forward. The new product program teams have benefited tremendously from these PLM practices," emphasized Bayless. "One of the most beneficial aspects of this PLM strategy has been the early involvement of cross-functional contributors in new product programs."

For a new program, the design team starts with a product structure based on the large assembly management template mentioned earlier. Then the designers start populating the structure with part numbers for proposed new parts, as well as adding all the existing parts they believe are required at that time.

This is where PLM helps enable cross-functional collaboration. The standard large assembly management methodology based on the functional partitioning provides a consistent product structure across all development programs to drive standardization.

"Having a common BOM management technique allows for consistent downstream consumption of the product bill to enable standard program management dashboards and foster an environment for cross-functional collaboration," stated Bayless. "Cross-functional participation in weekly product reviews, along with easily accessed data management by all program team stakeholders are keys to our success."

By combining the large assembly and cost management disciplines, program teams have been able to consistently perform at, or under cost targets despite a new product launch cadence of every six weeks for several years.

Product Data Management

"Effective product data management provides program teams with the foundation for improvements with product costing, change management, part reuse, and configuration control," stated Lenny Grosh, PLM program manager for Mercury Marine.

For example, product data management within Teamcenter allows users to know product configurations throughout a program's lifecycle. Designers can lock down product structure at each major development milestone, such as CV, DV, and PV. This provides the ability to compare build structures as the program matures.
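A milestone-to-milestone comparison of locked structures can be sketched as a diff between two baselines. This is a simplified illustration: the baselines here are plain part-to-revision maps, whereas the real system captures far richer configuration data.

```python
# Illustrative sketch of comparing two locked milestone baselines (e.g. DV vs PV).
# A baseline is modeled as a simple {part_number: revision} map.

def diff_baselines(old, new):
    added   = {p: r for p, r in new.items() if p not in old}
    removed = {p: r for p, r in old.items() if p not in new}
    revised = {p: (old[p], new[p]) for p in old.keys() & new.keys()
               if old[p] != new[p]}
    return added, removed, revised

dv = {"pump-housing": "A", "impeller": "A", "gasket": "A"}
pv = {"pump-housing": "B", "impeller": "A", "seal": "A"}
added, removed, revised = diff_baselines(dv, pv)
assert added == {"seal": "A"}
assert removed == {"gasket": "A"}
assert revised == {"pump-housing": ("A", "B")}
```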

Product data management in Teamcenter also provides a single location for the cross-functional product development community to collaborate and exchange information. This enables downstream collaboration by enforcing standardization in both data structure and processes that ultimately allows Mercury Marine to introduce more complex products to market faster with a smaller workforce.

Products are not just collections of independent parts; they are connected systems intended to channel energy through the paths and mechanisms designed by the engineers.

"To achieve the highest level of system refinement customers expect requires collaboration from everyone involved in the new product development community," emphasized Bayless.

Another example of a connection is how our change management system in Teamcenter works closely with our part reuse strategy. There is a step to screen common parts during the drawing sign-off workflow, so someone can review a proposed common part design before it is accepted and classified.

"Our classification gatekeeper has final say in whether a new part can be admitted into the Teamcenter Classification database," mentioned Nate Hering, document controller for Mercury Marine.

Part reuse has many product program benefits for Mercury Marine. For example, every new fastener adds complexity to the procurement, quality, and manufacturing operations for the following reasons:

"Most important, the time spent re-creating an existing or similar part for lack of knowledge is time not spent refining an all-new design," noted John Bayless, PLM Director at Mercury Marine. "Wasting time on recreating common parts also takes time away from attending to product craftsmanship, which are the design details that really make a difference to the customer."

CAD Practices

As the PLM strategy evolved, it became critical that Mercury's CAD processes maximize data reuse opportunities downstream. For example, a Large Assembly Management (LAM) methodology allows designers to create standard product structure for new development programs. These structures establish an engineering bill of material structure based on design functional partitions, which is also how the engineering team is organized. By aligning the personnel with a consistent bill of material structure, the team achieves the following benefits:

Team members can efficiently consume accurate, 3D virtual versions of products using JT lightweight viewable files. These virtual product representations are used by tech publications, manufacturing, procurement, and program managers to more efficiently and effectively communicate information about a program or perform work faster and more accurately.

Effective CAD Design Standards

Efficient CAD design can make the difference between success and failure during new product development. If CAD models are inefficient and bloated, they are time consuming to open, and much more difficult to modify and use, unnecessarily extending the product development timeframe. In extreme cases, top-level models containing the entire design may not open at all.

"Downstream users may also be affected, because tooling design and CAM applications are typically based upon the engineering design models," stressed Bala Shetty, CAD/CAM Systems Lead at Mercury Marine. "It is also important to enforce the same CAD modeling best practices within areas outside of engineering that consume and develop CAD models to support production."

It is recommended to establish a policy that requires all new Product Development in CAD to be done using a common measurement system, whether Metric or English units.

We recommend that supplier models used within the product design should be in Pro/E format (native or imported) and contain all standard parameters and layers. The model should be solidified. The following sections reveal methods for handling supplier models:

Supplier Model in Native Pro/E Format

Supplier Model is Not in Pro/E Format

Best practices suggest that supplier assemblies should generally not be used as assemblies. A supplier item is usually treated as one item, even though it functions as an assembly. It is advantageous to convert the supplier assembly to a single part file, which is easier to handle and simplifies revision management. There may be cases where an assembly model is needed. The need for an assembly model should be justified by answering the following questions:

If the answer to any of these questions is "Yes", then individual parts are required.

Another consideration is how to manage manufacturing assemblies created for CNC Programming and Die/Mold design.

"Ideally, the teams responsible for manufacturing data will use similar methodologies and disciplines as the engineering team," said Shetty. "By providing a common framework based on standards and best practices, groups outside of engineering should be able to comply."

Developing standard CAD/PDM attributes and naming conventions for engineering and manufacturing use cases, along with a process map that enhances communication between engineering and manufacturing to better manage design changes, helps with overall data management.

Teamcenter 8.3

For performance improvements, version 8.3 provides faster internal code and enhanced latency support for 4-tier architecture implementations. Another key benefit is the store-and-forward capability for remote locations, which improves save times compared with previous versions of Teamcenter.

Other functionality enhancements include a totally redesigned change management module. Improvements include real types for change objects, allowing many more options for making behavior changes. Workflow control improvements, such as the ability to have separate approve and reject paths, provide better user interaction, while support for multiple outcomes enables real-time processes. Previous versions of change management are now designated as "Classic Change Management".

Issues management is another improvement that helps capture product issues from different sources with the ability to promote them to formal change requests. There is also a tighter integration between schedule manager and change management. Improvements with the Service Oriented Architecture (SOA) for building Teamcenter interfaces are also included with version 8.3.

For users of Teamcenter 2007 UA or earlier versions, there are many significant improvements beyond what is mentioned within this article. For more information about functionality within Teamcenter 8, visit

Version 8.3 New Installation Considerations

Version 8.3 gives you the ability to use 4-tier architecture and maintains acceptable performance at up to 300ms of latency without requiring multi-site overhead. Four-tier architecture is easier to install and maintain than 2-tier installations. Plus, performance with two-tier solutions degrades beyond 20ms of latency.

Other new installation architecture considerations include the support of either .NET/IIS or J2EE for the enterprise/web tier. Based on our experience, support for .NET/IIS is better than for J2EE.

Another consideration is with BMIDE template setup. Best practices suggest that multiple templates should be avoided whenever possible unless the sites are totally independent with very little data sharing. "Having a single template is ideal if both sites have the exact data model requirements," said Raj Sundaram, Solutions Architect. "If slight variations exist between sites, then a master template could be defined, with supersets of that template defined for each location. Generally, this is only a concern if a multi-site installation is necessary."

From a process perspective, item definition considerations are critical, and there is some new item uniqueness functionality to consider. Teamcenter 8.3 now allows uniqueness to be determined with a combination of item properties, such as item ID and type.
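The composite uniqueness idea described above can be sketched simply: the uniqueness key is a combination of properties (here item ID plus type) rather than the ID alone. This is an illustration of the concept, not Teamcenter's actual uniqueness mechanism, and the IDs and types are invented.

```python
# Sketch of composite item uniqueness: the key is (item ID, type) rather
# than the item ID by itself. IDs and type names are hypothetical.

existing = {("100234", "Design"), ("100234", "Manufacturing")}

def is_unique(item_id, item_type):
    """An item is allowed if its (ID, type) pair has not been used yet."""
    return (item_id, item_type) not in existing

# The same ID may coexist under a different type, but an exact pair is rejected.
assert is_unique("100234", "Document")
assert not is_unique("100234", "Design")
```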

Version 8.3 Upgrade Considerations

Based on our experience, we recommend a two-phase upgrade process. Phase one is performing the system upgrade using as-is configuration (no configuration enhancements). Once the system is upgraded and stable, then introduce configuration enhancements as a phase two activity.

"This approach keeps the upgrade scope more focused and manageable; introducing a new interface from the system upgrade plus additional functionality during the same go-live activity tends to overwhelm users," said Sundaram.

From an environment perspective, Mercury PLM Services suggests upgrading a sandbox environment first to learn the new version of Teamcenter and verify basic system performance. Then, upgrade a test environment and overlay the current configuration making sure to thoroughly validate the system within each environment. Finally, perform the upgrade on the live production environment once user training and final hardware updates are completed. Be sure to validate any customizations and third-party CAD integrations with the latest version of Teamcenter in both sandbox and test.

Another best practice is to take performance benchmarks before upgrading. This helps verify that performance has not degraded, or if it has, then the upgrade team has an opportunity to investigate remedies. It also allows the implementation team to provide the user community with functional expectations about the system, such as will it be faster or slower than the current-state installation.
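A benchmark of this kind can be as simple as timing a repeatable operation before and after the upgrade and comparing averages. The harness below is a generic sketch; in practice the timed operation would be a scripted Teamcenter transaction (a save, a structure expand, etc.), which is assumed here.

```python
# Minimal benchmark harness of the kind suggested above: time a repeatable
# operation several times and average the result. The operation shown is a
# stand-in for a real scripted PLM transaction.
import time

def benchmark(operation, runs=5):
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

# Capture a pre-upgrade average, then re-run the same call post-upgrade
# and compare the two numbers.
avg = benchmark(lambda: sum(range(100_000)))
assert avg > 0
```

Keeping the operation and run count identical across the before and after runs is what makes the comparison meaningful.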

During the upgrade process, we suggest establishing a central validation and issues-tracking location. Our team uses a SharePoint site to capture validation plans and testing issues, as well as assign people responsible for fixing the issues. This provides a tracking and corrective action verification mechanism for the upgrade and installation team. Upgrades from Teamcenter Engineering 2005 are supported, as Siemens supports upgrading from the last two major releases (2007 UA and TcE 2005). This is an important upgrade consideration because Teamcenter 9 will only support upgrades from other unified architecture versions (2007 UA and Teamcenter 8).

If upgrading Teamcenter Engineering 2005 in a multi-site configuration, the ODS (Object Directory Services) must be upgraded first. Upgrading from a clone of production is also essential. End-user training should also be planned for Teamcenter Engineering 2005 upgrades as some of the interface changes are significant.

IT Best Practices

This article is intended to provide guidance for setting up the PLM environment, as well as suggestions for improving the relationship between the PLM configuration team and IT. The following four concepts are critical:

"By working closely with the PLM team and the business, IT can form a cooperative and productive alliance," said Brad Draeger, IT Systems Architect at Mercury Marine. "Providing guidance on hardware and network requirements based on industry best practices and the unique requirements of PLM systems is very important because the environment sets the foundation for the whole user experience."

Relationship Building

Since the PLM system is often not IT-sponsored, it may not receive the same visibility and support as an ERP system even though it carries the same critical priority. Having corporate IT representation working with the PLM team helps ensure success when working through the roadblocks and politics that often accompany a large enterprise-wide deployment.

"As part of the corporate IT team, we have access to back-end systems and resources that an engineering-based PLM team typically does not," explained Draeger. "By being integrated with the PLM team, we can provide the infrastructure skills to properly size and support the hardware, while also being able to support PLM system patch and upgrade deployments. Working closely with the PLM team has proven to be a successful combination over time."

Teamcenter Multi-Site Consolidation

Many organizations using Teamcenter for a long period of time have had to implement multi-site installations because of performance issues across large geographies. With the latest versions of Teamcenter, version 8.3 in particular, supporting 300ms latency plus store and forward, system architects can now consider single-site implementations with similar or better performance than multi-site solutions. That sounds great, but how will the existing sites be consolidated into a single Teamcenter site?

"Before considering the site consolidation strategy, it is important to understand the reasons for consolidating," said Raj Sundaram, Solutions Architect. "It is a good idea to understand how much support is required to maintain the current multi-site installation." Start by tracking multi-site support requests by type, such as day-to-day issues, patches and upgrades, BMIDE changes, share and sync, and multi-site user training. If the day-to-day issues, share and sync, or multi-site user training categories cause significant support time on a regular basis, then consider consolidation.

Other considerations are hardware and infrastructure costs. Generally, the costs for maintaining a multi-site installation are three times higher than a comparable 4-tier installation. Cost considerations include servers, licenses, maintenance (hardware, OS, database), backup infrastructure, and skilled IT personnel.

"Once the support reasons, cost, and performance considerations are compiled, acceptable performance targets should be established for the system," said Sundaram. "Once those are established, it's time to start planning the site consolidation."


There are four key stakeholder groups to consider during the site consolidation planning process:

Stakeholder concerns:
Teamcenter team: technical team responsible for leading and configuring the system consolidation
IT: team responsible for the hardware, OS, database, and backup infrastructure portion of the project
Management: approves cost or resources for the consolidation and helps guide implementation timing
Key users/Process owners: provide user perspectives such as performance metrics and workflow considerations

Include ample time for data cleanliness in the consolidation plan. Before consolidation, consider resolving ownership problems using Multi-site Assist (MSA) to identify and fix issues. Depending upon the number of sites and data being consolidated, multiple iterations of the MSA may be necessary to properly extract, analyze, and fix data. Best practices are to maintain focus by concentrating on issues within the source site.

Consider using the data model approach for the site consolidation. The source model elements must be incorporated into the target data model. Consolidating with a superset template model is easier than reconciling two templates that share the same name but have differences. Even if they are supposed to be the same, be sure to run the bmide_comparator to see the actual differences between the two. It is also suggested that all Siemens and third-party template dependencies be verified. Using database_verify provides additional insights into site differences. Use care with type conversions during consolidation because they require mapping and can be very complex.

Strategy and Execution

There are three critical strategy and execution processes to consider during site consolidation - server relocation, target readiness, and groundwork.

Server relocation involves relocating the source server to the target location and converting all source clients to 4-tier access against the source server. This makes transfers easier because they do not have to cross the network; it is ideal but not required. Target readiness requires configuring the target BMIDE template with the necessary changes and deploying it, as well as configuring workflows, access rules, and other non-BMIDE configurations such as organization. Groundwork involves transferring ownership of all items that have not been modified during the past six months, as well as identifying and cancelling active processes that are no longer needed; this is a critical step.

Next, iterated transfers of data not modified within the past 6 months, 3 months, and 1 month should be performed, followed by communicating with users about check-outs and active processes.
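The iterated transfer waves can be sketched as a simple age filter applied in shrinking intervals, so each wave picks up items the previous one left behind. This is a schematic illustration, not the actual transfer tooling; item names and ages are invented, and ages are expressed in days for brevity.

```python
# Sketch of selecting items for each iterated transfer wave by last-modified
# age (roughly 6, 3, then 1 month), as described above. Ages are in days.

def transfer_wave(items, older_than_days, already_transferred):
    """Pick items untouched for at least the given period and not yet moved."""
    return [i for i, age in items.items()
            if age >= older_than_days and i not in already_transferred]

items = {"part-a": 200, "part-b": 100, "part-c": 40, "part-d": 10}
done = set()
for cutoff in (180, 90, 30):            # ~6, 3, 1 months
    done.update(transfer_wave(items, cutoff, done))
assert done == {"part-a", "part-b", "part-c"}   # part-d waits for go-live
```

Each wave moves progressively fresher data, shrinking the amount left to transfer during the go-live weekend.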

Best practices suggest a two-phase go-live process. Go-live phase one involves transferring all items, except those in process, over a weekend. Other go-live phase one steps include configuring source clients for both source and target sites, with the source site configured to only allow completion of active processes.

Go-live phase two includes cancelling or recreating (at target site) all active processes, transferring all items to the target site and then ensuring that the source clients only have access to the target site.

The final site consolidation step is system cleanup, which involves performing the following activities on all sites, including the target site:

"Performing a site consolidation requires careful planning and execution," suggests Sundaram. "Following suggested best practices and taking care during the consolidation should result in a successful outcome."

IT Infrastructure

Environment Strategy

As an industry best practice, it is important to have at least two robust environments: one for Test and another for Production. However, we also recommend having a third environment for Sandbox.

Sandbox provides an environment where the PLM system is configured out-of-the-box so the PLM configuration team can experiment with new versions of the software, install trial licenses for new modules, and in general just play with the base system to understand how it works. Mercury Marine uses VMWare ESXi for snapshots in case rollbacks are necessary from failed configuration changes.

Once the new configuration is understood, the next step is to move it from Sandbox into the Test environment. From our perspective, Test should be a full refresh of Production. However, the Mercury Marine team has also refreshed two production sites within Test to allow for multi-site environment testing. The team uses VMWare for snapshots and rollback throughout Test as various changes are validated.

The Production environment should be sized adequately to allow for the rigors of moving and storing the large amounts of data required to support an enterprise-wide PLM system. In the case of Mercury Marine, that includes four Windows servers and one AIX database server along with a remote volume. There are currently four multi-site configurations, although several are likely to be consolidated in the future. The team uses Oracle database backups and VMWare snapshots for Production environment rollbacks.

Remote Site Support

When working with remote sites, relationship building between the corporate IT and local support teams is critical. For example, a Windows operating system was chosen for the remote sites because that operating system is most familiar to the local team. Both teams coordinated the initial installation and support guidelines going forward.

Off-shore Connectivity

To give contractors and off-shore teams access to the Production PLM environment, one straightforward solution is an SSL VPN connection to several remote desktop systems located in the data center. This allows the user to fetch data to a local system in the datacenter and then move it to their offshore computer to make the changes. Once the changes are complete, the data is copied back to the datacenter desktop and checked into PLM.

Requirements Management

New product requirement gathering is typically handled as a cross-functional event where contributions are collected by marketing and warranty from a voice-of-the-customer (VOC) and specification perspective, then delivered to systems engineering for review. Systems engineers then break the higher-level requirements into categories such as functional, system, performance, and regulatory, and assign technical parameters to measure their attainment.

There are several commercially available requirements management tools that help project teams organize, link, and track requirements from early VOC through engineering validation. One of these tools is the Requirements Manager add-on module for Teamcenter from Siemens.

The Siemens module began as a stand-alone tool called Systems Engineering. Mercury Marine and other Systems Engineering customers provided a list of enhancements to Siemens that included the direct linkage of requirements to engineering design items, along with the ability to manage the requirement information efficiently all in one toolset. Siemens took these inputs and created the Requirements Manager module for Teamcenter Unified Architecture, which was initially released for Teamcenter 2007 and continues within versions 8.3 and 9.X.

"The ability to link the design bill-of-material in Structure Manager with the requirements using the Requirements Manager module is a critical development for efficiently managing requirements in a complex new product program," said Nathan Hering, requirements management lead at Mercury Marine.

Clean requirement document importing is the key to getting started with Requirements Manager. Creating the initial requirements using MS Word works well because it allows rich content such as graphs, charts, pictures, and tables to be embedded. Just make sure the charts, graphs, or flow charts are not linked to source content (via OLE, for example); otherwise the document import into Requirements Manager will fail. Instead, convert all OLE content to an image by using the "Paste As..." functionality within MS Word.

The outline structure functionality uses the MS Word headings (Heading 1, Heading 2, etc.) to automatically create an indented bill-of-material structure within Requirements Manager. In our experience, the cleanest imports come from documents whose MS Word paragraph outline levels correspond exactly with the intended level structure of the document.
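The heading-to-structure mapping can be sketched in a few lines. The style names follow MS Word's English defaults, and the sample document content is hypothetical:

```python
def build_outline(paragraphs):
    """Turn (style, text) pairs into (level, text) entries, mirroring how
    a heading outline becomes an indented requirement structure on import."""
    outline = []
    for style, text in paragraphs:
        if style.startswith("Heading "):
            level = int(style.split()[-1])  # 'Heading 2' -> level 2
            outline.append((level, text))
    return outline

def render(outline):
    """Indent each entry by its heading level for a quick visual check."""
    return "\n".join("  " * (lvl - 1) + txt for lvl, txt in outline)

doc = [
    ("Heading 1", "System Requirements"),
    ("Heading 2", "Performance"),
    ("Normal",    "The engine shall ..."),  # body text stays under its heading
    ("Heading 2", "Regulatory"),
]
print(render(build_outline(doc)))
```

If a paragraph's outline level does not match its intended depth, it lands at the wrong level of the structure, which is why the heading discipline above matters.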

Once the requirements are successfully imported from the MS Word document, we suggest associating the source document as a dataset under the specification revision. For example, the MS Word document 'Requirement Level Structure' version A can be attached as a dataset under the specification item.

One of the key benefits of Requirements Manager is the flexibility it provides for adding attributes so business users can contribute key information associated with each requirement. Any user can create views within Teamcenter that display selected attributes for efficient value editing, or for export to many formats including MS Excel via a 'Live' integration. This 'Live' integration instantly stores attribute values updated in a familiar tool like Excel back into Teamcenter.

Based on experience, we recommend keeping the number of attributes reasonable because the performance of loading views within Teamcenter deteriorates quickly as the attribute count increases. Views containing a handful of attributes help, but a report based on a view of all the attributes can take a long time to open if more than 15-20 attributes are included.

For example, one of our automotive clients that manufactures complex vehicle sub-systems uses Teamcenter Requirements Manager to track requirements provided by its OEM customer. The client project team configured attributes for cross-functional and cross-system teams to manage and disposition requirements based on their business processes.

Further, an unlimited number of templates can be configured in Teamcenter and linked to MS Excel to create a flexible yet robust reporting vehicle. Custom reports can also be developed using the Teamcenter ITK toolkit.

Establishing the linkages between customer requirements and testing requirements is part of the next step in configuration work within Requirements Manager for the client. Linkages between the requirements and engineering design are also part of that step. This can be very complex because the linkage mapping must support future reporting and requirement data management needs. Linkages are created within Requirements Manager using a functionality called Trace Linking.

"Trace linking allows traceability between requirements and design (CAD BOM) or documentation (specification or test plans) and other requirements (validation or regulatory)," said Hering. "For example, these linkages will graphically show the impact of changing a requirement upon other systems, documents, and requirements. This helps program teams stay aligned as changes are made throughout a project's lifecycle."
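The impact analysis Hering describes is, at its core, a walk over a graph of trace links. The sketch below is a generic breadth-first traversal, not Teamcenter's implementation, and all item names are hypothetical:

```python
from collections import deque

# Hypothetical trace links: each item maps to the items that depend on it
# (child requirements, design items, test plans).
TRACE_LINKS = {
    "REQ-100": ["REQ-110", "CAD-ENGINE-BLOCK"],
    "REQ-110": ["TEST-PLAN-7"],
    "CAD-ENGINE-BLOCK": [],
    "TEST-PLAN-7": [],
}

def impact_of_change(requirement):
    """Breadth-first walk of trace links: everything reachable from the
    changed requirement is potentially impacted by the change."""
    impacted, queue = set(), deque([requirement])
    while queue:
        item = queue.popleft()
        for linked in TRACE_LINKS.get(item, []):
            if linked not in impacted:
                impacted.add(linked)
                queue.append(linked)
    return sorted(impacted)

print(impact_of_change("REQ-100"))
# ['CAD-ENGINE-BLOCK', 'REQ-110', 'TEST-PLAN-7']
```

Changing REQ-100 flags not only the directly linked design item but also the test plan two links away, which is exactly the ripple effect program teams need to see.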

Product Data Management Aerospace

To help manage content from an OEM portal, for example, a SharePoint site accessible to all the plants could be created to store downloaded content awaiting review. Attributes such as customer, project, and download date could be associated with the downloaded files to maintain data segregation controls.
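One way to picture those attributes is as a small metadata record with a segregation filter. The field names and sample records below are illustrative, not an actual SharePoint schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DownloadedItem:
    """Metadata attached to OEM-portal content staged for review."""
    filename: str
    customer: str
    project: str
    download_date: date
    plant: str

def for_customer(items, customer):
    """Data segregation control: return only one customer's content."""
    return [i for i in items if i.customer == customer]

staged = [
    DownloadedItem("spec_rev_C.pdf", "OEM-A", "Bracket-7", date(2024, 1, 5), "Plant-1"),
    DownloadedItem("ncr_124.pdf",    "OEM-B", "Strut-3",   date(2024, 1, 6), "Plant-2"),
]
print([i.filename for i in for_customer(staged, "OEM-A")])  # ['spec_rev_C.pdf']
```

Keeping the attributes on the staged files means the segregation rule is enforced by metadata rather than by folder discipline alone.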

To review the downloaded content for relevancy, a change board or product specialist can be appointed per project or by the OEM to review the information in SharePoint and determine the level of impact.

Once the change has been initially reviewed and found applicable to one or more locations, the data should be imported into the PDM system and locked down to prevent alteration.

After the data is imported into the PDM system, a cross-plant and cross-functional change board can further discuss the requested changes to understand the impact across the organization, along with potential cost and timing considerations.

The change board can be a paper-based system, or ideally, it would reside within a change workflow inside a PDM system. This way, the design data (CAD), manufacturing planning, quality enforcement, tooling, OEM change instructions, and specifications can be linked against each other and the change authority to provide the full impact of the change.

Relevant content such as spreadsheets for cost and timing analysis, vendor quotes, and other related information can also be attached to the change authority to provide the change board with everything required to make an informed decision.

Once the change board has reviewed the change and discussed any impacts to cost and timing with the OEM customer, the requested change can be approved for implementation by the cross-functional teams across all affected plants based on the agreed-upon part effectivity plan.

By using linked content in the PDM system, the teams can work in a more coordinated manner. For example, the manufacturing planner can update routes and processes within the context of impacts to work instructions, NC programs, tools, inspection fixtures, etc. The quality engineer can participate in the manufacturing process review to ensure the quality inspection changes are captured and properly updated. Tooling designers correctly adjust jigs, fixtures, and gauges, while machine programmers revise the correct program version and know the latest version is delivered to the machine.

The final step in the change process within the PDM system is a verification step where all the "do tasks" are validated as complete and a production status is applied to the parts and change. This final validation step is often performed by a document controller or impartial internal auditor.
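The change process described above can be sketched as a linear workflow with a verification gate. The state names are illustrative labels for these steps, not a Teamcenter or PDM schema:

```python
# Hypothetical linear change workflow mirroring the steps described above.
WORKFLOW = [
    "review",      # change board assesses impact, cost, and timing
    "approved",    # teams agree on the part effectivity plan
    "implement",   # "do tasks": CAD, routing, tooling, NC, inspection
    "verify",      # document controller validates all do tasks complete
    "production",  # production status applied to the parts and change
]

def advance(state, do_tasks_complete=True):
    """Move a change to the next state; release is gated on do tasks."""
    if state == "verify" and not do_tasks_complete:
        raise ValueError("cannot release: open do tasks remain")
    i = WORKFLOW.index(state)
    if i + 1 >= len(WORKFLOW):
        raise ValueError("change is already in production")
    return WORKFLOW[i + 1]

print(advance("verify"))  # production
```

The gate in `advance` is the point of the final validation step: nothing reaches production status while any "do task" remains open.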

Since part and document traceability is critical in aerospace, if an electronic sign-off is not practical on the shop floor, the next best option is to collect the manual signatures from each production or quality step into a package and store it as a scanned PDF in SharePoint under a job identification number. Other relevant attributes could include date, customer OEM, project, plant, etc.

Following some of these suggestions may help reduce some of the noise and fire-fighting associated with supporting the parts manufacturing for a large aerospace OEM.

ROI Analysis

Consider the following costs to the enterprise when building the business case for the PLM implementation:

Within most organizations there are many other inefficiencies besides those mentioned above that can be considered when justifying a PLM implementation.
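A simple payback-period calculation can anchor the business case. Every figure below is a placeholder to show the arithmetic, not data from an actual implementation:

```python
# Hypothetical annual savings from inefficiencies a PLM system addresses.
annual_savings = {
    "reduced_engineering_rework":   250_000,
    "faster_change_processing":     120_000,
    "less_duplicate_part_creation":  80_000,
}
implementation_cost = 900_000  # software, services, internal labor (placeholder)

total_savings = sum(annual_savings.values())         # 450,000 per year
payback_years = implementation_cost / total_savings  # 2.0
print(f"Payback period: {payback_years:.1f} years")
```

Even a rough model like this forces the team to name each inefficiency and attach a defensible dollar figure to it.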


Presentation Outlines

These sample outlines are intended to provide guidance for PLM project leaders. Adjust the outlines based on project or organizational needs.


Talk With Us