Tuesday 20 August 2013

Introduction to Teamcenter Customization

 Teamcenter provides a variety of mechanisms for customizing Teamcenter to suit business requirements. Customization is built on the base framework of APIs provided by Teamcenter. In this blog I will discuss all the customization options and their architecture.
Customization Architecture
Teamcenter customization architecture can be broadly distinguished based on the Teamcenter technical architecture. It can be categorized into three layers.
  • Server or Enterprise Layer
  • Web Layer
  • Client Layer
The Client Layer covers portal (rich client) and thin client customization, which usually deals with the UI and with handling the data returned by server requests. The SOA client is the SOA API for calling SOA services; you can read about Teamcenter SOA services in detail in my SOA blogs. The Web Layer is nothing but the Teamcenter J2EE deployment layer, which communicates between server and client. Server customization is the core of all customization, as most of the business logic is written in this layer. It deals with all business transactions, interacting with the database through the Persistent Object Manager (POM) API. FMS is the resource layer that supports the actual file transfer between client and server through the FMS framework; for more detail on FMS you can visit my blog on Teamcenter FMS.

Server customization is done through the C-based APIs provided by Teamcenter, collectively called the Integration Toolkit (ITK). Apart from the customizations discussed above, there are SOA customization and BMIDE extensions, which are server customization, client/web customization, or both. The diagram below depicts the customization architecture of Teamcenter. As shown in the diagram, all BMIDE extensions sit on the server side. This is because most BMIDE extensions override or change object behavior based on business requirements, which can only be accomplished in the server layer; hence all extensions are implemented using the core ITK APIs of the server layer.


Based on the above customization architecture, Teamcenter customization can be categorized into the following areas.
  1. Server Customization
  2. Portal Customization
  3. Web or Thin Client Customization
  4. SOA-based Customization
  5. BMIDE Extension Customization
Server Customization: Server-side customization is the most frequently used customization, as all business logic is written in this layer. All requests for every Teamcenter transaction pass through the server layer; hence it is the core of Teamcenter customization. As discussed in Customization Architecture, Teamcenter provides a C-based API called the Integration Toolkit (ITK) for server-side customization. The toolkit provides hundreds of APIs for processing various business processes based on Teamcenter functionality, categorized by the various modules and functional areas of Teamcenter. ITK also provides various extension mechanisms to plug custom code into various Teamcenter events and object statuses. A detailed discussion of ITK customization is out of scope for this blog; I will cover it in another blog.
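To give a flavor of ITK, below is a minimal sketch of a standalone server-side utility. The header names and the use of ITK_auto_login and ITK_user_main as the login call and entry point reflect common ITK usage but vary by Teamcenter version, so treat this as an illustrative sketch rather than a ready-to-compile program.

```c
/* Minimal standalone ITK utility sketch (illustrative only;
   header names and entry-point conventions vary by Teamcenter version). */
#include <stdio.h>
#include <tc/tc.h>
#include <tc/emh.h>

extern int ITK_user_main(int argc, char *argv[])
{
    /* ITK_auto_login reads credentials from the -u/-p/-g command-line arguments */
    int status = ITK_auto_login();
    if (status != ITK_ok)
    {
        char *error_text = NULL;
        EMH_ask_error_text(status, &error_text);
        printf("Login failed: %s\n", error_text);
        MEM_free(error_text);
        return status;
    }

    /* ... custom business logic built from ITK calls goes here ... */

    ITK_exit_module(TRUE);
    return ITK_ok;
}
```

Utilities like this are compiled and linked against the Teamcenter libraries and run against a live server session.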

Portal Customization: The Teamcenter client layer is written on Java JFC and Eclipse SWT. The core client APIs were written on the Java JFC framework and are slowly being ported to the Eclipse SWT framework. Presently Teamcenter supports both JFC and SWT customization, but it is recommended to customize in SWT given Teamcenter's future direction. Portal customization can be done by extending OOTB plug-ins or by developing your own plug-in. Apart from the JFC/SWT UI APIs, the Teamcenter client API also provides the object interface component, an encapsulation of the Teamcenter data model through the client object model. This object interface component also forms the interface layer between client and server.

Web or Thin Client Customization: This customization is for the Teamcenter web client. Teamcenter provides a standard web interface for viewing and editing Teamcenter objects in a web browser. The web client is built on Asynchronous JavaScript and XML (AJAX) to allow dynamic loading of data in the browser; HTML pages are rendered by JavaScript from XML data. Most thin client customization is carried out through JavaScript, which handles the rendering as well as managing requests and responses from the web server. Both client-to-server requests and server-to-client responses in the Teamcenter thin client are standard HTTP messages.

SOA Customization: Also called Teamcenter services, these are standard SOA-based services provided by Teamcenter for integrating with third-party as well as custom clients. Teamcenter also provides a framework for creating your own custom SOA services. I have covered Teamcenter SOA services in detail in my SOA blogs.

BMIDE Extension Customization: This is mainly server customization using the Teamcenter Business Modeler IDE (BMIDE). BMIDE provides various extension mechanisms for implementing desired behavior in Teamcenter; examples include pre-action or post-action operations on business objects, runtime properties, etc. These extensions are implemented in the BMIDE environment by writing C/C++ server code, mainly using ITK APIs. The BMIDE framework creates the stub code and other required classes for the extension implementation; the developer only needs to implement the core logic of the extension. I will try to cover extension implementation in a future blog.
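For illustration, the body of such an extension is a C function with the handler signature for which BMIDE generates the stub; the function name below is hypothetical, and the exact message structure fields depend on the Teamcenter version, so this is only a sketch of the shape of an extension.

```c
/* Hypothetical BMIDE post-action extension sketch; BMIDE generates the
   registration and stub, the developer fills in the body. */
#include <tccore/method.h>

extern int MyCompany_post_create_handler(METHOD_message_t *msg, va_list args)
{
    int status = ITK_ok;

    /* msg identifies the operation and the business object it fired on;
       custom logic built from ITK calls (e.g. stamping a property) goes here */

    return status;
}
```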

Apart from the above customizations, the Teamcenter Dispatcher module can also be customized for required translation behavior. Most of the time a Dispatcher client has to be implemented for extracting translated files from, and loading them into, Teamcenter. The Dispatcher client framework is based on Teamcenter SOA services; mostly OOTB SOA APIs are used, along with the Dispatcher API, which encapsulates the more complex Teamcenter SOA API calls.

Source: http://teamcenterplm.blogspot.in/

Thursday 15 August 2013

Aras 9.4 Release Highlights

Aras Innovator Recent Highlights

With a 10X increase in speed, 10X reduction in memory and 6X smaller XML / AML package size, the latest release of Aras Innovator enables us to deliver best in class multi-CAD management performance.

The newest release of Aras Innovator® features a series of CAD-focused enhancements enabling Aras to deliver best-in-class multi-CAD management performance. The latest release also includes expanded support for the Microsoft platform and secure managed file transfer (MFT) capabilities called TRUaras from Aras partner Trubiquity, embedded in the Aras Innovator PLM platform. Companies with an Aras subscription package receive free upgrades to the latest release regardless of the amount of customization in the current deployment.


BEST IN CLASS MULTI-CAD PERFORMANCE

10X increase in speed, 10X reduction in memory and 6X smaller XML / AML package size make Aras Innovator the fastest PLM platform for Multi-CAD Check-in / Check-out. New Check-in / Check-out Manager capabilities streamline and automate the check in / out process for entire CAD assembly structures. With a rapid status check for an entire CAD structure, a single compressed metadata commit and high-performance parallel, asynchronous file check-ins for each level of the CAD structure, Aras users realize greater reliability and faster performance.

SECURE FILE EXCHANGE

Secure File Exchange leverages Aras’s connected cloud capabilities to enable managed file transfer from directly inside Aras Innovator, providing a secure way to transfer files and conduct regular data transactions while adding encryption, tracking and traceability. Secure File Exchange replaces disconnected multi-step procedures, such as FTP, and non-secure methods, such as email and consumer file sharing sites, to enable users to safely and confidently collaborate from within the Aras PLM environment across the extended enterprise with customers, suppliers, outsourcing partners and contract manufacturers.

TRUaras – Secure Managed File Transfer

TRUaras embeds Trubiquity’s advanced managed file transfer (MFT) inside the PLM work flow so that users can securely share and exchange large CAD files, technical data packages, manufacturing data, inspection and test results, and other intellectual property with full tracking and traceability. TRUaras is ideal for companies that need to exchange large sets of files with suppliers, such as CATIA, Creo and NX assemblies, or for companies that frequently transmit product designs such as PDX files to contract manufacturers.

TRUaras Highlights:
  • Powerful - Allows global supply chain partners with thousands of users to securely manage transfers of large, complex sets of files
  • Seamless - Embedded in the Aras PLM platform for an integrated user experience and maximum productivity
  • Performance - Industry leading transfer speeds with high availability and uptime
  • Encryption - Endpoint-to-endpoint security capabilities via authentication and multi-layer encryption including 128-bit SSL, 256-bit AES (Advanced Encryption Standard) and 1024-bit private key
  • Traceability - Complete audit trail of all file transfers including the date & time, companies and individuals that conducted file uploads/downloads
  • Visibility - Track and monitor file movement in real-time from pending transfer status to post transfer receipt
  • Compliance - Support for Odette OFTP2 compliance and export compliance such as ITAR and EAR (Export Administration Regulations)

EXPANDED MICROSOFT PLATFORM SUPPORT

Designed to take full advantage of the latest Microsoft platform products for the scalability and high availability required by global enterprises, Aras Innovator is enabled on Microsoft SQL Server 2012, Windows Server 2012, Windows 8 and Internet Explorer 10.

ADDITIONAL RELEASE HIGHLIGHTS
  • Silent Installer for Virtual Machines, Server Farms and Cloud Environments
  • Check-in / Check-out Manager automated conflict detection
  • Feature Activation for Easy Evaluation and Enablement of New Applications
  • Multi-language Installer Capabilities
  • New User Personalization & Configuration Capabilities
  • Drag & Drop files from Browser to Windows Folders
  • Expanded Multi-Level Table of Contents Navigation
  • Improved One-Click Excel & Word Data Export
  • Affected Items Impact Matrix Improvements
  • New Visual Indicators in the Grid Display
  • Ability to Execute Multiple onSearchDialog Events
  • OnBefore / OnAfter Passthrough Capabilities
  • API XML / AML Compression

Wednesday 14 August 2013

Solid Edge Productivity Summit 2013

Experience the latest in design and design management capabilities

Dear Customer,

Solid Edge ST6 will soon be announced and this event is your chance to get a sneak preview of the latest capabilities of this award-winning design system. Solid Edge developers have been hard at work to help you design and manage designs better.

Some key enhancements of Solid Edge ST6 are:
  • New tools that help engineers take part modeling to new levels
  • New simulation capabilities to automatically improve fit and function
  • Enhanced visual tools for managing design projects
  • Import entire projects from competitive systems into Solid Edge
We would like to invite you and your design team to a special event to show you the latest enhancements in Solid Edge ST6 and provide you a hands-on experience of how you can design better.
  • Bangalore on August 27 & 28, 2013
  • Ahmedabad on August 29, 2013
  • Vadodara on August 30, 2013
  • Pune on September 2 & 3, 2013
  • Gurgaon on September 5 & 6, 2013 

 

Tuesday 6 August 2013

Evolution of PLM

From frustrating file management to the convenience of cloud, as technology has advanced over time so too has PLM.

PLM was once a humble, paper-based system that tended to be slow, fragmented and a little disorderly, making innovation and product development a huge challenge. Think of your current PLM system but every document, spec and drawing is on paper. Change management meant walking around with an inter-office envelope collecting signatures and revision control was pretty much nonexistent.

Eventually, PLM became rudimentary desktop software. This improved collaboration and speed, but collaboration was only possible internally. Each location was an island with an independent system. Reuse and efficiency rates were in the basement and communication was spotty at best.

With the Internet, PLM went global. Locations all around the world - Paris, Mumbai, Shanghai and Houston - were finally connected in a way that allowed for easier collaboration. What Group B knew no longer depended on Group A's ability to share it - they all knew (and saw) the same thing.

The latest tech boost for PLM is, of course, the cloud. Today, not only are companies global, individual functions are too, with engineers in different time zones collaborating on the design of the very same component. In the cloud, PLM is free of desktop installations and other limitations, enabling true real-time collaboration and sharing of big data. And customers and suppliers are incorporated right into product development and innovation processes. With the cloud, speed increases, innovation increases, and profits increase. Who doesn't like that?

Monday 5 August 2013

Solid Edge ST6: Splines and Keypoint Curves

Splines are an integral tool when it comes to swoopy product design. For many users who are accustomed to more prismatic designs, splines can be a frighteningly uncontrolled way of working. ST6 has added some methods for controlling and evaluating splines.

The first enhancement I’d like to talk about is the Curve and Polygon Edit Points. The curve edit points are the points directly on the curve that you can move around to change the shape of the spline. ST6 has colored these red so you can distinguish them from the polygon edit points.

You can control these colors using Solid Edge Options > Colors, shown below.

Notice that the polygon edit points are always on the convex side of the spline. When the “polygon” (it’s only really a polygon when the spline is a closed loop) switches sides, the edit points change colors. So you’ll notice that by default the edit points on one side of the spline are light blue, and on the other side they are dark magenta. If all of your polygon edit points are on the same side of the spline (which only happens if the spline never flips convexity), they will all be the same color.

Some people prefer to work with splines only using polygons, some prefer to use the curve edit points. Most people who have a lot of experience with splines prefer to use the polygon points. I personally use a combination, depending on the type of edit I want to make. If I need the spline to go through a specific point, I use the curve points. If I’m trying to change the curvature of the spline, I use the polygon points. One of the nicer new functions of the polygon edit points is that they can auto-attach to other geometry when you drag them.

You can make relations between control points as well, by selecting the tool on the Relate group, then clicking on the spline itself, and then selecting two control points to relate to one another. Another item that may not be obvious is that you can apply a driven dimension directly to the spline that displays the local curvature with a radius value. As you move the dimension along the spline, its value will change to reflect the local radius. This is not for length, or distance between control points, but just for instantaneous curvature at any point along the spline. It is a read-only dimension; changing its value will not change the shape of the spline.

You can also use the CommandBar to control a lot of options of splines. To get this to show up, just select the spline itself. Notice that the control points are not displayed if the spline is not selected.

Key Point Curves

Ok, let’s call a spade a spade here. If the “curve” feature is really a 2D spline, the “key point curve” is really a 3D spline. Identifying things this way just makes it easier for me to know when to use which tool. I’ve been learning a lot about the software the last few weeks, and hope to continue to learn more. Most of that learning has been by making mistakes. One of the things I had to learn was that Synchronous Key Point Curves cannot be edited, while Ordered Key Point Curves can. This throws a wrench in my plan to put all sketch data in Synchronous mode, if I plan to use key point curves.

And here’s another tidbit about key point curves. If you just start sketching a KPC plinking down points in blank space, those points will all lie on a plane parallel to the screen orientation. Let that sink in a little. This may be the only time when the geometry you create depends on the angle from which you view it. And that plane will change if you rotate the view, putting a bend in the middle of a KPC with a few points made from one point of view and a few points made from another. Of course, things will also change if you click on a “key point”, which snaps that point of the curve to the key point. This is one of those things that might require a more detailed post later on.

Key Point Curves also have several enhancements in ST6. You can now create curvature continuous end conditions on the KPC. If the curve goes the wrong way (180 degrees from the tangency direction you want it to go), you can put a negative sign in front of the weighting number. The weighting number is also new, called a Numerical Magnitude Control. The handles at the ends of the curve shown in the image below allow you to change the end condition (when the KPC is attached to another curve or edge at the end) between Natural (no control), Normal to Face (in 3D), Tangent, and Curvature Continuous. Key point curves can be used for a number of things, mainly any time you need a 3D curve.

What are some things you have used key point curves for?

Thursday 1 August 2013

Solid Edge ST6: Holes, Threads, and Pattern Recognition

In order to talk about Hole and Pattern Recognition, I have to first talk about Hole features, and we might as well pick up Threads along the way. You can find the Hole icon on the Ribbon on the Solids group. The Hole icon has a dropdown that includes Hole, Thread, Slot and Recognize Holes.

Holes

The Hole feature creates Simple, Threaded, Tapered, Counterbored, or Countersunk holes as features. These are editable as holes in both Synchronous and Ordered modeling. If you are used to some of the other CAD packages out there, note that ST6 does not have a list of standard screw sizes from Machinery’s Handbook; Solid Edge requires you to enter these values on your own. You can save all the various sizes you use frequently so you don’t have to enter them a second time.

Threaded holes, however, do come with a list of inch and metric thread sizes, and callouts are added to the feature. For example, a 10-32 UNF hole is actually created with a .156” dia hole, which is close to the .159” tap drill size cited in my Machinery’s Handbook #27 (pg 1934). If this won’t work on your drawings, you can change the value in the Holes.TXT file found in your Solid Edge installation directory in the Preferences folder. Before you mess with this file, make sure to save a copy as a backup, and use comments to flag changes from the original. Be sure also to restart Solid Edge after making any changes to this file. The format of the data in the file is easy to understand and explained inside the file itself.

Changes you make to Holes.txt won’t update any existing holes, but new holes will use the new values entered. This file only covers threaded holes (and threaded protrusions). Other hole types (such as counterbored or countersunk) do not have standardized sizes assigned by Solid Edge, as mentioned earlier. (In case you’re wondering, yes, this would be a great point to add an enhancement request).

Threads

Threads can be applied to any internal or external cylindrical face using the Threads command (Home > Solids > Hole > Threads). Solid Edge will again use the Holes.txt file to size and classify the thread. A couple of tips are in order here. First, do not apply your own dimension to the cylindrical face; it will interfere with the function of the command. Second, the cylindrical face for a female thread must be modeled to the internal minor diameter as specified in Holes.txt, and the cylindrical face for a male thread must be modeled to the nominal diameter.

Hole Recognition
Hole recognition was added to Solid Edge in ST5, but I’m including it in this discussion with ST6’s Pattern Recognition because the two topics really go together. In order for Solid Edge to perform Pattern Recognition, the holes that you want to put into a pattern must first be in a hole feature. So if you get an imported part with a pattern of holes and you want to edit the pattern or the holes, you must first run hole recognition, and Solid Edge will place the geometry in one or more hole features, grouping identical instances into a single feature. Then you can edit the holes, and change all of the holes in a single feature to a different type of hole; for example, change 10 identical counterbored holes into 10 identical countersunk holes. You can change any of the hole parameters, such as counterbore depth, diameter, and so on.

After the hole is recognized and edited, you can recognize and edit the pattern or multiple patterns. You can find the Recognize Pattern tool in the Pattern group. For example, below is an imported part. Solid Edge recognized all 24 holes as a single hole feature, since they are all the same. Then Pattern Recognition saw two different patterns. I tried to get it to make a single pattern of two holes, but I wasn’t able to do that.

Pattern Recognition

If you are trying to recognize a pattern, and the icon won’t light up, one of two things is probably wrong. First, you might not have any hole features, so go through the Recognize Hole step. Second, you might not have anything selected. It’s best to select the hole collector (the entry at the top of the list of holes – this could be less confusing by using a different name or icon for the collector). If you select one of the holes listed under the collector, the icon lights up, but the interface for the function does not fully appear.

After the patterns have been recognized, the collector for this particular part only shows two hole features, and two pattern features. It shows two hole features because there is one seed feature for each pattern. It would be convenient in cases like the one shown here to have two seed features for a single pattern, but that may be an enhancement for a later release.
Summary

ST6's new patterning and pattern recognition are slick. Combined with last year's hole recognition, you've got some great tools for making native changes to imported holes and patterns. I'd like to put some standardized hole sizes on my wish list for the Hole feature for next year.


Wednesday 31 July 2013

Teamcenter POM Query

POM Query is one of the important ITK modules in Teamcenter from the perspective of data extraction and performance. The Teamcenter POM query is an API layer provided to query the database through APIs rather than querying the database directly, as Teamcenter doesn't officially expose the underlying database design. Developers often prefer a POM query over a series of ITK API calls to get the desired objects from Teamcenter, both for performance and for retrieving the desired objects in a single set of calls. Once you understand the POM query mechanism, it is very easy to implement complex query cases with it rather than going through lengthy sequences of ITK API calls. In this blog I will give the basics of POM queries; with this basic understanding you can build complex queries. I am assuming the reader has a basic understanding of the Teamcenter data model; if not, please refer to my previous blog on the Teamcenter Data Model.
Introduction 
A POM query is nothing but a SQL query wrapped in an ITK program for extracting data from the Teamcenter database. I will explain POM queries through a simple SQL example, which we will then convert to a POM query. Let's assume we want to extract an item based on a specific item id and item type. If we were to do it through SQL, the statement would look like this:
select puid from item where item_id = '1234' and object_type = 'Item';
So there are three main constituents of any SQL query:

  • Select (attributes)
  • From (table)
  • Where (condition), with AND/OR between conditions

A SQL statement is a function of the above three aspects. If you want to convert the statement above into a POM query, these aspects form the building blocks of the POM query.
Following are the basic characteristics of a POM query:

  • A POM query has a unique identification (name).
  • A POM query selects attributes from POM classes.
  • A POM query has expressions which specify the where conditions.
  • Expressions are bound together through POM query APIs with logical clauses.
  • A POM query must be executed to get the results.
Steps for building a POM Query

  1. Create the query with a unique name.
  2. Add select attributes to the POM query by defining each attribute and its corresponding POM class.
  3. Build the query with all specified expressions/conditions against the query identification.
  4. Bind the expressions with logical operators.
  5. Execute the query and get the results.
  6. Delete the query.
Let's see how the sample SQL statement can be converted to a POM query.

Create Query
First, create the query with a unique identification:
POM_enquiry_create("get_itemid");
Teamcenter identifies a query through its unique string name in a given session; hence it is good practice to delete the query after it is used.
Select attributes
const char *select_attr_list[] = {"puid"};
POM_enquiry_add_select_attrs("get_itemid", "Item", 1, select_attr_list);
The above API sets the select attributes against a POM class ("Item" in this case). You can define multiple select attributes in the array and pass their count to the API; we pass 1 because we have only one select attribute in our case.
Build Expression
const char *itemid[] = {"1234"};
POM_enquiry_set_attr_expr("get_itemid", "ExprId1", "Item", "item_id", POM_enquiry_equal, "valueId1");
POM_enquiry_set_string_value("get_itemid", "valueId1", 1, itemid, POM_enquiry_bind_value);

The above sets the condition expression of the query, equivalent to item_id = '1234'. The expression is identified by a unique string identifier, ExprId1 in this case. The value has to be bound through its own unique identifier because different data types bind through different APIs: the value identifier valueId1 is bound to a value through the API call appropriate to the attribute type. In our case the binding is to a string attribute, hence we call the set_string_value API. For other attribute data types you call the corresponding API. The following data types are supported for POM queries:
int: POM_enquiry_set_int_value
double: POM_enquiry_set_double_value
char: POM_enquiry_set_string_value
string: POM_enquiry_set_string_value
logical: POM_enquiry_set_logical_value
date: POM_enquiry_set_date_value
tag: POM_enquiry_set_tag_value
The expression is bound to the query by providing the query identification, "get_itemid" in our case. A similar expression is built for the other condition, on object type:
const char *itemtype[] = {"Item"};
POM_enquiry_set_attr_expr("get_itemid", "ExprId2", "Item", "object_type", POM_enquiry_equal, "valueId2");
POM_enquiry_set_string_value("get_itemid", "valueId2", 1, itemtype, POM_enquiry_bind_value);
Expression Binding
Now the two expressions should be combined into the where clause. The logical binding between expressions is done through the API call:
POM_enquiry_set_expr("get_itemid", "ExprId3", "ExprId1", POM_enquiry_and, "ExprId2");
The above API binds ExprId1 and ExprId2 with an AND clause. This is equivalent to:
item_id = '1234' and object_type = 'Item'
A new expression id is created to identify the binding; this expression id can now be used to build more complex bindings when there are more than two condition clauses. Expressions can be bound with AND, OR and NOT conditions, similar to SQL condition binding.
Once the expression binding is complete, we set it as the where clause of the query by calling:
POM_enquiry_set_where_expr("get_itemid", "ExprId3");
This sets the where clause to expression ExprId3, which is the binding expression for ExprId1 and ExprId2.

Query Execution
The above steps complete the POM query, which is now equivalent to the SQL query. The query now has to be executed, which is done by calling:
POM_enquiry_execute("get_itemid", &rows, &cols, &results);
where rows, cols and results are outputs:
rows: number of result rows
cols: number of columns in each result row
results: the results of the query as a two-dimensional array of void pointers
The binding described above can be better understood from the diagram below.

Once the query is executed and the results are stored in the array, they must be extracted and type-cast to the specific type based on the select attributes of the POM query. In our example we selected puid, which is nothing but the object tag, so we must convert the output to a tag pointer. The pseudo code below shows how to extract the results and store them in a tag array.
if (rows > 0)
{
      tag_t *objs = NULL;

      objs = (tag_t *)MEM_alloc(rows * (int)sizeof(tag_t));
      for (int i = 0; i < rows; i++)
      {
            objs[i] = *((tag_t *)results[i][0]);
      }
}
Once the results are stored after the type cast, these tags can be used like any other object tag in Teamcenter.

Delete Query
After executing the query and storing the results in the appropriate object type, we need to delete the query. Remember that each query is identified through its unique string name; if we don't delete it, the query remains registered in the session, and if the same code is hit again it will throw an error because a query with that name is already registered.
POM_enquiry_delete("get_itemid");
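Putting it all together, the walkthrough above can be consolidated into one routine. As with the snippets above, this is an illustrative sketch: per-call status checking is omitted for brevity, though in real code every POM_enquiry call returns an ITK status that should be checked against ITK_ok.

```c
/* Consolidated POM query sketch: find objects with item_id = "1234"
   and object_type = "Item" (per-call status checks omitted). */
static int find_items(int *n_found, tag_t **items)
{
    const char *select_attr_list[] = { "puid" };
    const char *itemid[]           = { "1234" };
    const char *itemtype[]         = { "Item" };
    int rows = 0, cols = 0;
    void ***results = NULL;

    POM_enquiry_create("get_itemid");
    POM_enquiry_add_select_attrs("get_itemid", "Item", 1, select_attr_list);

    /* item_id = "1234" */
    POM_enquiry_set_attr_expr("get_itemid", "ExprId1", "Item", "item_id",
                              POM_enquiry_equal, "valueId1");
    POM_enquiry_set_string_value("get_itemid", "valueId1", 1, itemid,
                                 POM_enquiry_bind_value);

    /* object_type = "Item" */
    POM_enquiry_set_attr_expr("get_itemid", "ExprId2", "Item", "object_type",
                              POM_enquiry_equal, "valueId2");
    POM_enquiry_set_string_value("get_itemid", "valueId2", 1, itemtype,
                                 POM_enquiry_bind_value);

    /* ExprId3 = ExprId1 AND ExprId2, used as the where clause */
    POM_enquiry_set_expr("get_itemid", "ExprId3", "ExprId1",
                         POM_enquiry_and, "ExprId2");
    POM_enquiry_set_where_expr("get_itemid", "ExprId3");

    POM_enquiry_execute("get_itemid", &rows, &cols, &results);

    /* extract the puid column and cast to tags */
    *n_found = rows;
    *items = (tag_t *)MEM_alloc(rows * (int)sizeof(tag_t));
    for (int i = 0; i < rows; i++)
        (*items)[i] = *((tag_t *)results[i][0]);

    POM_enquiry_delete("get_itemid");
    return ITK_ok;
}
```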
That's all for the introduction to POM queries. Once you understand the basics, you can implement various complex queries by joining two tables and building multiple expression hierarchies. Most SQL statements can be converted to POM queries. For complex queries, I suggest first visualizing the query as a SQL statement and then designing the POM query.
http://teamcenterplm.blogspot.in/

Teamcenter FMS Overview

File Management System (FMS) is the Teamcenter component for managing files, or vaults, in Teamcenter. FMS is responsible for all file-related transactions between the Teamcenter server and client. In this blog we will discuss the basic architecture of FMS and its interaction with the Teamcenter application.
FMS Overview:
FMS is an independent tool which runs as a service on the server (as FSC) and on the client machine (as FCC). The Teamcenter application tier and client tier interact with the FMS framework through the HTTP or HTTPS protocol. The two components of FMS are the FMS Server Cache (FSC) and the FMS Client Cache (FCC). As the names suggest, the FSC is a service running on the server side which caches files on the server and serves requests from multiple users, whereas the FCC runs on the client machine, where it serves requests for a single user and interacts with the FSC to get the latest or new files from the server.
Architecture of FMS:
As discussed in the FMS overview, FMS has two components: FSC and FCC. A basic installation usually has one FSC and multiple FCCs, depending on the number of users using the Teamcenter client; each portal client has one FCC running on the client machine. But in a production environment, users may be spread across multiple geographical locations, or the number of users may be so high that a single FSC can't serve them all. Also, if volumes are mounted on different servers, an FSC is required on each volume server, as an FSC is mandatory for every volume server. Hence we need multiple FSCs running on different servers to serve different geographies, sets of users, or volume servers, distributed so that each is near its geographical location. With a multiple-FSC architecture we then need to define one FSC server as the master for managing requests and routing them to the other FSC servers. The diagram below shows the FMS architecture.
FMS Configuration
Configuration of FMS is managed through XML files. Basically there are three types of files:
·         FMS Master
·         FSC
·         FCC
The FMS master configuration file resides on the master FSC server. It defines the various FSC sites in the cluster, organized into FSC groups. Apart from FSC information, it may contain information about the volumes associated with each FSC. It also holds default configuration values for the FSC and FCC, which can be overridden in the respective configuration files.
An FSC configuration file is installed on each FSC server. The FSC configuration contains two main elements:
FMSMaster : Defines the FMS master location from which the FSC can read the FMS master configuration file. The FMS master information helps the FSC route a file request when the file does not reside in its own volume or cache.
FSC : Defines the details of the FSC installed on the server. It has various parameters that define file-transfer characteristics as well as error and log settings. It also has parameters for the FSC file cache, including the cache location. The parameter values are decided based on load, file sizes, performance requirements, and the overall FSC architecture.
An FCC configuration file is installed on each client. It has two main elements:
fccdefault : Overrides the FCC configuration received from the FSC. It holds various configuration parameters related to the client cache and requests.
parentfsc : Defines the FSC that the FCC contacts to download the FMS configuration. You can define multiple FSCs as backups for failover.
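To make the structure concrete, a minimal FCC configuration along the lines described above might look like the following sketch. This is illustrative only: the host names, port, cache path, and property name are placeholders, and the exact element and attribute names should be verified against the FMS documentation for your Teamcenter version.

```xml
<!-- Hypothetical FCC configuration sketch; hosts, port, path, and the
     property name are placeholders, not values from a real installation. -->
<fccdefaults>
   <!-- client-side cache settings, overriding the defaults sent by the FSC -->
   <property name="FCC_CacheLocation" value="C:\Temp\FCCCache" overridable="true"/>
</fccdefaults>
<!-- FSCs the FCC contacts for configuration and files; lower priority
     value is tried first, the second entry acts as a failover backup -->
<parentfsc address="http://fsc-primary.example.com:4544" priority="0"/>
<parentfsc address="http://fsc-backup.example.com:4544" priority="1"/>
```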
Communication Flow between FMS and Teamcenter :
Below is the process for communication between Teamcenter and FMS.
1.       A user tries to retrieve a file from a dataset.
2.       Whenever a user requests a file in Teamcenter, the application server forwards the request to FMS to retrieve the file from the vault.
3.       FMS creates an FMS ticket corresponding to the file retrieval from the vault. The FMS ticket is sent to the client, which then makes a request to FMS with that ticket.
4.       The FMS request is routed to the FCC installed on the client for file retrieval.
5.       The FCC checks whether the file is cached in the FCC and unmodified. The modification check uses the concept of a GUID, which is associated with every file in Teamcenter. The GUID is a business-neutral identifier for the file’s contents, used to determine when a file can be pulled from the local cache. Every file in a Teamcenter vault has a single file GUID associated with every replicated copy of the file, and any change to the file results in a new GUID. This is how the FCC detects modifications.
6.       If the file does not reside in the FCC, or has changed, the FCC sends a request to the FSC associated with the site ID. If the FCC is configured with multiple FSCs for a given site ID, the priority defines the FSC request sequence.
7.       The FSC checks whether the file is cached on its own server or belongs to its own volume. Otherwise it forwards the request to the corresponding FSC; the other FSC site information is retrieved from the FMS master configuration file.
8.       The FSC sends the file to the FCC, which in turn routes it to the requesting client.
The diagram below depicts the overall flow of an FMS request.
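The GUID-based cache check in steps 5 through 8 can be sketched as a small simulation. This is purely illustrative Python, not Teamcenter API code; all class and variable names here are hypothetical, and routing between multiple FSC sites is omitted.

```python
# Illustrative sketch of the FMS cache lookup described above.
# A changed file gets a new GUID, so a cache hit on the GUID
# means the cached copy is still current.

class FSC:
    """Server-side cache sitting in front of the volume (vault)."""
    def __init__(self, volume):
        self.cache = {}
        self.volume = volume          # guid -> file bytes in the vault

    def get_file(self, guid):
        # Step 7: serve from the FSC cache or from its own volume;
        # forwarding to another FSC site is omitted from this sketch.
        if guid not in self.cache:
            self.cache[guid] = self.volume[guid]
        return self.cache[guid]

class FCC:
    """Client-side cache: files keyed by their content GUID."""
    def __init__(self, parent_fsc):
        self.cache = {}               # guid -> file bytes
        self.parent_fsc = parent_fsc

    def get_file(self, guid):
        # Step 5: GUID present in the local cache -> file is unmodified.
        if guid in self.cache:
            return self.cache[guid]
        # Step 6: cache miss (new or changed file) -> ask the parent FSC.
        data = self.parent_fsc.get_file(guid)
        self.cache[guid] = data
        return data

volume = {"guid-123": b"drawing revision A"}
fcc = FCC(FSC(volume))
first = fcc.get_file("guid-123")    # fetched via the FSC, then cached
second = fcc.get_file("guid-123")   # served from the local FCC cache
```

The key point the sketch captures is that the FCC never compares file contents or timestamps; the GUID alone decides whether the local copy can be reused.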
Hope this helps in understanding how FMS works and how it is configured.

http://teamcenterplm.blogspot.in

Teamcenter Data Model

The data model is the core of any packaged software. To gain good technical command of any package, it is important to understand its data model well, and Teamcenter is no different. In this blog, I will explain the basic data model of Teamcenter as well as the corresponding schema in the database. This will help people new to Teamcenter gain a better understanding of the system.
The Teamcenter data model can be categorized into three distinct layers. They are
·         POM or Schema Layer
·         Business and Relation Object Layer
·         Business Rules
POM, the Persistence Object Model, is the lowest layer; it represents the mapping to Teamcenter’s underlying database. It is not always a one-to-one mapping, but for most classes it is closest to the database tables. Developers should know the POM layer in detail in order to customize and extend the system.
The Business and Relation Object layer resides above the POM layer. This layer represents the actual entities of the business and its processes; business analysts and solution architects mainly interact at this layer. Business objects and relations define the overall data model from the business-process perspective.
Business rules are the top layer of the data model. This layer governs business-object behavior based on the rules configured in BMIDE. Business rules, together with business objects, encapsulate the overall PLM business process. Teamcenter provides both configurable rules, such as naming rules and conditions, and custom rules, such as extensions.
Below diagram shows the basic building block of Teamcenter Data Model.
 

POM Schema of Teamcenter Data Model:
The Teamcenter data model schema is hierarchy based, meaning there is a base-level object from which all objects in the system are derived. The base object in Teamcenter is called POM_object; it is the parent of every object defined in Teamcenter. POM-level objects are represented as tables in the Teamcenter database, and every derived class in the data model has a corresponding table. Under POM_object there are many immediate child classes, mainly used as storage classes, such as the form storage class. One important class among them is POM_application_object, which matters because it represents all business objects of Teamcenter. WorkspaceObject, the parent of all objects a user can see in Teamcenter, is derived from POM_application_object.
All business classes in Teamcenter are derived, directly or indirectly (through the hierarchy), from WorkspaceObject. For example, the Item class is derived from WorkspaceObject, and the same holds for Folder, Dataset, and ItemRevision. The diagram below shows the class hierarchy for the basic workspace objects.
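The hierarchy just described can be modeled, purely for illustration, as a chain of Python classes. This is a conceptual sketch of the inheritance structure, not Teamcenter code.

```python
# Hypothetical sketch of the POM class hierarchy described above,
# modeled as empty Python classes to show the inheritance chain.

class POM_object: pass                               # root of the schema
class POM_application_object(POM_object): pass       # base for business objects
class WorkspaceObject(POM_application_object): pass  # user-visible objects

class Item(WorkspaceObject): pass
class ItemRevision(WorkspaceObject): pass
class Folder(WorkspaceObject): pass
class Dataset(WorkspaceObject): pass

# Every business class reaches POM_object through the hierarchy:
assert issubclass(Item, POM_object)
assert issubclass(Dataset, POM_application_object)
```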
 

Most of the time you create a custom type by extending the data model of an Item or Form type. Once deployed from BMIDE, this creates a new table in the database whose columns are the custom attributes defined in BMIDE. Derived classes automatically inherit their parent’s attributes, so a child’s attributes are the combination of the parent’s attributes plus its own.
Business Object:
The building block of Teamcenter is the business object. It resides above the POM objects, or database classes. A business object can be seen as the representation of a real-life entity; the underlying objects are still persistent schema classes. Teamcenter UA provides hundreds of OOTB business objects. The following are the major characteristics of business objects:
1)      Business objects are related to each other through relations.
2)      Business objects have properties, which can be persistent (attributes from the underlying classes) or dynamic (evaluated at run time).
3)      Business-object behavior can be controlled through rules defined in BMIDE. A rule can be either configurable (e.g., naming rules) or custom (extensions, user exits, etc.).
GRM Relation: The Teamcenter relation is the second building block. Relations define the interdependencies between business objects. In Teamcenter, relations fall into two groups:
a)      Reference by : The business object’s underlying schema class has a direct reference to another object through an attribute. It can be compared to a pointer reference to another class in object-oriented terms. For example, POM_application_object has references to its owning group and user.
b)      GRM relation : Here the relation between objects is created by a relation object, which links the two business objects through the concept of primary and secondary objects. The advantage of a GRM relation over a direct reference is greater flexibility in defining business rules: for example, you can define deep-copy rules or GRM rules, and different relation types can be created to carry different business rules.
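The difference between the two relation styles can be sketched as follows. This is illustrative Python, not Teamcenter API code; the class names and attributes are hypothetical (only the relation type name IMAN_specification is a real Teamcenter relation type).

```python
# Illustrative sketch of "reference by" vs. a GRM relation object.

class BusinessObject:
    def __init__(self, name, owning_user=None):
        self.name = name
        self.owning_user = owning_user   # "reference by": a direct attribute

class GRMRelation:
    """A relation object encapsulating a primary and a secondary object.
    Because the relation is itself an object with a type, rules such as
    deep-copy rules can be attached to the relation type instead of to
    the related objects themselves."""
    def __init__(self, rel_type, primary, secondary):
        self.rel_type = rel_type
        self.primary = primary
        self.secondary = secondary

item = BusinessObject("000123/A", owning_user="jdoe")    # direct reference
spec = BusinessObject("000123-spec.pdf")
rel = GRMRelation("IMAN_specification", item, spec)      # GRM-style link
```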
Property:
Properties describe business objects. All attributes present in the underlying POM class of a business object automatically become properties of that business object. Apart from persistent properties, there are properties that are derived from related objects or computed at run time by custom code. Teamcenter properties can be classified into the following four categories:
a)      Persistent property : An attribute stored in the database, defined in the underlying schema class.
b)      Compound property : A property that surfaces a property of another object related to the target business object through a reference or relation. An example is a form property shown on an Item or Item Revision.
c)       Runtime property : A property defined dynamically through custom code, which executes when the property value is fetched from the server.
d)      Relation : A property that defines the relation between the target object and a source object.
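The first three categories can be contrasted in a small sketch. This is conceptual Python, not Teamcenter code; the class, attribute, and property names are all hypothetical.

```python
# Hypothetical sketch of the property categories above: a persistent
# attribute stored on the object, a compound property propagated from a
# related object, and a runtime property computed on access.

class ItemRevision:
    def __init__(self, item_id, revision, related_form):
        self.item_id = item_id       # persistent: stored in the POM table
        self.revision = revision     # persistent
        self._form = related_form    # related object (reached via a relation)

    @property
    def form_weight(self):
        # compound: surfaces the related form's attribute on the revision
        return self._form["weight"]

    @property
    def display_name(self):
        # runtime: computed by custom code when fetched, never stored
        return f"{self.item_id}/{self.revision}"

rev = ItemRevision("000123", "A", related_form={"weight": 2.5})
print(rev.display_name)   # runtime property -> 000123/A
print(rev.form_weight)    # compound property -> 2.5
```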
That’s all from the Teamcenter basic data model perspective. Hope this provides a good starting point for people who want to understand the Teamcenter data model.
Source: http://teamcenterplm.blogspot.in

Monday 29 July 2013

DO MORE with Your PLM Software

We all know Aras is a full-featured enterprise PLM system, offering functionality such as complex configuration management, change management, product management, etc., but what you may not realize is that Aras goes way beyond "traditional PLM."

Recently I sat down with Rob McAveney, Director of Product Management, and he told me about several different ways in which customers are using Aras, including tooling management, formula & recipe management and customer relations management.

Aras manages the entire product lifecycle, not just the CAD design phase. Aras is involved in manufacturing planning & execution, quality systems, the extended supply chain and maintenance, repair & overhaul.
Watch the video to learn more and see how you can apply Aras in your business.


Tuesday 23 July 2013

Change Management: There is No Silver Bullet

  Register Now!

When it comes to change management, there is no one size fits all. While some may argue there are "industry best practices," the reality is that your business has unique change process requirements. After all, if every company in your industry had the same processes, no one would have a competitive advantage.
Watch this short video and learn why "best practices" means what's best for your business. Then register for the upcoming webcast, Customizing Enterprise Change Management on Wednesday July 24th at 11AM ET.




Monday 15 July 2013

Product Change Management: Know Your Options - Webcast

As the pace of global product development increases, you’ve got to manage ever greater change complexity. Your organization is making continuous revisions and versioning to complicated electro-mechanical assemblies, product information and system configurations. Fortunately, Aras has a series of change management options available right out of the box. From a simple ECO process that’s ready-to-use to full CMII compliant PR/ECR/ECN workflows with sophisticated impact analysis.

Global product development processes are becoming increasingly complex. As a result, you need to manage very sophisticated change management processes across electro-mechanical assemblies, product information and system configurations, as well as throughout different geographies and with a wide range of external supply chain partners.  You need a PLM solution that fits your business and your unique proprietary practices. And you need to be able to quickly and continuously customize that solution to keep pace and maintain your competitive advantage.  At Aras we've got you covered. Aras has a series of change management workflow options available right out of the box -- from the full 4-Star certified CMII compliant Change Management process with sophisticated impact analysis to a simple ECO option that's easy, fast and ready-to-use. What's more, all the Aras change process options are flexible and easy to customize to your business needs.



Wednesday 10 July 2013

Be Different With Your PLM Choices.

Whether it's the technology, the way you buy your PLM or the kinds of people you do business with, you’ve got options and you’re in control. And we encourage you not to settle for anything less than a fair solution that addresses your challenges and works in your business. We pride ourselves on Being Different and we invite you to learn more about what that means for you.

DIFFERENT IS...
Realizing each company has unique and competitive data and processes.
And developing software that adapts to fit your business, rather than expecting your business to change to fit the software.

DIFFERENT IS...
Accepting that healthy companies change and they should.
In fact, your company should change frequently to grow, compete and improve.
Technology should support real-time change and easy customization that anyone can use.

DIFFERENT IS...
Expecting to try before you buy,
and insisting that your buying decisions are never, ever made based on the word of a software salesman.

DIFFERENT IS...
Giving your company control over its own destiny.
This means you are never locked in to proprietary solutions, restrictive licensing or services.

DIFFERENT IS...
Allowing your company to own its data, with the freedom to access, share and extract it whenever you want, however you need.

DIFFERENT IS...
Never accepting the status quo.




Friday 5 July 2013

The Benefits of PLM-based CAPA Software

For manufacturers in industries that produce some of the world’s most complex products, effective quality management continues to be a competitive advantage. Whether in automotive, aerospace and defense, industrial equipment, electronics, or medical devices, companies are increasingly moving to a customer-centric model that includes social media monitoring and big data analytics at the point of sale. Even slight quality issues can have a ripple effect felt through the entire organization.

In today’s mobile and always connected environment, it is critical that discrete manufacturing companies can quickly and effectively sense and respond to quality issues originating anywhere in the value chain. This is possible with a holistic approach to quality. However, there is only a small minority of companies that have aligned the necessary leadership, business process, and technology capabilities to start taking an enterprise approach to quality.

This Research Spotlight aims to highlight best practices for managing quality across the enterprise, specifically as it pertains to creating a closed-loop quality management environment with the use of PLM-based corrective and preventive action (CAPA) software. It will touch on the following areas:
  • Market drivers pressuring discrete manufacturers to focus on improving the quality of processes and products
  • Addressing market drivers by taking a PLM approach to quality software
  • A look into the Key Performance Indicators (KPIs) companies are using to measure the effectiveness of quality initiatives
  • The people, business process, and change management capabilities needed to ensure successful technology deployments
  • Actionable recommendations for deploying PLM-based CAPA functionalities

By reading this Research Spotlight, executives on the quality management journey will be able to refine their approach to quality software, enabling communication and collaboration across the value chain. This holistic approach to quality will help create a market-leading customer experience in today’s unforgiving environment.

Monday 1 July 2013

Luxion and Siemens Introduce New Integration for Solid Edge Users

Luxion, makers of KeyShot® and the leading developer of advanced rendering and lighting technology, recently announced the immediate availability of a custom developed plugin that tightly integrates KeyShot with Siemens PLM Software’s Solid Edge® software making it a preferred rendering solution for users of Solid Edge. This plugin is available free of charge from the KeyShot website.

With KeyShot, Solid Edge users now have the option to increase the quality of their photorealistic 3D renderings, animations and interactive visuals used in communicating concepts, delivering internal presentations, developing digital prototyping and creating sales or marketing visuals. The integration allows KeyShot to be launched directly from the Solid Edge interface, automatically sending both the design and all the assigned material appearances to KeyShot.

Through the speed of KeyShot’s real-time ray-tracing interface, further refinement of material assignments and lighting is quickly accomplished, and the changes are immediately displayed to the user. The plugin implements LiveLinking™ which allows users to integrate KeyShot deeply into the product development process by pushing any changes to the Solid Edge design directly to KeyShot without losing any of the material assignments, animations, lighting and camera settings. This is the tightest integration of KeyShot available in the 3D CAD software market.  This unprecedented integration enables users to save time and improves the efficiency of designers, engineers and others creating 3D visuals.


The integration with KeyShot also allows users to transfer material assignments over to KeyShot for further development and for the creation of KeyShot animations or KeyShotVR’s, an interactive visual to present models on browsers or mobile devices.

Dan Staples, Director, Solid Edge Product Development, Siemens PLM Software says, ”The integration of KeyShot with Solid Edge is an important step in providing our users with a smooth and effective method for creating high-quality, 3D visuals. Changes in the Solid Edge design can be immediately reflected in the KeyShot environment, allowing high-quality visual communication to become a key part of the design process.”

“We are recognizing the momentum of Solid Edge in the industry”, said Thomas Teger, Vice President of Products and Strategy at Luxion. “The Solid Edge user base spans a wide variety of industries where visuals are part of the entire design process. The integration of KeyShot with Solid Edge provides even more flexibility for them as a solution that delivers amazing images of their designs within a matter of minutes.”

Pricing & Availability
The plugin for KeyShot and Solid Edge ST6 will be available free of charge from the KeyShot website at www.keyshot.com/plugins. 


Monday 24 June 2013

Siemens PLM Software Digital Manufacturing Symposium

Smart, Fast, Lean Manufacturing – Make Smart Decisions and Build Better Products with Tecnomatix Digital Manufacturing Solutions

Siemens PLM Software is pleased to announce that our annual Digital Manufacturing Symposium is back in South America. Located in the heart of São Paulo, Brazil on June 24th, 2013, this event provides an excellent opportunity to hear and learn from leaders in the manufacturing world as they discuss digital manufacturing trends, strategies and successes within their own organizations. Once again, these powerful discussions will be backed by our own Siemens staff, as they demonstrate some of the key capabilities found within the Siemens PLM Software digital manufacturing solution set.


As we learned in our Indianapolis event last year, manufacturing technology consumption continues to increase – so the need to drive manufacturing efficiency and productivity is becoming increasingly more important. Our past Brazil and Indianapolis events each brought together over 200 prospective and experienced digital manufacturing clients, and we are looking forward to an even larger attendance as we make our second visit to South America.

Our agenda is shaping-up as you read this initial information, but here is a preview of what is in store for the Digital Manufacturing Symposium this year:

Leading Siemens PLM Software digital manufacturing customers will give firsthand accounts of their digital manufacturing successes. (More details coming soon.) Senior executives and staff of the Manufacturing Engineering Software segment within Siemens PLM Software will discuss the current state of digital manufacturing tools, as well as the future of digital manufacturing and the role Siemens PLM Software will play in it. The focus of this year’s event is “Smarter Decisions, Better Products.” Check back frequently as we continue to develop our agenda and secure speakers for the event.

Thursday 20 June 2013

Out of the Box PLM. Seriously?

I read the headline and couldn't believe my eyes: "PLM: Are You Ready for Out-of-the-Box?"
I couldn't believe that a fellow PLM provider was touting an "out of the box" solution in 2013. In the age of personalization and mass customization isn't that kind of, um, backwards?

It started out well enough and I agreed with the premise... "Keeping up with constantly changing trends is a significant challenge which requires retailers and brands to stay in touch with consumers' needs and interests. You're trying to accelerate innovation, margins, and speed while still maintaining a competitive advantage."
 
 But it all fell apart right about here... "The concept of an 'out-of-the-box' solution is built on years of proven best practices and provides faster time to value and lower total cost of ownership."
Time out. Whose best practices is this solution built on? Unless it was built on my company's best practices, it's probably not going to work for me "out of the box". And even if it did, by the time it was installed and rolled out, my best practices would likely have changed.

As Aras CEO Peter Schroer recently said about one-size-fits-all change management solutions, "the 'best' really depends on the company, their customers, their product lines, their compliance mandates, etc."
Are the best practices of an aerospace company the same as a high tech electronics manufacturer? And are their best practices the same as a food and beverage company?

What about 2 companies in the same industry? Aren't their unique processes part of their competitive advantage? Is Louis Vuitton's process the same as Calvin Klein's and Armani's? What are companies giving up by adopting the same processes as their competitors?

And then the other shoe dropped... "Focus your team on making a difference for your customers, not on customizing your PLM solution."
How can you make a difference for your customers if you're trying to fit your business into someone else's preconceived idea of how it should work?

Full disclosure, Aras offers a lot - A LOT - of functionality right out of the box. But we don't expect you to use it - well, not without customizing it, which we've made as easy as drag and drop.
I believe out-of-the box PLM is simply bad business. And we at Aras aren't the only ones who think so. Read what Jos Voskuil, Chad Jackson and others had to say in a previous post, OOTB PLM is Hit or Miss.
I'm curious to hear what users think about this. Is OOTB PLM a blessing or a curse?

Source : www.aras.com

Sunday 16 June 2013

eBOM and mBOM configuration management

Manufacturing bill-of-materials or mBOM is a configuration of the product to show how it will be assembled. On the other hand, engineering bill-of-materials or eBOM is a configuration of the product to show how it is designed. The ability to connect and manage these two structures together so that whenever there are changes made to the product design it triggers a corresponding change in the manufacturing processes is an essential part of any successful PLM implementation. I guess it doesn’t require lots of explanation how a seamless eBOM and mBOM management process can save you a great deal of time and money. Now we need proof that it is indeed possible to manage eBOM and mBOM in a single software system and check impacts of design change on manufacturing processes. 

Watch the video below to see how Teamcenter can help bridge the gap between eBOM and mBOM.




Aras PLM Surface App - Puts the Information You Need at Your Fingertips

Bring your PLM software with you on the shop floor, manufacturing sites and everywhere you go with your Microsoft Surface. In an instant you can see all your assigned activities with the In Basket app. And you can act upon them right then and there. And with tasks listed for each activity, the attached items and more, you're sure to have all the information you need.

Watch as Nate Brown, Director of Product Management, shows you this app in action!