Thursday, October 3, 2013

Siebel Documentation is not Precise

I have been working on integrating Siebel and Taxware as I mentioned in my last post.  Everything was working fine on my local Siebel environment.  A development server finally became free and I checked in my changes.  That is when the "fun" started!

I started receiving a "Data type 0x6C has an invalid precision or scale." error in the Siebel log files.  I re-read the documentation in Siebel Tools and found:

Ok, so scale = total number of digits to the left and right of the decimal while precision = maximum number of digits to the right of the decimal.

FALSE!

Then I looked at the existing Number fields in the S_ORDER table:
As you can see, the Precision is greater than the Scale.  That wouldn't make sense if you believed the Siebel Tools documentation.

We are using SQL Server for our Siebel database, so I decided to look up what Microsoft says about Precision and Scale:
Precision = total number of digits to the left and right of the decimal while scale = maximum number of digits to the right of the decimal.  That makes sense!

This goes to show that you shouldn't believe everything you read.  I set the Precision to 22 and the Scale to 7 in both the business components and the tables, checked in, compiled, and now I can save records to my database without error.
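With SQL Server's definitions in hand, the 22/7 choice is easy to sanity-check: decimal(22, 7) allows up to 15 digits before the decimal point and 7 after it.  Here is a small illustrative sketch of that rule (written in Python for brevity; the fits helper is my own, not part of any Siebel or SQL Server API):

```python
# Sketch of SQL Server's NUMERIC/DECIMAL semantics, for illustration only:
# precision = total number of digits, scale = digits to the right of the
# decimal point.  A decimal(p, s) column can hold at most (p - s) digits
# before the point and s digits after it.
from decimal import Decimal

def fits(value, precision, scale):
    """Return True if `value` fits in a decimal(precision, scale) column."""
    d = Decimal(str(value)).normalize()
    sign, digits, exponent = d.as_tuple()
    frac_digits = max(0, -exponent)               # digits right of the point
    int_digits = max(0, len(digits) + exponent)   # digits left of the point
    return frac_digits <= scale and int_digits <= precision - scale

# decimal(22, 7): up to 15 digits before the point, 7 after
fits(Decimal("123456789012345.1234567"), 22, 7)   # True
fits(Decimal("1234567890123456"), 22, 7)          # False: 16 integer digits
fits(Decimal("0.12345678"), 22, 7)                # False: 8 fractional digits
```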

Tuesday, October 1, 2013

Consolidate and Eliminate Code using AutoMapper

I was recently assigned a project to write a WCF service that acted as middleware between Siebel and Taxware.  The service would be called when a Quote or Order needed to have up-to-date taxes calculated.  After some refactoring I was able to come up with a project structure that worked great with AutoMapper.  AutoMapper is a convention-based object-to-object mapper, and you can find more details here.

Previous versions of AutoMapper had a reputation for being slow, but the newest version is right on the money.  I split my WCF service into 5 main projects:

  1. A project for the WCF service endpoints and host factory
  2. A project for the implementation of those endpoints
  3. A project for request/response data contracts of my service
  4. A project for business logic
  5. A project that contains Taxware schemas
Taxware is a unique system.  If you choose to go with the hosted environment option, then you have a .NET client dll that sits between your WCF service and the Taxware API.  This .NET client accepts a string representation of XML.  This means that I had to invoke the DataContractSerializer after I used AutoMapper to get a string representation of the objects that I populated.

I have 2 levels of mapping within my WCF service solution:
  1. Data Contract to Business Logic Models
  2. Business Logic Models to Taxware Schema Models
As a result, I have 2 classes that derive from AutoMapper.Profile and 1 MappingFactory that adds the Profiles to the Mapper object in the AutoMapper namespace.  The MappingFactory gets invoked from the Global.asax Application_Start method, which caches the mappings once the application is instantiated on the server.

The Data Contracts and the Business Logic Model objects are very similar, so the mappings are easy to read.  I have an Account data contract class and an Account domain model class.  I can write the following statement in my profile:
AutoMapper.Mapper.CreateMap<Account, domain.Account>();

The attributes of the class objects do not have to be explicitly mapped because they have the same names.  These attributes will be automatically mapped unless otherwise specified.
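Convention-based mapping is the whole trick here: members are matched by name, so same-shaped classes need no configuration.  The idea can be sketched in a few lines (illustrative Python, not AutoMapper's actual implementation; the class and function names are made up for the example):

```python
# Illustrative sketch of convention-based mapping: copy every attribute
# whose name exists on both the source and the destination object.
# (Made-up classes for the example; this is not AutoMapper itself.)

class AccountContract:          # stand-in for a data contract class
    def __init__(self, name, city):
        self.Name = name
        self.City = city

class AccountDomain:            # stand-in for a domain model class
    def __init__(self):
        self.Name = None
        self.City = None

def map_by_convention(source, destination):
    """Copy attributes that share a name, like CreateMap<Account, domain.Account>()."""
    for attr, value in vars(source).items():
        if hasattr(destination, attr):
            setattr(destination, attr, value)
    return destination

domain = map_by_convention(AccountContract("Acme", "Chicago"), AccountDomain())
print(domain.Name, domain.City)  # Acme Chicago
```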

The Business Logic Model objects and the Taxware schema objects are drastically different, so the mapping is a bit more intense.  None of the attribute names are shared, which means every Taxware schema object attribute has to be explicitly mapped.  If you do not tell AutoMapper how to handle a destination attribute, then the MappingFactory will throw an error at startup when Mapper.AssertConfigurationIsValid(); is called.  I quickly made use of the Ignore functionality because I do not have, nor need, a value for each of the Taxware schema attributes.
AutoMapper.Mapper.CreateMap<Document, Doc>()
                .ForMember(tw => tw.currn, m => m.UseValue(ConfigurationManager.AppSettings["Currency"]))
                .ForMember(tw => tw.custAttrbs, m => m.Ignore())
                .ForMember(tw => tw.custmsDutTrfAmt, m => m.Ignore()); // ...and so on for each remaining unmapped destination member

Originally, my solution only had 1 profile.  That was because I worked on my WCF service before I started working on the ASP.NET MVC 4 application requirements.  This application had to be able to take existing tax calculations and force those values into Taxware.  This requires 2 calls: 1 to calculate tax on the document and return the tax jurisdictions, and 1 to force the amount of tax already collected to be split up by the appropriate tax jurisdiction rate.  Adding this application meant that I needed to add another profile to a new ASP.NET MVC 4 project.  This profile maps the Web Models to the Business Logic Models.  I can re-use the Business Logic Model to Taxware Schema mappings that I wrote for the WCF service, with additional mappings for the force call.

So now I have 2 solutions that get deployed by Team Foundation Server: 1 for the WCF service and 1 for the ASP.NET MVC 4 application.  Each solution only includes the projects it needs; in other words, I don't have to include the data contracts project in my ASP.NET MVC 4 solution.

Benefits
  1. All of my fellow developers know to look at the AutoMapper.Profile implementations to find the object to object mappings.
  2. Business logic classes don't have code to new up objects and map values.
  3. All mappings are cached when the application is started.
Things I don't like
  1. I had trouble mapping parent objects that contained child arrays.  The child arrays were not being populated correctly.  I worked around this by mapping the child arrays separately from the parent map.
  2. If I change a Business Logic Model class to accommodate a WCF service change, then I have to remember to make a mapping change to the ASP.NET MVC 4 project's profile that maps Web Models to Business Logic Models.  This isn't a huge deal, because the deployment will fail when the ASP.NET MVC 4 project is recompiled, which will remind me to make the change.




Monday, September 30, 2013

Siebel Data Maps Make Integration a Piece of Cake

For the last 9 years I have been working with the widely criticized Siebel CRM.  Sometimes I feel that I am in a small clan of Siebel advocates.  Yes, there are headaches, but it is a huge system that has been developed over a long period of time.  If you can overlook some of the annoyances, then you can find a tool that has a lot of functionality already in place for you.  Even if there are 3 different ways to accomplish the same thing!

Developers have a hard time appreciating Siebel because Siebel/Oracle best practices are to configure the repository instead of writing custom code.  Developers love to write custom code for a variety of reasons.  Most of the time it is quicker, and the developer doesn't have to spend time trying to determine what has already been implemented in the system.  As your maturity level grows with configuring Siebel, you start to find useful tools that are available "Out Of The Box" (OOTB) which ease development and maintenance.  One of these tools is the integration data map.

Using data maps and the EAI Siebel Adapter business service together in workflows can be a dynamic duo that is just right for integration requirements.  In this example I will be working with a Siebel outbound integration workflow that sends data from Siebel to an external system using a data map.

Sample OOTB data map


Here is a list of common steps that I like to take when setting up a new integration workflow.
  1. Create an integration object that is based off a Siebel business object and ONLY contains the fields and components needed.  Identify the fields needed for workflow processing and for the external system schema.  Inactivate all fields in the integration object that don't meet these criteria.
  2. Create a new integration workflow within Siebel Tools.
  3. Create a new workflow process property that has a data type of Strongly Typed Integration Object and has the correct Integration Object value that represents the integration object that will be populated from querying Siebel.
  4. Import the WSDL file for the external web service into the Siebel repository.  Ensure that the WSDL file is located in the Siebel Tools > TEMP folder on your machine.  Open Siebel Tools and go to File > New Object > EAI.  Choose the option for web service and follow the steps in the wizard.  A new business service and integration object(s) will be created once the wizard is complete.
  5. Create a new workflow process property that has a data type of Strongly Typed Integration Object and has the correct Integration Object value that represents the integration object that will be sent to the external web service.  This integration object would have been created when you imported the WSDL file.
  6. Allow the workflow Object Id process property to control the database context of your workflow.  The Object Id will be the ROW_ID of the primary business component of the business object the workflow is based on.  The business object is defined when you first create the workflow.  If your workflow is invoked via a button click in the UI, Runtime Event, Workflow Batch Job, or Signal, then the Object Id will be populated for you when the workflow is invoked.
  7. Create a business service workflow step to query the database records using the EAI Siebel Adapter business service and Query method.  Provide the PrimaryRowId value which will be the Object Id and provide an OutputIntObjectName which will be the name of the IO that is used for querying.  Add an Output Argument which will populate the corresponding process property with the Output Argument Siebel Message from the EAI Siebel Adapter.
  8. Create a new data map in the Siebel UI.  Navigate to Administration - Integration > Data Map Editor.  Here you can map the integration object that you use to query Siebel to the integration object that was created from importing the WSDL file.  This is where the meat of your business logic lives, and you can use different tools based on your requirements.  There are functions available to you in the maps such as the IIF function.  A common use of IIF is to map boolean values (e.g. IIF([Integration Field] = 'Y', 'true', 'false')).  You can also add custom input parameters to your Data Map.  Navigate to the Data Maps hyperlink at the top of the page under Administration - Integration.  If you define a parameter here, then it can be referenced with the "&" symbol (e.g. [&MyParam]) within the Data Map Editor.
  9. Create a step in your workflow to invoke your new data map.  Add a new business service step that uses the EAI Data Transformation Engine business service and the Execute method.  Add the MapName, OutputIntObjectName, and SiebelMessage input parameters.  The SiebelMessage parameter will be the Integration Object workflow process property you added for querying Siebel.  Add an output parameter for the workflow process property you added for the external web service input.
  10. Create a step in your workflow for calling the external web service.  The business service name and method will match the business service name and method that was created when importing the WSDL.  Add an input parameter for the workflow process property that you added for the external web service input.
Now you have a workflow that queries Siebel to get an XML hierarchy of data, transforms that data, and finally sends the data to an outbound web service.  While unit testing you may run into various data issues.  You can update the data map administratively without the need of a Siebel compile!

Finally, you might have the requirement to receive data back from the external web service and store it within Siebel.  For that you can add another data map into your system; map the output of the web service call to your existing Siebel integration object; and add a new workflow step to Insert, Update, Upsert, Synchronize, Execute, etc. the integration object using the EAI Siebel Adapter.  Now you have data coming from an external web service that can be edited administratively before it is saved to the Siebel database.