
New SharePoint 2010, 2013 & SharePoint Online Connector for Outlook available!

SharePoint Connector for Outlook makes it easier for Office users to upload emails to SharePoint and attach SharePoint documents to an email message.

 

Please contact me through my blog or at tomas.floyd@outlook.com for more information on pricing, licensing and trials.

 

[Screenshot: attaching a SharePoint document to an email from the SharePoint Outlook Connector explorer]

FEATURES

  • Workflow integration
  • Attach SharePoint items as attachments to an email message
  • Multiple site configuration
  • Check Out / Check In functionality
  • Drag & drop files (attachments) into a folder
  • Copy, delete and open item/document functionality
  • Version history for files
  • Save email metadata (title, to, from, etc.) to the SharePoint document item
  • Works with all SharePoint versions
  • Save an email message as a list item and its attachments as attachments of that list item
  • Template-based search functionality
  • Windows Explorer right-click upload
  • Edit metadata while uploading a document
  • File system integration (this allows you to integrate Live Mesh folders into Outlook)
  • Advanced alert system
  • Library content viewer

How To: Design the Physical Architecture to Support Collaborative Development and ALM of a SharePoint Foundation 2010 Application

Introduction

This article explains the physical architecture that best supports collaborative development and ALM of a SharePoint Foundation 2010 application, which servers and tools are needed, and how they play key roles in ALM of SharePoint Foundation 2010. The purpose of this article is to provide an overall understanding of how the various servers and farms in SharePoint Foundation are connected to each other.

Background

A basic understanding of the different server operating systems and SharePoint Foundation 2010 is required.

Solution

Application Life-cycle Management (ALM) is the co-ordination of development life-cycle activities—including requirements, modeling, development, build, and testing. Recently, ALM has expanded beyond the application and the software development life cycle to also include business solution governance, infrastructure management, operations, and support.

You can use ALM to help align your organization in the context of a software solution in business, development, and operations. With an application development platform that supports ALM, you can provide integration between the various tools used and activities performed within each of these capabilities.

There are four main types of staging servers, plus a standalone developer environment, which play a key role in ALM of a SharePoint 2010 application:

  1. Development SharePoint Farm
  2. Team Foundation Server
  3. Integration/Testing Farm
  4. Production Farm

plus the Developer's Workstation.

The figure below shows a physical architecture that depicts how each server is interconnected to support collaborative development and ALM for a SharePoint Foundation 2010 application:


Development SharePoint Farm

A SharePoint farm is fundamentally a collection of SharePoint role servers that provide for the base infrastructure required to house SharePoint sites. The farm level is the highest level of SharePoint architecture, providing a distinct operational boundary for a SharePoint environment. Each farm in an environment is a self-encompassing unit made up of one or more servers, such as web servers, service application servers, and SharePoint database servers.

Developers in an organization that makes heavy use of SharePoint often need environments to test new applications, web parts, solutions, and other SharePoint customizations. A development SharePoint farm gives these developers a sandbox area where farm-level features and solutions can be tested.

I have considered a two-tier topology for the SharePoint Foundation 2010 farm. However, the topology depends entirely on the needs of your application. If your application is a relatively small intranet application, you can choose a single-tier topology; if you are going to integrate another search server with Foundation, you can choose a three-tier topology with an application server as the middle tier (remember that SharePoint Foundation 2010 doesn't include enterprise search). It may make sense to deploy one or more development farms so that developers have the opportunity to run their tests and develop software for SharePoint independently of the existing production environment.

There are basically two types of servers included in a two-tier development farm of SharePoint Foundation 2010:

  1. Web server
  2. Content database server

In the above figure, there are three front-end web servers and one SharePoint content database server. However, you can choose a single front-end web server connected to the content database server, based on your application's needs and the architecture of the production environment. All web servers share the same content database. This is called a two-tier deployment farm, where the SharePoint server components and the content database are installed on separate servers. As mentioned before, you can choose a one-tier, two-tier or three-tier deployment topology based on your application architecture and the topology of the production environment.

Each web server has SharePoint Foundation 2010 and the SharePoint extension for TFS 2010 installed on it. The SharePoint extension for TFS 2010 is needed to connect with Team Foundation Server for source control, build management & project management.

Advantages of a Development SharePoint Farm:

  1. A single place where the SharePoint admin can integrate all the final artifacts from multiple developers.
  2. Developers can sync the latest SharePoint site to their standalone developer workstations.
  3. The admin can easily approve artifacts and migrate them to the integration server.
  4. It is a unit-testing environment where developers can test dependent functionality or farm-level features.

Team Foundation Server

Team Foundation Server plays a key role in ALM by providing source control, build management and work item tracking. You can have TFS installed on the same server that hosts the content database, but if you are going to use TFS build management, it is advisable to have a separate Team Foundation Server because it uses the CPU intensively when it processes builds.

As per the above figure, there is a separate Team Foundation Server which is connected to the SharePoint farm as well as the standalone development workstations, so that it can provide source control for customized content as well as the developers' artifacts and resources.

Advantages of TFS
  1. Source control for SharePoint artifacts and customization
  2. Build management for SharePoint
  3. Work item and bug tracking for SharePoint
  4. Admin console for all management activities
  5. Easy integration with SharePoint Foundation Server and VS 2010
  6. Easy check-in & check-out
  7. Web-based console to manage ALM activities

Developer’s Workstation

As per the above figure, the developers' environment includes two developer workstations. In practice, you can have as many workstations as your development team size requires.

Each developer workstation should run Windows 7 or Windows Vista with a standalone SharePoint Foundation server and a local content database, so that one developer's work doesn't affect another developer's and artifacts can be debugged locally.

A developer workstation will have the following installed:

  1. Windows 7 or Windows Vista 64-bit OS
  2. Standalone SharePoint Foundation Server 2010
  3. SharePoint Designer 2010
  4. Visual Studio 2010 (connected to TFS)

Each developer workstation should be connected to Team Foundation Server 2010 so that when a developer completes an artifact, he can check it in to TFS and other developers can take the latest code from TFS if needed. This way, parallel development can happen without affecting other developers' work.

Integration/Testing Farm

Any production SharePoint environment should have a test environment in which new SharePoint web parts, solutions, service packs, patches, and add-ons can be tested. It is critical to deploy test farms, because many SharePoint add-ons could potentially disrupt or corrupt the formatting or structure of a production environment, and trying to test these new solutions on site collections or different web applications is not enough because the solutions often install directly on the SharePoint servers themselves. If there is an issue, the issue will be reflected in the entire farm.

Integration or testing server farm should be similar to the existing environments, with the same add-ons and solutions installed and should ideally include restores of production site collections to make it as similar as possible to the existing production environment. All changes and new products or solutions installed into an environment should subsequently be tested first in this environment.

Integration/testing servers will have the final SharePoint sites and site collections as per the business requirements. QA will test all the business functionality here. The customer can also do their user acceptance test (UAT) before going live on the production server.

After the user acceptance test has passed, all the sites & site collections will be deployed to the production server.

Advantages of the Integration/Testing Farm:

  1. Clean environment and the same physical architecture as production
  2. QA can test all dependent business functionality in one place
  3. The customer can participate in UAT
  4. Easy deployment/migration from the integration/testing server to the production server

Production Farm

The final stage is rolling your farm into a production environment. At this stage, you will have incorporated the necessary solution and infrastructure adjustments that were identified during the user acceptance test stage. These servers are generally on the customer's premises, and the development and testing teams do not have control over them.

There are various 3rd party tools available in the market for SharePoint data protection, administration, migration, compliance and integration.


Summary

In this way, you can design a physical architecture where the development SharePoint farm and developer workstations are integrated with TFS 2010. TFS and the content database are connected to the testing server or testing farm, where all the artifacts and content are integrated for QA and UAT. Finally, after UAT, everything is deployed to the production farm.

You can use VMs (virtual machines) for all the servers and workstations for an effective infrastructure, because if a server crashes for some reason, you can quickly create a new VM for the needed OS from images.

Note: In the above figure, the integration/testing farm and production farm are shown as single servers just for clarity, but in reality they will be as large as the development farm, with a number of front-end web servers and a content database server. All server operating systems are Windows Server 2008 R2 SP2 64-bit. See the official hardware & software requirements for SharePoint Foundation 2010 for more information.

Features from SharePoint 2010 Integration with SAP BusinessObjects BI 4.0

One of the core concepts of Business Connectivity Services (BCS) for SharePoint 2010 is the external content type. External content types are reusable metadata descriptions of connectivity information and behaviours (stereotyped operations) applied to external data. SharePoint offers developers several ways to create external content types and integrate them into the platform.

 

The SharePoint Designer 2010, for instance, allows you to create and manage external content types that are stored in supported external systems. Such an external system could be SQL Server, WCF Data Service, or a .NET Assembly Connector.

This article shows you how to create an external content type for SharePoint named Customer based on given SAP customer data. The definition of the content type will be provided as a .NET assembly, and the data are displayed in an external list in SharePoint.

The SAP customer data are retrieved from the function module SD_RFC_CUSTOMER_GET. In general, function modules in a SAP R/3 system are comparable with public and static C# class methods, and can be accessed from outside of SAP via RFC (Remote Function Call). Fortunately, we do not need to program RFC calls manually. We will use the very handy ERPConnect library from Theobald Software. The library includes a LINQ to SAP provider and designer that makes our lives easier.

.NET Assembly Connector for SAP

The first step in providing a custom connector for SAP is to create a SharePoint project with the SharePoint 2010 Developer Tools for Visual Studio 2010. Those tools are part of Visual Studio 2010. We will use the Business Data Connectivity Model project template to create our project:

After defining the Visual Studio solution name and clicking the OK button, the project wizard will ask what kind of SharePoint 2010 solution you want to create. The solution must be deployed as a farm solution, not as a sandboxed solution. Visual Studio then creates a new SharePoint project with a default BDC model (BdcModel1). You can also create an empty SharePoint project and add a Business Data Connectivity Model project item manually afterwards. This will also generate a new node in the Visual Studio Solution Explorer called BdcModel1. The node contains a couple of project files: the BDC model file (file extension .bdcm), and the Entity1.cs and EntityService.cs class files.

Next, we add a LINQ to SAP file to handle the SAP data access logic by selecting the LINQ to ERP item from the Add New Item dialog in Visual Studio. This will add a file called LINQtoERP1.erp to our project. The LINQ to SAP provider is internally called LINQ to ERP. Double click LINQtoERP1.erp to open the designer. Now, drag the Function object from the designer toolbox onto the design surface. This will open the SAP connection dialog since no connection data has been defined so far:

Enter the SAP connection data and your credentials. Click the Test Connection button to test the connectivity. If you could successfully connect to your SAP system, click the OK button to open the function module search dialog. Now search for SD_RFC_CUSTOMER_GET, then select the found item, and click OK to open the RFC Function Module /BAPI dialog:

[Screenshot: the RFC Function Module / BAPI dialog]

The dialog provides you the option to define the method name and parameters you want to use in your SAP context class. The context class is automatically generated by the LINQ to SAP designer including all SAP objects defined. Those objects are either C# (or VB.NET) class methods and/or additional object classes used by the methods.

For our project, we need to select the export parameters KUNNR and NAME1 by clicking the checkboxes in the Pass column. These two parameters become our input parameters in the generated context class method named SD_RFC_CUSTOMER_GET. We also need to return the customer list for the given input selection. Therefore, we select the table parameter CUSTOMER_T on the Tables tab and change the structure name to Customer. Then, click the OK button on the dialog, and the new objects get added to the designer surface.
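As an aside, here is a rough, illustrative sketch of what the generated Customer structure class could look like. This is an assumption for readability only: the real class is produced by the LINQ to SAP designer and contains every field of the CUSTOMER_T structure, not just the two shown here.

// Illustrative sketch only: the actual class is generated by the LINQ to SAP
// designer (see LINQtoERP1.Designer.cs) and includes all CUSTOMER_T fields.
public partial class Customer
{
    public string KUNNR { get; set; }  // SAP customer number
    public string NAME1 { get; set; }  // Customer name
}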

IMPORTANT: The flag “Create Objects Outside Of Context Class” must be set to TRUE in the property editor of the LINQ designer, otherwise LINQ to SAP generates the Customer class as a nested class of the SAP context class. This feature and flag are only available in LINQ to SAP for Visual Studio 2010.

The LINQ designer has also automatically generated a class called Customer within the LINQtoERP1.Designer.cs file. This class will become our BDC model entity or external content type. But first, we need to adjust and rename our BDC model that was created by default from Visual Studio. Currently, the BDC model looks like this:

Rename the BdcModel1 node and file to CustomerModel. Since we already have an entity class (Customer), delete the file Entity1.cs and rename the EntityService.cs file to CustomerService.cs. Next, open the CustomerModel file and rename the designer object Entity1 to Customer. Then, change the entity identifier name from Identifier1 to KUNNR. You can also use the BDC Explorer for renaming. The final adjustment result should look as follows:

[Screenshot: the renamed BDC model in the BDC Explorer]

The last step we need to do in our Visual Studio project is to change the code in the CustomerService class. The BDC model methods ReadItem and ReadList must be implemented using the automatically generated LINQ to SAP code. First of all, take a look at the code:

[Screenshot: the ReadItem and ReadList implementations in the CustomerService class]
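Since the code screenshot is not reproduced here, below is a minimal sketch of what the two BDC model methods might look like. It assumes the designer-generated context class is named LINQtoERP1 and exposes a method SD_RFC_CUSTOMER_GET(kunnr, name1) returning IEnumerable<Customer>; the exact generated names, the ERPConnect license-key call and the credential handling are assumptions and will differ in a real project.

using System.Collections.Generic;
using System.Linq;

namespace CustomerModel.BdcModel
{
    public class CustomerService
    {
        // Designer-generated SAP context class (assumed name and constructor).
        private static readonly LINQtoERP1 _sc;

        static CustomerService()
        {
            // Set the ERPConnect license key and the SAP credentials here
            // (the concrete ERPConnect calls are omitted in this sketch).
            _sc = new LINQtoERP1();
        }

        // BCS stereotyped operation SpecificFinder: fetch one customer by KUNNR.
        public static Customer ReadItem(string id)
        {
            return _sc.SD_RFC_CUSTOMER_GET(id, "*").FirstOrDefault();
        }

        // BCS stereotyped operation Finder: return all customers.
        public static IEnumerable<Customer> ReadList()
        {
            return _sc.SD_RFC_CUSTOMER_GET("*", "*");
        }
    }
}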

As you can see, we basically have just a few lines of code. All of the SAP data access logic is encapsulated within the SAP context class (see the LINQtoERP1.Designer.cs file). The CustomerService class just implements a static constructor to set the ERPConnect license key and to initialize the static variable _sc with the SAP credentials as well as the two BDC model methods.

The ReadItem method, BCS stereotyped operation SpecificFinder, is called by BCS to fetch a specific item defined by the identifier KUNNR. In this case, we just call the SD_RFC_CUSTOMER_GET context method with the passed identifier (variable id) and return the first customer object we get from SAP.

The ReadList method, BCS stereotyped operation Finder, is called by BCS to return all entities. In this case, we just return all customer objects the SD_RFC_CUSTOMER_GET context method returns. The returned result is already of type IEnumerable<Customer>.

The final step is to deploy the SharePoint solution. Right-click on the project node in Visual Studio Solution Explorer and select Deploy. This will install and deploy the SharePoint solution on the server. You can also debug your code by just setting a breakpoint in the CustomerService class and executing the project with F5.

That’s all we have to do!

Now, start the SharePoint Central Administration panel and follow the link “Manage Service Applications”, or navigate directly to the URL http://<SERVERNAME>/_admin/ServiceApplications.aspx. Click on Business Data Connectivity Service to show all the available external content types:

On this page, we find our deployed BDC model including the Customer entity. You can click on the name to retrieve more details about the entity. Right now, there is just one issue open. We need to set permissions!

Mark the checkbox for our entity and click on Set Object Permissions in the Ribbon menu bar. Now, define the permissions for the users you want to allow to access the entity, and click the OK button. In the screen shown above, the user administrator has all the permissions possible.

In the next and final step, we will create an external list based on our entity. To do this, we open SharePoint Designer 2010 and connect to the SharePoint website.

Click on External Content Types in the Site Objects panel to display all the content types (see above). Double click on the Customer entity to open the details. The SharePoint Designer is reading all the information available by BCS.

In order to create an external list for our entity, click on Create Lists & Form on the Ribbon menu bar (see screenshot below) and enter CustomerList as the name for the external list.

OK, now we are done!

Open the list, and you should get the following result:

The external list shows all the defined fields for our entity, even though our Customer class, automatically generated by the LINQ to SAP, has more than those four fields. This means you can only display a subset of the information for your entity.

Another option is to just select those fields required within the LINQ to SAP designer. With the LINQ designer, you can access not just the SAP function modules. You can integrate other SAP objects, like tables, BW cubes, SAP Query, or IDOCs. A demo version of the ERPConnect library can be downloaded from the Theobald Software homepage.

If you click the associated link of one of the customer numbers in the column KUNNR (see screenshot above), SharePoint will open the details view:

[Screenshot: the details view of a customer record]

 

 

XI/PI: Understanding the RFC Adapter

SAP XI provides different ways for SAP systems to communicate with each other via SAP XI. You have three options, namely IDoc adapters, RFC adapters and proxies. In one of the earlier posts, about your first XI scenario, we learned how to configure the IDoc receiver adapter, and in the coming articles I shall throw light on the other adapters. This article specifically deals with understanding the basics of the RFC adapter on the sender and the receiver side.

[Image: SAP XI RFC Sender Adapter]

The RFC adapter converts incoming RFC calls to XML, and XML messages to outgoing RFC calls. We can have both synchronous (sRFC) and asynchronous (tRFC) communication with SAP systems. The former works with the Best Effort quality of service (QoS), while the latter works with Exactly Once (EO).

Unlike IDoc adapter, RFC Adapter is installed on the J2EE Adapter Engine and can be monitored via Adapter Monitoring and Communication Channel Monitoring in the Runtime Workbench.

Now let us understand the configuration needed to set up RFC communication.

RFC Sender Adapter

In this case, Sender SAP system requests XI Integration Engine to process RFC calls. This could either be synchronous or asynchronous.

On the source SAP system, go to transaction SM59 and create a new RFC connection of type ‘T’ (TCP/IP Connection). On the Technical Settings tab, select “Registered Server Program” radio button and specify an arbitrary Program ID. Note that the same program ID must be specified in the configuration of the sender adapter communication channel. Also note that this program ID is case-sensitive.

When using the RFC call in your ABAP program you should specify the RFC destination created above. For example,

CALL FUNCTION '<NAME_OF_THE_RFC_FUNCTION_MODULE>'
DESTINATION '<RFC_DESTINATION_NAME>'.

Also, in case you are setting up asynchronous interface, the RFC should be called in the background. For example,

CALL FUNCTION '<NAME_OF_THE_RFC_FUNCTION_MODULE>'
IN BACKGROUND TASK
DESTINATION '<RFC_DESTINATION_NAME>'.

[Image: SAP XI RFC Receiver Adapter]

Now, create the relevant communication channel in the XI Integration Directory. Select the adapter type RFC Sender (please see the figure above). Specify the application server and gateway service of the sender SAP system. Specify the program ID: use exactly the same program ID that you provided while creating the RFC destination in the SAP system, and note that it is case-sensitive. Provide the application server details and logon credentials in the RFC metadata repository parameters. Save and activate the channel. Note that the RFC definition that you import in the Integration Repository is used only at design time. At runtime, XI loads the metadata from the sender SAP system by using the credentials provided here.

RFC Receiver Adapter

In this case, XI sends the data in the RFC format (after conversion from XML format by the receiver adapter) to the target system where the RFC is executed.

Configuring the receiver adapter is even simpler. Create a communication channel in ID of type RFC Receiver (please see the figure above on the left). Specify the RFC client parameters such as the application server details, logon credentials, etc., and activate the channel.

Testing the Connectivity

Sometimes, especially when new SAP environments are setup, you may want to test their RFC connectivity to SAP XI before you create your actual RFC based interfaces/scenarios. There is a quick and easy way to accomplish this.

[Image: STFC_CONNECTION input]

Create an RFC destination of type 'T' in the SAP system as described previously. Then, go to the XI Integration Repository and import the RFC function module STFC_CONNECTION from the SAP system. Activate your change list.

Configure sender and receiver communication channels in ID by specifying the relevant parameters of the SAP system as discussed previously. Remember that the Program ID in sender communication channel and RFC destination in SAP system must match (case-sensitive).

[Image: STFC_CONNECTION output]

Accordingly, complete the remaining ID configuration objects like the Sender Agreement, Receiver Determination, Interface Determination and Receiver Agreement. No interface mapping is necessary. Activate your change list.

Now, go back to the SAP system and execute the function module STFC_CONNECTION using transaction SE37. Specify the above RFC destination in ‘RFC target sys’ input box. You can specify any arbitrary input as REQUTEXT. If everything works fine, you should receive the same text as a response. You can also see two corresponding messages in SXMB_MONI transaction in SAP XI. This verifies the connection between SAP system and SAP XI.

FREE Microsoft Dynamics CRM 2011 List Component for Microsoft SharePoint Server 2010

 

 

CRM2011 – SharePoint 2010 Integration? Glue CRM 2011 & Share Point 2010 together? Make CRM 2011 and Share Point 2010 converse? I wasn’t sure what to call this exactly. “Hooking together” works for me!

Now that we have a CRM 2011 instance and a Share Point site working, let’s get them connected up! Go to this website and download Microsoft Dynamics CRM 2011 List Component for Microsoft SharePoint Server 2010:

Accept the License Terms.

Extract the files to a folder (I chose C:\CRM List).

You will get a prompt “The Installation is complete.” Click OK.

Let’s go over to the Share Point Central Administration Server to install the list component we just extracted. Connect to http://localhost:48835/ (your port might be different, be aware of this). Click Manage web applications.

Click the new Share Point site, and then “General Settings” (the blue cogs).

Scroll down to Browser File Handling and choose Permissive, Click OK.

Let’s head back over to our new Share Point Site. Click Site Actions up top left, and then “Site Settings”.

Under Galleries click “Solutions”.


Click the Word “Solutions” up top (you have to click the word “Solutions”, even though it looks selected), and then click “Upload Solution”.

Select the .wsp component that we extracted wayyy back at the top of this. I used C:\CRM List as my extract folder. Click OK.

You’ll get prompted at this point; I couldn’t activate the control on this screen (but it still needs to be done). We need to make sure some services are running to activate the solution. Click Close.

Head back to the Share Point Central Administration site at http://localhost:48835.

Click System Settings –> Manage Services on this server

Click Start beside “Share Point Foundation Sandboxed Code Service”. I also started “Microsoft SharePoint Foundation Subscription Settings Service” (by accident), so that’s why that one is started.

Now to head back to our Share Point site http://localhost:39083/

Under Galleries click “Solutions”.

Click Solutions again, select crmlistcomponent, and then click “Activate” up top. Activate is now un-greyed out! Click Activate!

The solution has now been activated! Hurray!

There seems to be some confusion about whether or not you need to run a PowerShell script (AllowHtcExtn) to enable activation of Share Point 2010 solutions. According to what I’ve read, you would need to run this if Share Point 2010 is running on a domain controller. I didn’t have to do this (and we’re on a domain controller), and I’ve yet to run into a problem with .htc stuff. Even the Microsoft Dynamics CRM 2011 Readme says:
“If you are using Microsoft SharePoint Server 2010 (On-Premises), you must add .htc extensions to the list of allowed file types:
a. Copy the AllowHtcExtn.ps1 script file to the server that is running Microsoft SharePoint Server 2010.
b. In the Windows PowerShell window or in the SharePoint Management Console, run the command: AllowHtcExtn.ps1 .
Example: AllowHtcExtn.ps1 http://servername”

Some people say the script works for them, and some say that using just the blog method (what we did) works.
The SharePoint configuration is complete at this point. You probably want to take a snapshot and name it “After SharePoint Configuration”. Let’s head over to our CRM server (localhost:85).

In CRM Click Settings –> Document Management –> Document Management Settings

Select the entities that you want to have documents enabled on. This will create a “Documents” area when you open an instance of the entity. I’ll just leave the defaults for now. At the bottom punch in your Share Point site that you’ve created and click Next. This is the Share Point server we installed the list component on. You’re not allowed to use localhost:port, just use the computer name:port like below.

Don’t select the box, otherwise it will relate the files to those entities. Without checking the box you will end up with something like Site/EntityName/Record Name (which is what I want, especially if you’re using custom entities). Click Next.

If “Libraries are being created in the path”, click Next.

Everything should “Succeed”, Click Finish.

Let’s test this bad boy out now.

Create a new account called “Test”.

Click Save! Click “Documents” on the left side. You’ll get a prompt saying that the folder (Test) is being created under “Account”. Click OK.

Click Add.

Now you’ll probably get these errors: /crmgrid/scripts/DialogContainer.js and 403 FORBIDDEN! Depressing. The only real information on this error was a single post, and it wasn’t very clear, but I stumbled through it. It seems that CRM 2011 doesn’t enjoy being called localhost. Let’s fix these up.

The fix for this was to run inetmgr –> Click Microsoft Dynamics CRM –> click Stop

Click “Bindings…” on the right side. Click “Edit” on the items that show “localhost” and change it to my machine name: “win-b80icqrvluf”. This is so it has a “real” name to connect to.

Before:

After:

Now click “Start” on the right side.

Head back over to the CRM (http://win-b80icqrvluf:85/CRMTest/main.aspx) make sure to use the host name, as it might give you the error if you use localhost. Open your Test Account again.

Click Documents –> Add, you should now see this popup (it can take a while to load for the first time on the VM). If you continue to get the error, stop both CRM 2011 and Share Point 2010 servers and restart them. If that doesn’t work, try restarting the whole server.

Pick a file, and click OK.

The file should be uploaded to Share Point now.

Head over to Share Point at http://win-b80icqrvluf:39083 and click “All Site Content” or “Libraries”.

Click Account.

You can see that CRM has created a folder “Test” (for our record). It creates 1 folder per record. Click it to see the files associated to that record!!

The files associated to the record “Test” in Accounts.

Share Point and CRM have combined into a super awesome force of doom. But we’re still missing 1 core piece of functionality (due to not picking a port when we installed CRM).

 

 

SAP Weekend : Part 2 – Using the Microsoft BizTalk Server for B2B Integration with SharePoint

This is Part 2 of my past weekend’s activities with SharePoint and SAP Integration methods.

 

In this post I am looking at how to use the BizTalk Adapter with SharePoint

 

Topics

  • Abstract
  • Goal
  • Business Scenario
  • Environment
  • Document Flow
  • Integration Steps
  • .NET Support
  • Summary

 

Abstract

In the past few years, the whole perspective of doing business has moved towards implementing Enterprise Resource Planning systems for key areas like marketing, sales and manufacturing operations. Today most of the large organizations which deal with all major world markets rely heavily on such key areas.

The operational systems of any organization span its worldwide network of marketing teams as well as its manufacturing and distribution facilities. In order to provide customers with realistic information, each of these systems needs to be integrated as part of the larger enterprise.

This ultimately results in a more efficient enterprise overall, providing more reliable information and better customer service. This paper addresses the integration of BizTalk Server and an Enterprise Resource Planning system, the need for their integration, and their role in the current e-business scenario.

 

Goal

There are several key business drivers, like customers and partners, that need to communicate on different fronts for a successful business relationship. To achieve this communication, various systems need to be integrated, which leads to evaluating and developing a B2B integration capability and e-business strategy. This improves the quality of business information at the organization's disposal—to improve delivery times, reduce costs, and offer customers a higher level of overall service.

To provide B2B capabilities, there is a need to give access to the business application data, providing partners with the ability to execute global business transactions. Facing internal integration and business-to-business (B2B) challenges on a global scale, the organization needs to look for a suitable solution.

To integrate the worldwide marketing, manufacturing and distribution facilities based on a core ERP with a variety of information systems, the organization needs to come up with a strategic deployment of integration technology products and integration service capabilities.

 

Business Scenario

Now take the example of the ABC Manufacturing Company, whose success rests on the strength of its Europe-wide trading relationships. The company recognizes the need to strengthen these relationships by processing orders faster and more efficiently than ever before.

The company needed a new platform that could integrate orders from several countries, accepting payments in multiple currencies and translating measurements according to each country's standards. The bottom line for ABC's e-strategy was to accelerate order processing. To achieve this, the basic necessity was to eliminate the multiple collections of data and the use of invalid data.

By using less paper, ABC would cut processing costs and speed up the information flow. Keeping this long term goal in mind, ABC Manufacturing Company can now think of integrating its four key countries into a new business-to-business (B2B) platform.

 

Here is another example, of the XYZ Marketing Company. Users visit this company's website to explore a variety of products for its thousands of customers all over the world. The company always understood that it could offer greater benefits to customers if it could more efficiently integrate its customers' back-end systems. With such integration, customers could enjoy the advantages of highly efficient e-commerce sites, where a visitor on the Web could place an order that would flow smoothly from the website to the customer's order entry system.

 

Some of those back-end order entry systems are built on the latest, most sophisticated enterprise resource planning (ERP) systems on the market, while others are built on legacy systems that have never been upgraded. Different customers require information formatted in different ways, but XYZ has no elegant way to transform the information coming out of the website to meet customer needs. With the traditional approach:

For each new e-commerce customer on the site, XYZ's staff needs to spend a significant amount of time creating a transformation application that facilitates the exchange of information. With a better approach, XYZ needs a robust messaging solution that provides the flexibility and agility to meet a range of customer needs quickly and effectively. Again, XYZ can think of integrating its customers' back-end systems with the help of a business-to-business (B2B) platform.

 

Environment

Many large-scale organizations maintain a centralized SAP environment as their core enterprise resource planning (ERP) system. The SAP system is used for the management and processing of all global business processes and practices. B2B integration mainly relies on asynchronous messaging, Electronic Data Interchange (EDI) and XML document transformation mechanisms to facilitate the transformation and exchange of information between any ERP system and other applications, including legacy systems.

For business document routing, transformation, and tracking, the existing SAP-XML/EDI technology road map needs an XML service engine. This allows the development of a complex set of mappings from and to SAP to meet internal and external XML/EDI technology and business strategy. Microsoft BizTalk Server is the best choice to handle the data interchange and mapping requirements. BizTalk Server has the most comprehensive development and management support among business-to-business platforms. Microsoft BizTalk Server and BizTalk XML Framework version 2.0 with Simple Object Access Protocol (SOAP) version 1.1 provide precisely the kind of messaging solution that is needed to facilitate integration in a cost-effective manner.

 

Document Flow

Friends, now let’s look at the actual flow of document from Source System to Customer Target System using BizTalk Server. When a document is created, it is sent to a TCP/IP-based Application Linking and Enabling (ALE) port—a BizTalk-based receive function that is used for XML conversion. Then the document passes the XML to a processing script (VBScript) that is running as a BizTalk Application Integration Component (AIC). The following figure shows how BizTalk Server acts as a hub between applications that reside in two different organizations:

The data is serialized to the customer/vendor XML format using the Extensible Stylesheet Language Transformations (XSLT) generated from the BizTalk Mapper using a BizTalk channel. The XML document is sent using synchronous Hypertext Transfer Protocol Secure (HTTPS) or another requested transport protocol such as the Simple Mail Transfer Protocol (SMTP), as specified by the customer.

The following figure shows steps for XML document transformation:

The total serialized XML result is passed back to the processing script that is running as a BizTalk AIC. An XML “receipt” document then is created and submitted to another BizTalk channel that serializes the XML status document into a SAP IDOC status message. Finally, a Remote Function Call (RFC) is triggered to the SAP instance/client using a compiled C++/VB program to update the SAP IDOC status record. A complete loop of document reconciliation is achieved. If the status is not successful, an e-mail message is created and sent to one of the Support Teams that own the customer/vendor business XML/EDI transactions so that the conflict can be resolved. All of this happens instantaneously in a completely event-driven infrastructure between SAP and BizTalk.

Integration Steps

Let’s talk about a very popular Order Entry and tracking scenario while discussing integration hereafter. The following sections describe the high-level steps required to transmit order information from Order Processing pipeline Component into the SAP/R3 application, and to receive order status update information from the SAP/R3 application.

The integration of AFS purchase order reception with SAP is achieved using the BizTalk Adapter for SAP (BTS-SAP). The IDOC handler is used by the BizTalk Adapter to provide the transactional support for bridging tRFC (Transactional Remote Function Calls) to MSMQ DTC (Distributed Transaction Coordinator). The IDOC handler is a COM object that processes IDOC documents sent from SAP through the Com4ABAP service, and ensures their successful arrival at the appropriate MSMQ destination. The handler supports the methods defined by the SAP tRFC protocol. When integrating purchase order reception with the SAP/R3 application, BizTalk Server (BTS) provides the transformation and messaging functionality, and the BizTalk Adapter for SAP provides the transport and routing functionality.

The following two sequential steps indicate how the whole integration takes place:

  • Purchase order reception integration
  • Order Status Update Integration

Purchase Order Reception Integration

  1. Suppose a new pipeline component is added to the Order Processing pipeline. This component creates an XML document that is equivalent to the OrderForm object that is passed through the pipeline. This XML purchase order is in Commerce Server Order XML v1.0 format, and once created, is sent to a special Microsoft Message Queue (MSMQ) queue created specifically for this purpose. Writing the order from the pipeline to MSMQ:

    The first step in sending order data to the SAP/R3 application involves building a new pipeline component to run within the Order Processing pipeline. This component must perform the following two tasks:

    A] Make an XML-formatted copy of the OrderForm object that is passing through the order processing pipeline. The GenerateXMLForDictionaryUsingSchema method of the DictionaryXMLTransforms object is used to create the copy.

    Private Function IPipelineComponent_Execute(ByVal objOrderForm As Object, _
        ByVal objContext As Object, ByVal lFlags As Long) As Long
    
    On Error GoTo ERROR_Execute
    
    Dim oXMLTransforms As Object
    Dim oXMLSchema As Object
    Dim oOrderFormXML As Object
    
    ' Return 1 for Success.
    IPipelineComponent_Execute = 1
    
    ' Create a DictionaryXMLTransforms object.
    Set oXMLTransforms = CreateObject("Commerce.DictionaryXMLTransforms")
    
    ' Create a PO schema object.
    Set oXMLSchema = oXMLTransforms.GetXMLFromFile(sSchemaLocation)
    
    ' Create an XML version of the order form.
    Set oOrderFormXML = oXMLTransforms.GenerateXMLForDictionaryUsingSchema _
        (objOrderForm, oXMLSchema)
    
    WritePO2MSMQ sQueueName, oOrderFormXML.xml, PO_TO_ERP_QUEUE_LABEL, _
        sBTSServerName, AFS_PO_MAXTIMETOREACHQUEUE
    
    Exit Function
    
    ERROR_Execute:
    App.LogEvent "QueuePO.CQueuePO -> Execute Error: " & _
    vbCrLf & Err.Description, vbLogEventTypeError
    
    ' Set warning level.
    IPipelineComponent_Execute = 2
    Resume Next
    
    End Function

    B] Send the newly created XML order document to the MSMQ queue defined for this purpose.

    Option Explicit
    
    ' MSMQ constants.
    
    ' Access modes.
    Const MQ_RECEIVE_ACCESS = 1
    Const MQ_SEND_ACCESS = 2
    Const MQ_PEEK_ACCESS = 32
    
    ' Sharing modes.
    Const MQ_DENY_NONE = 0
    Const MQ_DENY_RECEIVE_SHARE = 1
    
    ' Transaction options.
    Const MQ_NO_TRANSACTION = 0
    Const MQ_MTS_TRANSACTION = 1
    Const MQ_XA_TRANSACTION = 2
    Const MQ_SINGLE_MESSAGE = 3
    
    ' Error messages.
    Const MQ_ERROR_QUEUE_NOT_EXIST = -1072824317
    
    ' MQ message acknowledgement options.
    Const MQMSG_ACKNOWLEDGMENT_FULL_REACH_QUEUE = 5
    Const MQMSG_ACKNOWLEDGMENT_FULL_RECEIVE = 14
    
    ' Default MaxTimeToReachQueue in seconds.
    Const DEFAULT_MAX_TIME_TO_REACH_QUEUE = 20
    
    Function WritePO2MSMQ(sQueueName As String, sMsgBody As String, _
        sMsgLabel As String, sServerName As String, _
        Optional MaxTimeToReachQueue As Variant) As Long
    
    Dim lMaxTime As Long
    
    If IsMissing(MaxTimeToReachQueue) Then
    lMaxTime = DEFAULT_MAX_TIME_TO_REACH_QUEUE
    Else
    lMaxTime = MaxTimeToReachQueue
    End If
    
    Dim objQueueInfo As MSMQ.MSMQQueueInfo
    Dim objQueue As MSMQ.MSMQQueue, objAdminQueue As MSMQ.MSMQQueue
    Dim objQueueMsg As MSMQ.MSMQMessage
    
    On Error GoTo MSMQ_Error
    
    Set objQueueInfo = New MSMQ.MSMQQueueInfo
    objQueueInfo.FormatName = "DIRECT=OS:" & sServerName & "\PRIVATE$\" & sQueueName
    
    Set objQueue = objQueueInfo.Open(MQ_SEND_ACCESS, MQ_DENY_NONE)
    
    Set objQueueMsg = New MSMQ.MSMQMessage
    
    objQueueMsg.Label = sMsgLabel ' Set the message label property
    objQueueMsg.Body = sMsgBody ' Set the message body property
    objQueueMsg.Ack = MQMSG_ACKNOWLEDGMENT_FULL_REACH_QUEUE
    objQueueMsg.MaxTimeToReachQueue = lMaxTime
    
    objQueueMsg.send objQueue, MQ_SINGLE_MESSAGE
    
    objQueue.Close
    
    On Error Resume Next
    Set objQueueMsg = Nothing
    Set objQueue = Nothing
    Set objQueueInfo = Nothing
    
    Exit Function
    
    MSMQ_Error:
    App.LogEvent "Error in WritePO2MSMQ: " & Error
    Resume Next
    
    End Function
    
  2. A BTS MSMQ receive function picks up the document from the MSMQ queue and sends it to a BTS channel that has been configured for this purpose. Receiving the XML order from MSMQ: The second step in sending order data to the SAP/R3 application involves BTS receiving the order data from the MSMQ queue into which it was placed at the end of the first step. You must configure a BTS MSMQ receive function to monitor the MSMQ queue to which the XML order was sent in the previous step. This receive function forwards the XML message to the configured BTS channel for transformation.
  3. The third step in sending order data to the SAP/R3 application involves BTS transforming the order data from Commerce Server Order XML v1.0 format into ORDERS01 IDOC format. A BTS channel must be configured to perform this transformation. After the transformation is complete, the BTS channel sends the resulting ORDERS01 IDOC message to the corresponding BTS messaging port. The BTS messaging port is configured to send the transformed message to an MSMQ queue called the 840 Queue. Once the message is placed in this queue, the BizTalk Adapter for SAP is responsible for further processing. 
  4. BizTalk Adapter for SAP sends the ORDERS01 document to the DCOM Connector (get more information on the DCOM Connector from www.sap.com/bapi), which writes the order to the SAP/R3 application. The DCOM Connector is an SAP software product that provides a mechanism to send data to, and receive data from, an SAP system. When an IDOC message is placed in the 840 Queue, the DCOM Connector retrieves the message and sends it to SAP for processing. Although this processing is in the domain of the BizTalk Adapter for SAP, the steps involved are reviewed here as background information:
    • Determine the version of the IDOC schema in use and generate a BizTalk Server document specification.
    • Create a routing key from the contents of the Control Record of the IDOC schema.
    • Request a SAP Destination from the Manager Data Store given the constructed routing key.
    • Submit the IDOC message to the SAP System using the DCOM Connector 4.6D Submit functionality.

Order Status Update Integration

Order status update integration can be achieved by providing a mechanism for sending information about updates made within the SAP/R3 application back to the Commerce Server order system.

The following sequence of steps describes such a mechanism:

  1. BizTalk Adapter for SAP processing:
    After a user has updated a purchase order using the SAP client, and the IDOC has been submitted to the appropriate tRFC port, the BizTalk Adapter for SAP uses the DCOM connector to send the resulting information to the 840 Queue, packaged as an ORDERS01 IDOC message. The 840 Queue is an MSMQ queue into which the BizTalk Adapter for SAP places IDOC messages so that they can be retrieved and processed by interested parties. This process is within the domain of the BizTalk Adapter for SAP, and is used by this solution to achieve the order update integration.
  2. Receiving the ORDERS01 IDOC message from MSMQ:
    The second step in updating order status from the SAP/R3 application involves BTS receiving ORDERS01 IDOC message from the MSMQ queue (840 Queue) into which it was placed at the end of the first step. You must configure a BTS MSMQ receive function to monitor the 840 Queue into which the XML order status message was placed. This receive function must be configured to forward the XML message to the configured BTS channel for transformation.
  3. Transforming the order update from IDOC format:
    Using a BTS MSMQ receive function, the document is retrieved and passed to a BTS transformation channel. The BTS channel transforms the ORDERS01 IDOC message into Commerce Server Order XML v1.0 format, and then forwards it to the corresponding BTS messaging port. You must configure a BTS channel to perform this transformation. The following BizTalk Server (BTS) map, used in the prototyping of this solution, demonstrates transforming an SAP ORDERS01 IDOC message into an XML document in Commerce Server Order XML v1.0 format. It allows a change to an order in the SAP/R3 application to be reflected in the Commerce Server orders database.

    This map used in the prototype only maps the order ID, demonstrating how the order in the SAP/R3 application can be synchronized with the order in the Commerce Server orders database. The mapping of other fields is specific to a particular implementation, and was not done for the prototype.

<xsl:stylesheet xmlns:xsl='http://www.w3.org/1999/XSL/Transform'
xmlns:msxsl='urn:schemas-microsoft-com:xslt' xmlns:var='urn:var'
xmlns:user='urn:user' exclude-result-prefixes='msxsl var user'
version='1.0'>
<xsl:output method='xml' omit-xml-declaration='yes' />
<xsl:template match='/'>
<xsl:apply-templates select='ORDERS01'/>
</xsl:template>
<xsl:template match='ORDERS01'>
<orderform>

<!-- Connection from source node "BELNR" to destination node "OrderID" -->

<xsl:if test='E2EDK02/@BELNR'>
<xsl:attribute name='OrderID'>
<xsl:value-of select='E2EDK02/@BELNR'/>
</xsl:attribute>
</xsl:if>
</orderform>
</xsl:template>
</xsl:stylesheet>

The BTS message port posts the transformed order update document to the configured ASP page for further processing. The configured ASP page retrieves the message posted to it and uses the Commerce Server OrderGroupManager and OrderGroup objects to update the order status information in the Commerce Server orders database.

  • Updating the Commerce Server order system:
    The fourth step in updating order status from the SAP/R3 application involves updating the Commerce Server order system to reflect the change in status. This is accomplished by adding the page _OrderStatusUpdate.asp to the AFS Solution Site and configuring the BTS messaging port to post the transformed XML document to that page. The update is performed using the Commerce Server OrderGroupManager and OrderGroup objects.

    The routine ProcessOrderStatus is the primary routine in the page. It uses the DOM and XPath to extract enough information to find the appropriate order using the OrderGroupManager object. Once the correct order is located, it is loaded into an OrderGroup object so that any of the entries in the OrderGroup object can be updated as needed.

    The following code implements page _OrderStatusUpdate.asp:

    <%@ Language="VBScript" %>
    
    <%
    const TEMPORARY_FOLDER = 2
    
    call Main()
    
    Sub Main()
    call ProcessOrderStatus( ParseRequestForm() )
    End Sub
    
    Sub ProcessOrderStatus(sDocument)
    
    Dim oOrderGroupMgr 
    Dim oOrderGroup 
    Dim rs
    Dim sPONum
    Dim oAttr 
    Dim vResult
    Dim vTracking 
    Dim oXML
    Dim dictConfig
    Dim oElement
    
    Set oOrderGroupMgr = Server.CreateObject("CS_Req.OrderGroupManager")
    Set oOrderGroup = Server.CreateObject("CS_Req.OrderGroup")
    
    Set oXML = Server.CreateObject("MSXML.DOMDocument")
    oXML.async = False
    
    If oXML.loadXML (sDocument) Then
    
    ' Get the orderform element.
    Set oElement = oXML.selectSingleNode("/orderform")
    
    ' Get the poNum.
    sPONum = oElement.getAttribute("OrderID")
    
    Set dictConfig = Application("MSCSAppConfig").GetOptionsDictionary("")
    
    ' Use ordergroupmgr to find the order by OrderID.
    oOrderGroupMgr.Initialize (dictConfig.s_CatalogConnectionString)
    Set rs = oOrderGroupMgr.Find(Array("order_requisition_number='" & sPONum & "'"), _
        Array(""), Array(""))
    
    If rs.EOF And rs.BOF Then
    'Create a new one. - Not implemented in this version.
    Else
    ' Edit the current one.
    oOrderGroup.Initialize dictConfig.s_CatalogConnectionString, rs("User_ID")
    
    ' Load the found order.
    oOrderGroup.LoadOrder rs("ordergroup_id")
    
    ' For the purposes of prototype, we only update the status
    oOrderGroup.Value.order_status_code = 2 ' 2 = Saved order
    
    ' Save it
    vResult = oOrderGroup.SaveAsOrder(vTracking)
    
    End If
    Else
    WriteError "Unable to load received XML into DOM."
    End If
    
    End Sub
    
    Function ParseRequestForm()
    
    Dim PostedDocument
    Dim ContentType
    Dim CharSet
    Dim EntityBody
    Dim Stream
    Dim StartPos
    Dim EndPos
    
    ContentType = Request.ServerVariables( "CONTENT_TYPE" )
    
    ' Determine request entity body character set (default to us-ascii).
    CharSet = "us-ascii"
    StartPos = InStr( 1, ContentType, "CharSet=""", 1)
    If (StartPos > 0 ) then
    StartPos = StartPos + Len("CharSet=""")
    EndPos = InStr( StartPos, ContentType, """",1 )
    CharSet = Mid (ContentType, StartPos, EndPos - StartPos )
    End If
    
    ' Check for multipart MIME message.
    PostedDocument = ""
    
    if ( ContentType = "" or Request.TotalBytes = 0) then
    
    ' Content-Type is required as well as an entity body.
    Response.Status = "406 Not Acceptable"
    Response.Write "Content-type or Entity body is missing" & VbCrlf
    Response.Write "Message headers follow below:" & VbCrlf
    Response.Write Request.ServerVariables("ALL_RAW") & VbCrlf
    Response.End
    Else
    If ( InStr( 1,ContentType,"multipart/" ) >

.NET Support

This Multi-Tier Application Environment can be implemented successfully with the help of a web portal which utilizes the Microsoft .NET Enterprise Server model. The Microsoft BizTalk Server Toolkit for Microsoft .NET provides the ability to leverage the power of XML Web services and Visual Studio .NET to build dynamic, transaction-based, fault-tolerant systems with full access to existing applications.

Summary

Microsoft BizTalk Server can help organizations quickly establish and manage Internet relationships with other organizations. It makes it possible for them to automate document interchange with any other organization, regardless of the conversion requirements and data formats used. This provides a cost-effective approach for integrating business processes across large Enterprise Resource Planning systems. The integration process is designed to facilitate collaborative e-commerce business processes. The process includes a document interchange engine, a business process execution engine, and a set of business document and server management tools. In addition, a business document editor and mapper tools are provided for managing trading partner relationships, administering server clusters, and tracking transactions.

Power BI connectivity to SAP BusinessObjects BI

Microsoft and SAP are jointly delivering business intelligence (BI) interoperability in Microsoft Excel, Microsoft Power BI for Office 365, and SAP BusinessObjects BI.

Microsoft Power Query for Excel seamlessly connects to SAP BusinessObjects BI Universes, enabling users to access and analyze data across the enterprise and share their data and insights through Power BI.

This connectivity drives a single version of truth, instant productivity, and optimized business performance for your organization.

Download Microsoft Power Query Preview for Excel

The preview contains SAP BusinessObjects BI Universe connectivity.

    Details

    Microsoft Power Query Preview for Excel, providing SAP BusinessObjects BI Universe connectivity, is an add-in that provides a seamless experience for data discovery, data transformation and enrichment for Information Workers, BI professionals and other Excel users. This preview provides an early look into the upcoming SAP BusinessObjects BI Universe connectivity feature. As with most previews, this feature may appear differently in the final product.

    System Requirements

    Supported operating systems

    • Windows Vista (requires .NET 3.5 SP1)
    • Windows Server 2008 (requires .NET 3.5 SP1)
    • Windows 7
    • Windows 8
    • Windows 8.1

    The following Office versions are supported:

    • Microsoft Office 2010 Professional Plus with Software Assurance
    • Microsoft Office 2013 Professional Plus, Office 365 ProPlus or Excel 2013 Standalone

    Microsoft Power Query Preview for Excel requires Internet Explorer 9 or greater.

    Microsoft Power Query Preview for Excel is available for 32-bit (x86) and 64-bit (x64) platforms; your selection must match the architecture of the installed version of Office.

    Installation Instructions

    Download the version of the Power Query add-in that matches the architecture (x86 or x64) of your Office installation. Run the MSI installer and follow the setup steps.

    Access and analyze your trusted enterprise data

    Learn how Microsoft and SAP deliver a combination of trusted enterprise data and familiar market leading tools.

    Single version of truth

    Deliver the latest, accurate and trusted data from across the enterprise, such as from SAP applications, directly into the hands of users in Microsoft Excel. They no longer need to constantly copy and paste or import data using a manual process leading to inaccuracy. Users can instead focus on leveraging their knowledge to analyze data from within and outside your organization. They can get answers and uncover new insights to better deal with the challenges facing your organization, eliminating costly decisions based on inaccurate data.

    Instant productivity

    Users can continue to work in their familiar Microsoft Excel environment with access to business friendly terms from SAP BusinessObjects BI Universes at their fingertips, allowing for deeper analysis on their own. Using familiar tools enables them to easily integrate data and insights into existing workflows without the need to learn new complex tools and skills. Any uncovered data and insights can be kept up to date with no hassle refreshing from on-premises and the cloud, increasing productivity.

    Optimized business performance

    Leveraging existing investments from both companies together enables your organization to unlock insights faster and react accordingly. Your organization can identify patterns, cost drivers, and opportunities for savings in an agile, accurate, and visual manner. Specific trends and goals can be measured while having visually attractive and up to date dashboards. Relying on trusted enterprise data reduces costs and increases profitability by allowing faster, better, and timelier decisions. All of this drives broader BI adoption to create an information driven culture across your organization.

    Unlocked data and insights

    Drive trusted enterprise data and insights from an SAP BusinessObjects BI Universe throughout your organization by sharing and collaborating with Power BI from anywhere. Anyone can create a collaborative BI site to share data and insights relying on the latest data from either on-premises or the cloud using scheduled refreshing. Users no longer have to struggle to think up and answer every single question in advance, instead interactively investigating data when they need it through natural language Q&A. Better yet, they can stay connected with mobile access to data and insights generating a deeper understanding of the business and communicating more effectively from anywhere.

     

     

    Duet Enterprise and NetWeaver – Features and Benefits for Businesses of Integrating SAP with SharePoint


    For years organisations have been scratching their heads trying to figure out how to provide collaboration capabilities on data held within SAP systems.

    Many have developed custom solutions and achieved mixed results. We have also seen the rise of Duet 1.x from Microsoft and SAP that provided 11 specific solutions that surfaced data residing in SAP via Microsoft Office. These were good solutions but limited due to their lack of extensibility.

    Hence the excitement over Duet Enterprise: the benefits that have been realised are exactly what everyone has been looking for. InfoSys Technologies have just released a case study with Microsoft that discusses their Duet Enterprise project, and it is an interesting read, particularly the benefits realised:

    1. Minimal Development Effort
    2. Higher Adoption
    3. Enhanced Productivity

    How can these benefits be real?

    Well, to start with, Duet Enterprise provides all the relevant plumbing to blend both SAP and SharePoint “Platforms”. SAP and Microsoft position Duet Enterprise as the “Foundation” layer between the two platforms.

    DE Architecture

    Architecturally speaking, the Duet Enterprise add-ons must be installed on SAP NetWeaver 7.02 servers and on the SharePoint 2010 farm. The SAP NetWeaver 7.02 servers work as the gateway to the SAP backend systems and can actually support any version of SAP system. Duet Enterprise is built on SharePoint 2010 and makes use of Business Connectivity Services, which enables full Create, Read, Update, Delete (CRUD) operations on data residing in external systems. The options for user experience are the same as for any SharePoint 2010 solution: mobile, Office or a browser. There are no SAP or Duet Enterprise clients.
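
    To make the Business Connectivity Services part more concrete, here is a rough sketch (not part of the original guidance) of how rows from a Duet Enterprise external list could be read in the browser with the SharePoint 2010 JavaScript client object model. The list title "Customers" and the field "CustomerName" are hypothetical examples:

      // Hypothetical sketch: read items from a BCS-backed external list ("Customers")
      // using the SharePoint 2010 JavaScript client object model (sp.js must be loaded).
      var ctx = SP.ClientContext.get_current();
      var list = ctx.get_web().get_lists().getByTitle('Customers'); // hypothetical list title
      var items = list.getItems(SP.CamlQuery.createAllItemsQuery());

      ctx.load(items);
      ctx.executeQueryAsync(
          function () {
              // Walk the returned external items and log one (hypothetical) field.
              var e = items.getEnumerator();
              while (e.moveNext()) {
                  console.log(e.get_current().get_item('CustomerName'));
              }
          },
          function (sender, args) {
              console.log('Request failed: ' + args.get_message());
          }
      );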

    Ok, but what does this do?

    This provides organisations with a jointly supported (Microsoft and SAP) technical approach and out-of-the-box (OOB) tools to address the common challenges faced when integrating SAP systems, such as:

    Authentication: Using Claims-Based authentication to ensure SSO is available between SharePoint 2010 sites and SAP backend systems.

    Authorisation: Able to secure objects in SharePoint using SAP roles, supporting concepts such as a manager seeing more than an employee.

    Monitoring: Duet Enterprise SharePoint health rules are provided to proactively monitor your Duet Enterprise solutions. Also available is the Duet Enterprise Management Pack for System Centre Operations Manager; a quick video shows it in action.

    It is also worth taking a look at the Duet Enterprise Architecture for more information.

    These are just some of the behind the scenes aspects of Duet Enterprise that essentially provide a jump start in any SAP/SharePoint integration project. However, there is also another layer that exemplifies how data from SAP can be surfaced through SharePoint 2010 and Office 2010. (Office 2010 isn’t a pre-req but certainly a better experience!).

    Currently included are:

    • Duet Enterprise Workflow
    • Duet Enterprise Profile
    • Duet Enterprise Collaboration
    • Duet Enterprise Sites
    • Duet Enterprise Reporting

    All of these provide a way to jump-start development of a solution that can be used OOB or easily extended to suit specific requirements. The result is that organisations are able to surface data typically locked away in backend SAP systems to a much wider audience, and in an inexpensive way. I may blog in the future on what each of these is and how to extend them.

    Once deployed, IT departments have a “Foundation” in place to support future extensibility. Once users start seeing what is possible I fully expect the flood gates to open with requests for composite applications.

    What else do you get?

    1. Long-term product roadmap commitment from SAP and Microsoft
    2. Platform for innovation: broad ecosystem of ISV and service partners offering Business Pack Solutions
    3. Partner solutions/apps certified by SAP

    Example of how to use the SAP NetWeaver Gateway in building a Cloud App

    Overview

    SAP provides a tool called SAP NetWeaver Gateway that makes it possible to expose SAP application data as an OData service. This OData service can then be used by a CBA to create custom line-of-business apps. SAP has several sample gateway services you can use for testing and app building. For our example, we will use the SAP Enterprise Procurement Model (EPM) service. Read the SAP documentation to learn how to access the EPM service and other sample services from SAP. Be aware that these sample services are read-only; however, NetWeaver Gateway does support read-write services.
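
    As an aside (not a required step in this walkthrough), the snippet below illustrates what such a Gateway OData service looks like on the wire by querying the EPM ProductCollection directly with basic authentication. Node 18+ is assumed for the built-in fetch, and SAP_USER / SAP_PASSWORD are placeholders for the credentials you receive when signing up for the SAP test feeds:

      // Illustrative only: query the sample EPM ProductCollection via plain OData.
      // Assumes Node 18+ (built-in fetch); SAP_USER / SAP_PASSWORD are placeholder credentials.
      const serviceRoot = "https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV";
      const auth = "Basic " + Buffer.from(
        process.env.SAP_USER + ":" + process.env.SAP_PASSWORD
      ).toString("base64");

      async function listProducts() {
        const response = await fetch(serviceRoot + "/ProductCollection?$top=5&$format=json", {
          headers: { Authorization: auth }
        });
        if (!response.ok) throw new Error("OData request failed: " + response.status);
        const payload = await response.json();
        // OData v2 services typically wrap the entity set in a "d.results" array.
        payload.d.results.forEach(function (product) {
          console.log(product.ProductId, product.Name);
        });
      }

      listProducts().catch(console.error);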

    Our SAP CBA app will be based on a fictional company that sells computers and accessories. This company has several locations worldwide, including a distribution branch that we will be building a line-of-business app for, named Contoso Shipping Management. Specifically, our app will help the branch manager of Contoso Shipping Management with their daily tasks. The branch manager routinely views product information in the system and adds supplemental product information that is specific to their branch (such as the item location and whether items are out of stock).

    Define the data model

    Begin by creating a CBA app in Visual Studio. Choose the Cloud Business App project template under the Office/SharePoint > Apps node.

    Attach to SAP Data Source

    When you have created the app, attach it to the SAP service.

    1. In the Server Explorer, under the Server project, choose Data Sources > Add Data Source.
    2. In the Attach Data Source Wizard, notice the option to select SAP as a data source; after selecting it, choose Next.
      Figure 1. Select SAP in the Attach Data Source Wizard
    3. On the Enter Connection Information page, enter the URL to the SAP EPM service along with the credentials that you received after signing up for access to the test feeds; choose Next. Although it is possible to select None for the authentication type, SAP feeds are typically configured to require authentication (CBA apps currently support connecting to SAP using basic authentication). For more information, see the Authentication section at the end of this post.
      Figure 2. Enter connection information in the Attach Data Source Wizard
    4. On the Choose your Entities page, select the BusinessPartner and Product entities and rename the data source to SAP_EPM_Service; choose Finish.
      Figure 3. Select the BusinessPartner and Product entities in the Attach Data Source Wizard

    As a result, you will now see the SAP_EPM_Service added as a data source to your Server project including the BusinessPartner and Product entities that you selected.

    Figure 4. Entity Designer showing the Product entity

    You can use the Ctrl + Up Arrow and Ctrl + Down Arrow keys to change the order of the properties. It is useful to define the desired order on the entity, so that later, when screens are created, the fields on the screen will automatically be added in the same order. For example, you may change the order of the properties so that ProductId, Name, ProductURL, and Description appear first.

    One feature of an SAP data source within a CBA is the recognition of certain SAP-specific annotations that can adorn entity properties within the service. Specifically, the annotations that will be recognized by a CBA are those that have the sap:semantics value set to “email”, “tel”, or “url”. The BusinessPartner entity that was selected in the Attach Data Source Wizard happens to have properties that demonstrate all three of these annotations. You can view the semantic annotations by viewing the $metadata from the SAP feed.

    https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV/$metadata
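
    For the curious, a small throwaway script along these lines (an illustration only, under stated assumptions: Node 18+, placeholder credentials, and a crude regex instead of a real XML parser) can list the properties in that $metadata document that carry an sap:semantics annotation; against the EPM feed you should see values such as email, tel, and url on the BusinessPartner properties:

      // Illustrative only: scan the EPM $metadata for sap:semantics annotations.
      // Assumes Node 18+ (built-in fetch) and placeholder credentials; a real client would
      // use an XML parser, and this regex assumes Name precedes sap:semantics on the element.
      const metadataUrl =
        "https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV/$metadata";
      const auth = "Basic " + Buffer.from(
        process.env.SAP_USER + ":" + process.env.SAP_PASSWORD
      ).toString("base64");

      async function listSemanticAnnotations() {
        const response = await fetch(metadataUrl, { headers: { Authorization: auth } });
        const xml = await response.text();
        const pattern = /<Property[^>]*Name="([^"]+)"[^>]*sap:semantics="([^"]+)"/g;
        let match;
        while ((match = pattern.exec(xml)) !== null) {
          console.log(match[1] + ' -> sap:semantics="' + match[2] + '"');
        }
      }

      listSemanticAnnotations().catch(console.error);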


    Viewing the BusinessPartner entity in the Entity Designer, observe that the EmailAddress, PhoneNumber, and WebAddress fields have their respective types set to the Email Address, Phone Number, and Web Address business types.

    Figure 5. Properties on the BusinessPartner entity have been set to the appropriate business type

    Extend the Product Entity Properties

    For our line of business app, we need to track some additional product information that is specific to our branch, Contoso Shipping Management. With a CBA, we can easily extend any entity properties by relating data from the internal database of the app with data from an external data source. Furthermore, CBAs support relating data between external data sources, such as SharePoint/Office 365 and SAP.

    Add a Relationship

    In our example, we would like to track two additional pieces of product information: the item location and whether a product is out of stock. First, we need to add an entity to the internal database of our app that will be used to store this additional information. Second, we will relate this entity to the Product entity (that exists in the SAP data source) using a one-to-one or zero-to-one relationship.

    1. To add an entity in the internal database, choose the Data Sources node in the Solution Explorer and select Add Table.
    2. Rename the table to ProductDetail and add the following properties: OutOfStock and BackroomLocation. Clear the Required check box for these properties.
      Figure 6. The ProductDetail entity
    3. Add a relationship between Product and ProductDetail. To do this, open Product in the entity designer and choose the Add: Relationship button.
      Figure 7. Adding a relationship
    4. In the Add New Relationship dialog box, add a relationship so that each Product can have one ProductDetail (and a ProductDetail must have a Product). Choose OK to close the dialog box.
      Figure 8. Configuring the relationship

    Create the client screens

    Now that the data model is defined, add some screens to the app. While working with the screens, remember that the sample SAP service that we are using is read-only. As a result, the only data that we can edit is the ProductDetail entity because it is stored in the internal database of the app. If instead we were using a read-write SAP service, we would be able to edit all of the information on these screens and automatically save the data back to SAP.

    Create the Common Screen Set

    1. Choose the Screens node in the Solution Explorer and select Add Screen.
    2. In the Add New Screen dialog box, select the Common Screen Set template and set the Screen Data to SAP_EPM_Service.ProductCollection. Finally, choose OK to close the dialog box.
      Figure 9. Adding a new Common Screen Set

      As a result, you will now see a ProductCollection folder created in the Solution Explorer that contains three screens: AddEditProduct, BrowseProductCollection, and ViewProduct.

    3. Since we defined a one-to-one or zero-to-one relationship between Product and ProductDetail, we need to add code that automatically creates a new ProductDetail entity when the AddEditProduct screen is opened. Choose the AddEditProduct screen from the Solution Explorer, choose the Write Code drop-down in the Screen Designer toolbar and choose created.
      Figure 10. Writing “created” code on the AddEditProduct screen

      To create a ProductDetail instance for this Product instance, use the following code.

      myapp.AddEditProduct.created = function (screen) {
        // Only create a related ProductDetail if this Product does not have one yet.
        if (!screen.Product.ProductDetail) {
          var productDetail = myapp.activeDataWorkspace.ApplicationData.ProductDetails.addNew();
          productDetail.Product = screen.Product;
        }
      };

      Now when the AddEditProduct screen is opened, a related ProductDetail is created, if one does not already exist.

    4. To make the fields that were added from the ProductDetail entity more prominent on the screen, move the Out of Stock and Backroom Location controls to the top of the right Rows Layout group on this screen. Do the same on the ViewProduct screen as well. Notice that you can also change other appearance properties on controls, such as the font. There are many other properties that you can set in the screen designer to customize the screens.
    5. Also on the ViewProduct screen, drag out the Supplier field under our ProductDetail controls to show data from the BusinessPartner entity. This will create a group called Supplier (with properties from the related BusinessPartner entity). For this example, remove all the fields from this group except Email Address, Phone Number, and Web Address. Notice that these controls appear respectively as an Email Viewer, Phone Viewer, and Web Address Viewer (due to the annotations feature described earlier).
      Figure 11. Control layout
    6. Finally, we want to display the Product images on the ViewProduct screen. First change the Product Pic Url control to be an Image control. To do this, choose the ViewProduct screen in the Solution Explorer, then find the Product Pic Url control on the screen and change it from a Text control to an Image control.
      Figure 12. Changing Product Pic Url to an Image control

      Because this SAP service stores the image URLs in a relative format, we need to write more code to set the full URL to the image. With the Product Pic Url control still selected, choose the Edit PostRender Code link in the Properties window.

      Figure 13. Properties of the Product Pic Url control

      Add the following code to the PostRender method.

      myapp.ViewProduct.ProductPicUrl_postRender = function (element, contentItem) {
        // Add the URL of our SAP server to the relative ProductPicUrl.
        var totalUri = "https://sapes1.sapdevcenter.com" + contentItem.value;
        $(element).find("img").attr("src", totalUri);
      };

    Run the app

    Now, run the app (press F5).

    If prompted, enter your SharePoint credentials. When the app starts, also choose Trust It (if prompted).

    Notice the following:

    • The app home screen allows you to browse Product data from the attached SAP data source.
    • When you choose a Product, the detailed Product information is displayed—this includes fields from both the attached SAP data source and the fields defined on the intrinsic ProductDetails entity.
      Figure 14. The ViewProduct screen
    • On the ViewProduct screen, you can choose the Edit button to open the AddEditProduct screen. Since the particular SAP service that the app is accessing is a read-only service, the fields defined on the SAP Product entity cannot be updated, but those defined on ProductDetail can be. If your app is accessing a read-only service, it is a good idea to remove the Add button on the ViewProduct screen and make the controls on the AddEditProduct screen “view” controls instead of “edit” controls.
      Figure 15. The AddEditProduct screen

    Additional notes

    Authentication

    SAP can be configured with a variety of authentication providers. For this release, we support HTTP Basic authentication. Basic authentication is enabled on most NetWeaver Gateway installations, and is easy to configure in both test and production. If you find that CBAs aren’t working with your SAP environment, let us know how we can support you better in the future.

    For our debut of SAP support, we wanted to enable a complete read and write scenario with SAP data. Therefore, in addition to basic authentication support, we’ve also implemented the session and CSRF token handling that SAP requires to be able to modify SAP data via the NetWeaver Gateway. This means that your CBAs will be able to write changes back to SAP if your SAP feeds support it. Don’t worry—when you attach to SAP data, we negotiate the SAP sessions and tokens automatically. There is nothing for you to configure.
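
    For readers who want to see what that negotiation looks like by hand, the sketch below shows the conventional SAP Gateway pattern of fetching an X-CSRF-Token with a GET and replaying it (together with the session cookies) on the modifying request. It is an illustration under stated assumptions (Node 18+, placeholder entity set, payload, and credentials); as described above, a CBA performs all of this automatically.

      // Illustrative sketch of the X-CSRF-Token handshake SAP Gateway expects for writes.
      // Assumes Node 18+ (built-in fetch); entity set, payload, and credentials are placeholders.
      const serviceRoot = "https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV";
      const auth = "Basic " + Buffer.from(
        process.env.SAP_USER + ":" + process.env.SAP_PASSWORD
      ).toString("base64");

      async function createEntity(entitySet, payload) {
        // Step 1: fetch a CSRF token (and session cookies) with a harmless GET.
        const tokenResponse = await fetch(serviceRoot + "/", {
          headers: { Authorization: auth, "X-CSRF-Token": "Fetch" }
        });
        const csrfToken = tokenResponse.headers.get("x-csrf-token");
        // A real client would use a proper cookie jar; this is deliberately simplified.
        const cookies = tokenResponse.headers.get("set-cookie");

        // Step 2: replay the token and the same session on the modifying request.
        const response = await fetch(serviceRoot + "/" + entitySet, {
          method: "POST",
          headers: {
            Authorization: auth,
            "X-CSRF-Token": csrfToken,
            Cookie: cookies,
            "Content-Type": "application/json"
          },
          body: JSON.stringify(payload)
        });
        if (!response.ok) throw new Error("Write failed: " + response.status);
        return response.json();
      }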

    Non-addressable entities

    If your data fails to load and you see a diagnostics error similar to the following, it’s because you’re attempting to navigate directly to an SAP entity that is non-addressable.

    Error: The SalesOrderLineItemCollection is not addressable. Please use the Navigation Property via the SalesOrder Collection or Entity.

    For example, in the EPM service, the SalesOrderLineItemCollection entity set is marked as non-addressable which is evident in the service root document (for example, https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV).


    This means that SalesOrderLineItemCollection is a child of SalesOrderCollection and that you can only access it by navigating to it through the parent.

    To solve this, be sure that you do not have a Browse Data Screen that is bound directly to a non-addressable entity (for example, SalesOrderLineItem). Instead, bind the Browse Data Screen to the parent (for example, SalesOrder) and create a View Details Screen that displays the SalesOrderLineItems data for the selected SalesOrder. This is done for you if you use the Common Screen Set template and select the parent as the primary entity for the Browse Data Screen.
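
    As a concrete (hypothetical) illustration of the two request shapes, only the second of the URLs built below would succeed against the EPM service; the navigation property name and the key value are placeholders:

      // Hypothetical illustration of direct access versus navigation through the parent.
      // The navigation property name ("SalesOrderLineItems") and the key are placeholders.
      const serviceRoot = "https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV";

      // Fails: SalesOrderLineItemCollection is marked as non-addressable.
      const directUrl = serviceRoot + "/SalesOrderLineItemCollection";

      // Works: navigate from a specific SalesOrder to its line items.
      const viaParentUrl = serviceRoot + "/SalesOrderCollection('0500000000')/SalesOrderLineItems";

      console.log("Avoid:", directUrl);
      console.log("Use:  ", viaParentUrl);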

    Free integration guide -Microsoft Dynamics CRM Online and Office 365

    Combining the online services of Office 365 with Microsoft Dynamics CRM Online empowers your teams to work where and when they want with best-of-class cloud services.

    This guide is intended for Microsoft Dynamics CRM administrators and technical decision makers interested in exploring Office 365 services and how they integrate with Microsoft Dynamics CRM Online. Integration with Office 365 becomes increasingly relevant to Microsoft Dynamics CRM Online users as management of Microsoft Dynamics CRM Online shifts to the Microsoft online services environment.

    For a PDF version of this document, Integration Guide: Microsoft Dynamics CRM Online and Office 365, please visit http://download.microsoft.com/download/D/4/F/D4F5A3C3-E3CB-48C9-85DE-4ED0B7FFBD60/CRMO365Integration.pdf

    Some of what this paper covers:

    • Add an Office 365 trial subscription to Microsoft Dynamics CRM Online
    • Set up CRM Online to use Exchange Online
    • Set up CRM Online to use SharePoint Online
    • Set up CRM Online to use Lync Online
    • Set up CRM Online to use Yammer

    Why integrate SAP with SharePoint? Find out why!

    Mobile First…

    Probably everyone has heard of “Mobile first”. This concept is not only something that SAP is promoting (Going Mobile at SAP), but also partners, customers and analysts like Gartner or Forbes are clearly supporting this strategy.

    The concept and idea are very clear: if you design applications to run on a mobile device with its small screen, then everything, the whole user interface, has to be simple and intuitive.

    One of the most prominent examples of this strategy is SAP Fiori. Based on SAPUI5, beautiful and intuitive applications are created that are extremely well accepted by customers.


    … does not mean mobile only

    In a customer presentation that I attended some time ago I also heard this feedback. SAP Fiori is an extremely powerful step by SAP to address workers across devices, whether they work on a mobile, tablet or desktop. However, the customer continued saying:

    “Mobile is sexy, but only 5% of our workers are using mobile devices (for their jobs). The majority of my users are still using a desktop”.

    I thought this was a very interesting statement. Obviously I am also using a mobile device, but looking at a “day in my life”, I start by using my mobile device and then use lots of other technologies to interact with SAP data, and in a lot of cases these technologies are somehow powered by Microsoft.

    “Don’t forget the desktop”


    Still, the data that I am accessing when working on my desktop is very often the same data that I need to access from my mobile device, so I need to get SAP data into the tools that are running on my desktop. And the desktop is still dominated by Microsoft: depending on which numbers you want to believe, Windows is installed on 70% to 90% of desktops, and Microsoft Office is running on over 1 billion devices worldwide and is used in most organizations:


    With this in mind, it is clear that “mobile first” does not mean “mobile only”.

    Why don’t we take the concept and ideas — and actually the applications and scenarios — and make them available on the Microsoft desktop?

    Using SAP data from within Microsoft Office

    When you get a Purchase Order or Credit Memo and you are sitting in front of your Microsoft Outlook already writing an email — wouldn’t it be great to get this workflow directly as a Task in Outlook and be able to approve it from there?


    For the business partners that you just worked with on your tablet in your SAP Fiori application, wouldn’t it be beneficial to have the very same contacts in your Microsoft Outlook?


    Similarly, for the tracked time that you just viewed in your SAP Fiori My Timesheet app on your mobile device, wouldn’t it be good to have these entries in your Microsoft Outlook Calendar as well?


    94% of companies use Microsoft Office

    Similar to Microsoft Windows, Microsoft Office is still the dominant productivity suite among our customers. Not necessarily always the latest and greatest version, but definitely Microsoft Office. So in addition to having the power and flexibility of SAP Fiori, an integration with Microsoft Office would absolutely make sense. We could not only offer an additional user interface by bringing SAP data to known and used Microsoft UIs, but also improve user productivity by keeping the end user in the UI that they are currently working with.

    Going beyond these existing scenarios, it can also make a lot of sense to empower users to interact with data from SAP in a way they are already used to. When you think of mass data manipulation, Microsoft Excel probably comes to mind. So again, this Microsoft-based UI/tool could be the perfect choice for end users to work with data from SAP.


    The challenges in integrating SAP to MSFT world

    All these integrations have one fundamental problem: you have to make sure that the integration, the interoperability, between these known applications and data from SAP is easy, quick and secure. A recent study by IDC stated that more and more hobby developers build applications. Developing applications is getting easier and easier with impressive and powerful tools such as Visual Studio. However, developing and designing applications for the full application life cycle can be complicated. Gartner looked at the overall total cost of ownership of such applications. A very important factor in the total cost of ownership is the very first design of the application. How easily is the data consumed? How flexible is the application? How are security concerns like SSO, tracing, monitoring and scalability aspects handled?

    Keeping in mind the hobby developers, and also surveys that show that only very few developers have security and “enterprise-ready” know-how, this is a real issue.

    Nexus of forces

    The “nexus of forces” which Gartner started to highlight some time ago is a similar example of this. To quote:

    “The Nexus of Forces is the convergence and mutual reinforcement of social, mobility, cloud and information patterns that drive new business scenarios. Although these forces are innovative and disruptive on their own; together they are revolutionizing business and society, disrupting old business models and creating new leaders. The Nexus is the basis of the technology platform of the future.”

    from http://www.gartner.com/technology/research/nexus-of-forces/

    This technology platform needs a basis that not only addresses these forces, but also makes sure this is done in a secure and enterprise-compliant, enterprise-ready manner.

    SAP NetWeaver Gateway productivity accelerator for Microsoft!

    Luckily, this is exactly what SAP NetWeaver Gateway productivity accelerator for Microsoft addresses with its interoperability framework between SAP and Microsoft. Microsoft developers can stick to what they know. They can build applications for Microsoft Outlook, for Excel, or for any .NET-based application. They can be hobby developers from the business who “just” leverage templates and tools from Visual Studio to build beautiful applications; SAP NetWeaver Gateway productivity accelerator for Microsoft takes care of the interoperability and enterprise-readiness aspects.

    It gives IT the confidence and assurance that applications based on SAP NetWeaver Gateway productivity accelerator for Microsoft adhere to certain design patterns and comply with company-wide security and supportability requirements, and it helps applications scale from the very first POC for a few business users to the final roll-out to 10,000 users.

    With this framework, the business is suddenly empowered to respond to critical changes in the workplace at its own speed. It does not have to rely on IT to help, because IT has already provided the required framework to make these changes happen.

    By providing a bridge that brings the two worlds together and optimizes the investments in both Microsoft and SAP, SAP NetWeaver Gateway productivity accelerator for Microsoft boosts workplace agility. This initiative gives the business the flexibility it needs to leverage existing skills while still complying with corporate and IT regulations.

    How to successfully make the 2 worlds of SAP and SharePoint meet

    In essence, a ‘Duet Enterprise custom scenario’ development project is no different from other application development projects.

    The customer (either the real end user, or product management for packaged software) has an idea about what to achieve, plus functional and non-functional (quality) requirements to underpin the idea.

    Often this idea is still vague, and the requirement set is far from complete and in some aspects even contradictory.

    The waterfall approach of nevertheless starting application development (specify, build, test) typically ends up with wrong or failed end results: the application does not live up to the real customer needs and expectations, and the budget is largely overrun as a consequence of responding to requirement changes.

    The agile movement arose to improve on this bad practice. Central to it is the structural involvement of the user (or a representative) during the full course of the development project.

    Continuously, the user is in the lead on the specifics and the behaviour of the application functionality to be delivered.

    In Scrum, this responsibility is formalized within the product owner role.

    Application mockup
    A proven approach to enable the product owner to validate, correct, and ultimately confirm the application functionality is to make it something they can experience, something ‘touchable’.

    Provide the product owner with a realistic impression of the application behaviour and feature(s) that the project will build: a clickable prototype or mockup.

    Based on multiple positive experiences (also within non-Duet Enterprise projects, e.g. BI dashboards), we strongly believe in the added value and power of application mockups. We regard the role of the UX/UID expert as crucial for customer involvement, and thus for delivering the correct application functionality.

    Meeting of 2 worlds

    Yet a Duet Enterprise project is also very different from ‘normal’ projects.
    The cause lies in the meeting of two very different environments: IT technology, platform concepts, and human nature.

    The traditional SAP environment is transaction-oriented, with a focus on structured business process execution.

    The Microsoft environment, and in particular SharePoint, is far looser. Ad-hoc processes are the norm, and the individual user decides how to perform a specific (business) action. When these two worlds and ways of thinking come together in a Duet Enterprise project, the first outcomes are often dramatic.

    SAP and SharePoint architects, business analysts and developers do not understand each other’s concepts.
    Both ‘parties’ disagree on how and where (that is, on the SAP or the SharePoint side) custom development should be done.

    The personal preference on how to achieve the SAP + SharePoint integration is influenced by one’s own technology frames and familiar concepts.

    This carries the risk that the user is forgotten: the user’s interests must be central, and the goal is to deliver an optimal user experience.

    Align through a Blueprint of the Duet Enterprise scenarios

    To break through this, we include in our Duet Enterprise development process a blueprint step focused exclusively on the Duet Enterprise-specific scenarios.

    This blueprint is derived and agreed upon by a mixed group of SAP and SharePoint representatives. The goal of this step, and its end result, is to come to a mutual understanding of the solution direction for each of the Duet Enterprise scenarios.

    Self-imposed constraints are to stick as closely as possible to the standards of both the SAP business package and the SharePoint platform. Where feasible, we do allow custom development.

    Here we apply the Duet Enterprise integration patterns that we have identified and defined: the Consumer-oriented, Provider-oriented, and XML-Funneling patterns.

    Of course, the discussion and preferences about where to make custom adjustments still manifest within the Duet Enterprise blueprinting step.

    However, as it now has the full focus of all involved, mutual agreement is reached earlier and more effectively. This is the so-called ‘workshop effect’.

    Also, in this focused approach, comparable application situations are easier to recognize, and where identified, the earlier agreed solution approach can be reused.

    The starting point for the blueprint derivation is the User Interface Design as validated and confirmed by the product owner.
    In the blueprint we derive and define, per identified Duet Enterprise scenario, the following pieces:

    • Global description of the scenario: a screenshot of the mockup screen and a textual explanation. The purpose is to facilitate, at a high level, a common understanding of the functionality in the scenario; there is no intent to compete with or replace use cases.

    • Derivation, explanation and justification of the solution direction: at minimum an explanation of the proposed solution direction, but whenever applicable also the rationale for why an alternative solution direction was dropped.

    • SAP building blocks: standard RFCs and BAPIs, plus identification of needed custom SAP building blocks.

    
    Blueprint validation by Design Authority / Authorities

    A justified mantra nowadays within SAP and Microsoft IT departments is to avoid unnecessary or otherwise avoidable custom development, and instead stick close to the standard delivered functionality of both landscapes.

    It is also evident that fully excluding custom development is not feasible when you are developing company-specific custom scenarios.

    We facilitate consistency and continuity in the Duet Enterprise solution directions by conforming the blueprint to the Duet Enterprise Rules, Conventions and Guidelines.

    The Design Authorities of both the SAP and Microsoft IT departments validate the blueprint against these agreements. Deviation can be allowed, but requires the formal approval of the respective Design Authority (SAP, Microsoft, and in some cases both).

    SAP + SharePoint realizations based on the blueprint

    With the blueprint accepted and delivered, from here on the SAP and SharePoint teams in the project can start their own development activities for the Duet Enterprise scenarios.

    Since representatives of the teams were involved in deriving the blueprint, the concrete software design and development can be done relatively independently of the other party.

    Of course the data contract details of each SAP / Duet Enterprise interface are input for the SharePoint consumption.

    Regularly sharing the SAP design with the SharePoint developers can achieve this.