Category Archives: SAP

How To : Implement Business Data Connectivity in SharePoint 2013

Business Data Connectivity

Business Connectivity Services is a centralized infrastructure in SharePoint 2013 and Office 2013 that supports integrated data solutions. With Business Connectivity Services, you can use SharePoint 2013 and Office 2013 clients as interfaces into data that doesn’t live in SharePoint 2013 itself. For example, the external data might reside in a database and be accessed through the out-of-the-box Business Connectivity Services connector for that database.


Business Connectivity Services can also connect to data that is available through a web service, data that is published as an OData source, or many other types of external data. Business Connectivity Services does this through out-of-the-box or custom connectors.

External Content Types in BCS

External content types are the core of BCS. They enable you to manage and reuse the metadata and behaviors of a business entity, such as Customer or Order, from a central location. They enable users to interact with that external data and process it in a more meaningful way.

For more information about using external content types in BCS, see External content types in SharePoint 2013.

How to Connect to a SQL External Data Source

Open SharePoint Designer 2013 and click the Open Site icon:

Enter the URL of the site you want to open:

Enter your site credentials here:

Now we need to create a new external content type. Here we have options to change the name of the content type and to create the connection to the external data source:

Click the hyperlink text “Click here to discover the external data source operations”, and this window will open:

Click the “Add Connection” button to create a new connection. Here we have different options to select from: .NET Type, SQL Server, and WCF Service.

Here we select SQL Server, so we need to provide the server credentials:

Now, we can see all the tables and views from the database.

In this screen, we have the options for creating different types of operations against the database:

Click the Next button:

Parameters Configuration:

Options for Filter parameters Configuration:

Here we need to add a new external list. Click “External List”:

Select the site here and click the OK button:

Enter the list name here and click the OK button:

After that, refresh the SharePoint site. We can see the external list here; click on the list:

Here we have the error message “Access denied by Business Connectivity.”

Solution for this Error

In SharePoint Central Administration, click Manage service applications:

Click on the Business Data Connectivity Service:

Set the permission for this list:

Click OK after setting the permissions:

After that, refresh the site and hope this will work… but again there is a problem: an error message like Login failed for user “NT AUTHORITY\ANONYMOUS LOGON”.

Solution for this Error

We need to edit the connection properties and set the Authentication Mode to ‘BDC Identity’.

Then follow the steps below.

Open PowerShell and type the following lines:

$bdc = Get-SPServiceApplication |
where {$_ -match "Business Data Connectivity Service"}
$bdc.RevertToSelfAllowed = $true
$bdc.Update();

Now it’s working fine.

There is also a chance of one more error:

Database Connector has throttled the response.
The response from database contains more than '2000' rows. 
The maximum number of rows that can be read through Database Connector is '2000'. 
The limit can be changed via the 'Set-SPBusinessDataCatalogThrottleConfig' cmdlet

This happens because the number of records in the table exceeds the database connector throttling limit.

Solution for this Error

Follow the steps below:

Open PowerShell, type the following lines, and execute them:

$bcs = Get-SPServiceApplicationProxy | where {$_.GetType().FullName -eq ('Microsoft.SharePoint.BusinessData.SharedService.' + 'BdcServiceApplicationProxy')}
$BCSThrottle = Get-SPBusinessDataCatalogThrottleConfig -Scope database -ThrottleType items -ServiceApplicationProxy $bcs
Set-SPBusinessDataCatalogThrottleConfig -Identity $BCSThrottle -Maximum 1000000 -Default 20000

How To : Access SAP Business Data From Silverlight 4 Clients Using WCF RIA Services And LINQ to SAP

Introduction

The introduction of Microsoft’s WCF RIA Services for Silverlight 4 greatly simplified the development of N-tier business applications using Silverlight and ASP.NET. By using this technology, we can also easily access and integrate SAP business data in Silverlight clients.

This article shows how to provide a SAP domain service as a web service that will be consumed by a Silverlight client. The sample application will allow the user to query customer data. The service uses LINQ to SAP from Theobald Software to connect to a SAP R/3 system.

Project Setup

The first step in setting up a new Silverlight 4 project with WCF RIA Services is to create a solution using the Visual Studio template Silverlight Navigation Application:

Visual Studio 2010 then asks you to create an additional web application, which hosts the Silverlight application. It’s important to select the checkbox Enable WCF RIA Services (see screenshot below):

After clicking the Ok button, Visual Studio generates a solution with two projects, one Silverlight 4 project and one ASP.NET project. In the next section, we will create the SAP data access layer using the LINQ to SAP designer.

LINQ to SAP

The LINQ to SAP provider and its Visual Studio 2010 designer offer a very handy way to design SAP interfaces visually. The designer generates the code for the SAP data access layer automatically, similar to LINQ to SQL. The LINQ provider is part of the .NET library ERPConnect.net from Theobald Software. The company offers a demo version for download on its homepage.

The next step is to create the needed LINQ to SAP file by opening the Add New Item dialog:

LINQ to SAP is internally called LINQ to ERP.

Clicking the Add button creates a new ERP file and opens the LINQ designer. Now, drag the Function object from the toolbox and drop it onto the designer surface. If you have not entered the SAP connection data so far, you are now asked to do so:

Enter the connection data for your SAP R/3 system and then click the Ok button. Next, search for and select the SAP function module named SD_RFC_CUSTOMER_GET. The function module provides a list of customer data.

The RFC Function modules dialog opens and lets you define the necessary parameters:

In the above function dialog, change the method name to GetCustomers and mark the Pass checkbox for the NAME1 parameter in the Exports tab. Also set the variable name to namePattern. On the Tables tab, mark the Return checkbox for the table parameter CUSTOMER_T and set the table and structure name to CustomerTable and CustomerRow:

After clicking the Ok button and saving the ERP file, the LINQ designer will generate a SAPContext class which contains a method called GetCustomers with an input parameter named namePattern. This method executes a search for SAP customer data allowing the user to enter a wildcard pattern. The method returns a table of customer data:

On the LINQ designer level (click on the free part of the LINQ designer surface), the property Create Object Outside Of Context Class must be set to True:

Now, we finally add a Customer class which we use in our SAP domain service later on. This class and its values will be transmitted to the Silverlight client by WCF RIA Services. It’s important to set the Key attribute on the identifier fields for WCF RIA Services, otherwise the project will not compile:
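A minimal sketch of such a class is shown below. The property names are assumptions based on typical fields returned by SD_RFC_CUSTOMER_GET; only the Key attribute on the identifier is actually required by WCF RIA Services.

using System.ComponentModel.DataAnnotations;

public class Customer
{
    // WCF RIA Services requires a Key attribute on the identifier field.
    [Key]
    public string CustomerNumber { get; set; }   // e.g. KUNNR

    public string Name { get; set; }             // e.g. NAME1
    public string City { get; set; }             // e.g. ORT01
    public string PostalCode { get; set; }       // e.g. PSTLZ
    public string Street { get; set; }           // e.g. STRAS
}
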

That’s it! We now have our SAP data access layer ready to use and can start adding the domain service in the next section.

SAP Domain Service

The next step is to add the SAP domain service to our web project. A domain service is a specialized WCF service and is one of the core constructs of WCF RIA Services. The service exposes operations that can be called from the client generated code. On the client side, we use the domain context to access the domain service on the server side.

Add a new Domain Service Class and name it SAPService:

In the upcoming dialog, create an empty domain service class by just clicking the Ok button:

Next, we add the service operation GetCustomers to the SAP service with a name pattern parameter. The operation then returns a list of Customer objects. The Query attribute limits the result set to 200 entries.

The operation uses the visually designed SAP data access logic to retrieve the SAP customer data. First of all, an instance of the SAPContext class will be created using a connection string (see sample in code). For more details regarding the SAP connection string, see the ERPConnect.net manual.

The LINQ to SAP context class contains the GetCustomers method which we will call using the given namePattern parameter. Next, the operation creates an instance of the Customer class for each customer record returned by SAP.

The license code for the ERPConnect.net library is set in the constructor of our domain service class.
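A rough sketch of the operation follows. The connection string, the license call placeholder, and the CustomerRow field names are assumptions here; the exact values and generated member names come from your LINQ to SAP designer file and the ERPConnect.net manual.

using System.Collections.Generic;
using System.Linq;
using System.ServiceModel.DomainServices.Hosting;
using System.ServiceModel.DomainServices.Server;

[EnableClientAccess]
public class SAPService : DomainService
{
    public SAPService()
    {
        // Set the ERPConnect.net license code here (see the vendor documentation).
    }

    // The Query attribute limits the result set to 200 entries.
    [Query(ResultLimit = 200)]
    public IEnumerable<Customer> GetCustomers(string namePattern)
    {
        // The connection string format is described in the ERPConnect.net manual.
        var context = new SAPContext("USER=...;PASSWD=...;ASHOST=...;SYSNR=00;CLIENT=...;LANG=EN");

        // Call the visually designed SAP function module and map each row to the
        // Customer class defined earlier (row field names are assumptions).
        return context.GetCustomers(namePattern)
                      .Select(row => new Customer
                      {
                          CustomerNumber = row.KUNNR,
                          Name = row.NAME1,
                          City = row.ORT01,
                          PostalCode = row.PSTLZ,
                          Street = row.STRAS
                      })
                      .ToList();
    }
}
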

That’s all we need on the server side.

In the next section, we will implement the Silverlight client.

Silverlight Client

The implementation of the client side is straightforward. The home view contains a DataGrid control to display the list of customer data as well as a search area with TextBox and Button controls to allow users to enter a name search pattern.

The click event handler of the load button, called OnLoadButtonClick, will execute the SAP service. The boilerplate code to access the web service was generated by WCF RIA Services in the subfolder Generated_Code in the Silverlight project.

First of all, an instance of the SAPContext will be created. Then, we load the query GetCustomersQuery and execute the service operation on the server side using WCF RIA Services. If the domain service returns an error, the callback anonymous method will mark the error as handled and display the error message.

If the execution of the service operation succeeded, the result set gets displayed in the DataGrid control.
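A sketch of the client-side handler is shown below. The control names (NamePatternTextBox, CustomerDataGrid) are assumptions, and the SAPContext domain context is the code that WCF RIA Services generates in the Generated_Code folder; the Load/LoadOperation pattern is the standard WCF RIA Services calling convention.

using System.Windows;
using System.Windows.Controls;
// The SAPContext domain context is generated by WCF RIA Services under
// Generated_Code; its namespace depends on your project settings.

public partial class Home : Page
{
    private void OnLoadButtonClick(object sender, RoutedEventArgs e)
    {
        var context = new SAPContext();

        // Build the query for the GetCustomers domain operation.
        var query = context.GetCustomersQuery(NamePatternTextBox.Text);

        // Execute the operation asynchronously against the domain service.
        context.Load(query, loadOperation =>
        {
            if (loadOperation.HasError)
            {
                // Mark the error as handled and display the message.
                loadOperation.MarkErrorAsHandled();
                MessageBox.Show(loadOperation.Error.Message);
                return;
            }

            // Display the result set in the DataGrid.
            CustomerDataGrid.ItemsSource = loadOperation.Entities;
        }, null);
    }
}
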

The next screenshot shows the final result:

That’s it.

Summary

This article has shown how easily SAP customer data can be integrated into Silverlight clients using tools like WCF RIA Services and LINQ to SAP. It is quite simple to extend the SAP service to integrate all kinds of operations.

How To : SAP Integration with .Net 4.0 (SAP Connection Manager) & SharePoint

This is a simple, C# class library project to connect .NET applications with SAP.


This component internally uses SAP .NET Connector 3.0. The SAP .NET Connector is a development environment that enables communication between the Microsoft .NET platform and SAP systems.

This connector supports RFCs and Web services, and allows you to write different kinds of applications, such as Web Forms, Windows Forms, or console applications, in Microsoft Visual Studio .NET.

With the SAP .NET Connector, you can use all common programming languages, such as Visual Basic .NET, C#, or Managed C++.

Features
Using the SAP .NET Connector you can:

Write .NET Windows and Web form applications that have access to SAP business objects (BAPIs).

Develop client applications for the SAP Server.

Write RFC server applications that run in a .NET environment and can be started from the SAP system.

Following are the steps to configure this utility in your project:

Download and extract the attached file and place it on your machine. This package contains 3 libraries:

SAPConnectionManager.dll
sapnco.dll
sapnco_utils.dll

Now go to your project and add references to all three libraries. Sapnco.dll and sapnco_utils.dll are built-in libraries used by the SAP .NET Connector. SAPConnectionManager.dll is the main component which provides the connection between .NET and SAP.

Once the above steps are complete, you need to make certain entries related to the SAP server in your configuration file. Here are the sample entries that you have to maintain in your own project; change only the values to match your SAP system, and leave the rest unchanged.

<appSettings>
<add key="ServerHost" value="127.0.0.1"/>
<add key="SystemNumber" value="00"/>
<add key="User" value="sample"/>
<add key="Password" value="pass"/>
<add key="Client" value="50"/>
<add key="Language" value="EN"/>
<add key="PoolSize" value="5"/>
<add key="PeakConnectionsLimit" value="10"/>
<add key="IdleTimeout" value="600"/>
</appSettings>
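For reference, a destination configuration class such as SAPSystemConnect (used in the test code further below) typically implements the connector’s IDestinationConfiguration interface and feeds these appSettings values to the SAP .NET Connector. The following is only a sketch of what such a class might look like; the actual implementation inside SAPConnectionManager.dll may differ, and the pooling parameters are left as a comment.

using System.Configuration;
using SAP.Middleware.Connector;

public class SAPSystemConnect : IDestinationConfiguration
{
    // Returns the RFC connection parameters for a named destination.
    public RfcConfigParameters GetParameters(string destinationName)
    {
        if (destinationName != "Dev")
        {
            return null;
        }

        var parameters = new RfcConfigParameters();
        parameters.Add(RfcConfigParameters.AppServerHost, ConfigurationManager.AppSettings["ServerHost"]);
        parameters.Add(RfcConfigParameters.SystemNumber, ConfigurationManager.AppSettings["SystemNumber"]);
        parameters.Add(RfcConfigParameters.User, ConfigurationManager.AppSettings["User"]);
        parameters.Add(RfcConfigParameters.Password, ConfigurationManager.AppSettings["Password"]);
        parameters.Add(RfcConfigParameters.Client, ConfigurationManager.AppSettings["Client"]);
        parameters.Add(RfcConfigParameters.Language, ConfigurationManager.AppSettings["Language"]);
        // PoolSize, PeakConnectionsLimit, and IdleTimeout map to the connector's
        // connection-pooling parameters and can be added here in the same way.
        return parameters;
    }

    // No dynamic configuration changes in this sketch.
    public bool ChangeEventsSupported()
    {
        return false;
    }

    public event RfcDestinationManager.ConfigurationChangeHandler ConfigurationChanged;
}
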

To test this component, create one Windows application. Add references to sapnco.dll, sapnco_utils.dll, and SAPConnectionManager.dll in your project.

Paste the code below into your form’s Load event:

SAPSystemConnect sapCfg = new SAPSystemConnect();
RfcDestinationManager.RegisterDestinationConfiguration(sapCfg);
RfcDestination rfcDest = null;
rfcDest = RfcDestinationManager.GetDestination("Dev");

That’s it. Now you are successfully connected to your SAP server. Next, you need to call SAP business objects (BAPIs), extract the data, and store it in a DataSet or list.
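As a hedged illustration of that next step, the snippet below continues from the rfcDest variable obtained above and calls a standard read-only BAPI through the SAP .NET Connector. The BAPI, table, and field names are examples only; replace them with the business object you actually need.

// Requires: using SAP.Middleware.Connector; and using System.Collections.Generic;
RfcRepository repository = rfcDest.Repository;
IRfcFunction bapi = repository.CreateFunction("BAPI_COMPANYCODE_GETLIST");

// Execute the BAPI against the connected SAP server.
bapi.Invoke(rfcDest);

// Read the returned table and copy the rows into a list.
IRfcTable companyCodes = bapi.GetTable("COMPANYCODE_LIST");
List<string> results = new List<string>();
foreach (IRfcStructure row in companyCodes)
{
    results.Add(row.GetString("COMP_CODE") + " - " + row.GetString("COMP_NAME"));
}
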

Demo Code available on request!!

How to: Create a provider-hosted app for SharePoint to access SAP data via SAP Gateway for Microsoft

You can create an app for SharePoint that reads and writes SAP data, and optionally reads and writes SharePoint data, by using SAP Gateway for Microsoft and the Azure AD Authentication Library for .NET. This article provides the procedures for how you can design the app for SharePoint to get authorized access to SAP.



The following are prerequisites to the procedures in this article:


Code sample: SharePoint 2013: Using the SAP Gateway to Microsoft in an app for SharePoint

OAuth 2.0 in Azure AD enables applications to access multiple resources hosted by Microsoft Azure, and SAP Gateway for Microsoft is one of them. With OAuth 2.0, applications, in addition to users, are security principals. Application principals require authentication and authorization to protected resources in addition to (and sometimes instead of) users. The process involves an OAuth “flow” in which the application, which can be an app for SharePoint, obtains an access token (and refresh token) that is accepted by all of the Microsoft Azure-hosted services and applications that are configured to use Azure AD as an OAuth 2.0 authorization server. The process is very similar to the way that the remote components of a provider-hosted app for SharePoint get authorization to SharePoint as described in Creating apps for SharePoint that use low-trust authorization and its child articles. However, the ACS authorization system uses Microsoft Azure Access Control Service (ACS) as the trusted token issuer rather than Azure AD.

Tip
If your app for SharePoint accesses SharePoint in addition to accessing SAP Gateway for Microsoft, then it will need to use both systems: Azure AD to get an access token to SAP Gateway for Microsoft and the ACS authorization system to get an access token to SharePoint. The tokens from the two sources are not interchangeable. For more information, see Optionally, add SharePoint access to the ASP.NET application.

For a detailed description and diagram of the OAuth flow used by OAuth 2.0 in Azure AD, see Authorization Code Grant Flow. (For a similar description, and a diagram, of the flow for accessing SharePoint, see the steps in the Context Token flow.)

Create the Visual Studio solution

  1. Create an App for SharePoint project in Visual Studio with the following steps. (The continuing example in this article assumes you are using C#; but you can start an app for SharePoint project in the Visual Basic section of the new project templates as well.)
    1. In the New app for SharePoint wizard, name the project and click OK. For the continuing example, use SAP2SharePoint.
    2. Specify the domain URL of your Office 365 Developer Site (including a final forward slash) as the debugging site; for example, https://<O365_domain>.sharepoint.com/. Specify Provider-hosted as the app type. Click Next.
    3. Choose a project type. For the continuing example, choose ASP.NET Web Forms Application. (You can also make ASP.NET MVC applications that access SAP Gateway for Microsoft.) Click Next.
    4. Choose Azure ACS as the authentication system. (Your app for SharePoint will use this system if it accesses SharePoint. It does not use this system when it accesses SAP Gateway for Microsoft.) Click Finish.
  2. After the project is created, you are prompted to login to the Office 365 account. Use the credentials of an account administrator; for example Bob@<O365_domain>.onmicrosoft.com.
  3. There are two projects in the Visual Studio solution: the app for SharePoint project itself and an ASP.NET web forms project. Add the Active Directory Authentication Library (ADAL) package to the ASP.NET project with these steps:
    1. Right-click the References folder in the ASP.NET project (named SAP2SharePointWeb in the continuing example) and select Manage NuGet Packages.
    2. In the dialog that opens, select Online on the left. Enter Microsoft.IdentityModel.Clients.ActiveDirectory in the search box.
    3. When the ADAL library appears in the search results, click the Install button beside it, and accept the license when prompted.
  4. Add the Json.net package to the ASP.NET project with these steps:
    1. Enter Json.net in the search box. If this produces too many hits, try searching on Newtonsoft.json.
    2. When Json.net appears in the search results, click the Install button beside it.
  5. Click Close.

Register your web application with Azure AD

  1. Log in to the Azure Management portal with your Azure administrator account.
    Note
    For security purposes, we recommend against using an administrator account when developing apps.
  2. Choose Active Directory on the left side.
  3. Click on your directory.
  4. Choose APPLICATIONS (on the top navigation bar).
  5. Choose Add on the toolbar at the bottom of the screen.
  6. On the dialog that opens, choose Add an application my organization is developing.
  7. On the ADD APPLICATION dialog, give the application a name. For the continuing example, use ContosoAutomobileCollection.
  8. Choose Web Application And/Or Web API as the application type, and then click the right arrow button.
  9. On the second page of the dialog, use the SSL debugging URL from the ASP.NET project in the Visual Studio solution as the SIGN-ON URL. You can find the URL using the following steps. (You need to register the app initially with the debugging URL so that you can run the Visual Studio debugger (F5). When your app is ready for staging, you will re-register it with its staging Azure Web Site URL. Modify the app and stage it to Azure and Office 365.)
    1. Highlight the ASP.NET project in Solution Explorer.
    2. In the Properties window, copy the value of the SSL URL property. An example is https://localhost:44300/.
    3. Paste it into the SIGN-ON URL on the ADD APPLICATION dialog.
  10. For the APP ID URI, give the application a unique URI, such as the application name appended to the end of the SSL URL; for example https://localhost:44300/ContosoAutomobileCollection.
  11. Click the checkmark button. The Azure dashboard for the application opens with a success message.
  12. Choose CONFIGURE on the top of the page.
  13. Scroll to the CLIENT ID and make a copy of it. You will need it for a later procedure.
  14. In the keys section, create a key. It won’t appear initially. Click SAVE at the bottom of the page and the key will be visible. Make a copy of it. You will need it for a later procedure.
  15. Scroll to permissions to other applications and select your SAP Gateway for Microsoft service application.
  16. Open the Delegated Permissions drop down list and enable the boxes for the permissions to the SAP Gateway for Microsoft service that your app for SharePoint will need.
  17. Click SAVE at the bottom of the screen.

Configure the application to communicate with Azure AD

  1. In Visual Studio, open the web.config file in the ASP.NET project.
  2. In the <appSettings> section, the Office Developer Tools for Visual Studio have added elements for the ClientID and ClientSecret of the app for SharePoint. (These are used in the Azure ACS authorization system if the ASP.NET application accesses SharePoint. You can ignore them for the continuing example, but do not delete them. They are required in provider-hosted apps for SharePoint even if the app is not accessing SharePoint data. Their values will change each time you press F5 in Visual Studio.) Add the following two elements to the section. These are used by the application to authenticate to Azure AD. (Remember that applications, as well as users, are security principals in OAuth-based authentication and authorization systems.)
    <add key="ida:ClientID" value="" />
    <add key="ida:ClientKey" value="" />
    
  3. Insert the client ID that you saved from your Azure AD directory in the earlier procedure as the value of the ida:ClientID key. Leave the casing and punctuation exactly as you copied it and be careful not to include a space character at the beginning or end of the value. For the ida:ClientKey key use the key that you saved from the directory. Again, be careful not to introduce any space characters or change the value in any way. The <appSettings> section should now look something like the following. (The ClientId value may have a GUID or an empty string.)
    <appSettings>
      <add key="ClientId" value="" />
      <add key="ClientSecret" value="LypZu2yVajlHfPLRn5J2hBrwCk5aBOHxE4PtKCjIQkk=" />
      <add key="ida:ClientID" value="4da99afe-08b5-4bce-bc66-5356482ec2df" />
      <add key="ida:ClientKey" value="URwh/oiPay/b5jJWYHgkVdoE/x7gq3zZdtcl/cG14ss=" />
    </appSettings>
    
    Note
    Your application is known to Azure AD by the “localhost” URL you used to register it. The client ID and client key are associated with that identity. When you are ready to stage your application to an Azure Web Site, you will re-register it with a new URL.
  4. Still in the appSettings section, add an Authority key and set its value to the Office 365 domain (some_domain.onmicrosoft.com) of your organizational account. In the continuing example, the organizational account is Bob@<O365_domain>.onmicrosoft.com, so the authority is <O365_domain>.onmicrosoft.com.
    <add key="Authority" value="<O365_domain>.onmicrosoft.com" />
    
  5. Still in the appSettings section, add an AppRedirectUrl key and set its value to the page that the user’s browser should be redirected to after the ASP.NET app has obtained an authorization code from Azure AD. Usually, this is the same page that the user was on when the call to Azure AD was made. In the continuing example, use the SSL URL value with “/Pages/Default.aspx” appended to it as shown below. (This is another value that you will change for staging.)
    <add key="AppRedirectUrl" value="https://localhost:44322/Pages/Default.aspx" />
    
  6. Still in the appSettings section, add a ResourceUrl key and set its value to the APP ID URI of SAP Gateway for Microsoft (not the APP ID URI of your ASP.NET application). Obtain this value from the SAP Gateway for Microsoft administrator. The following is an example.
    <add key="ResourceUrl" value="http://<SAP_gateway_domain>.cloudapp.net/" />
    

    The <appSettings> section should now look something like this:

    <appSettings>
      <add key="ClientId" value="06af1059-8916-4851-a271-2705e8cf53c6" />
      <add key="ClientSecret" value="LypZu2yVajlHfPLRn5J2hBrwCk5aBOHxE4PtKCjIQkk=" />
      <add key="ida:ClientID" value="4da99afe-08b5-4bce-bc66-5356482ec2df" />
      <add key="ida:ClientKey" value="URwh/oiPay/b5jJWYHgkVdoE/x7gq3zZdtcl/cG14ss=" />
      <add key="Authority" value="<O365_domain>.onmicrosoft.com" />
      <add key="AppRedirectUrl" value="https://localhost:44322/Pages/Default.aspx" />
      <add key="ResourceUrl" value="http://<SAP_gateway_domain>.cloudapp.net/" />
    </appSettings>
    
  7. Save and close the web.config file.
    Tip
    Do not leave the web.config file open when you run the Visual Studio debugger (F5). The Office Developer Tools for Visual Studio change the ClientId value (not the ida:ClientID) every time you press F5. This requires you to respond to a prompt to reload the web.config file, if it is open, before debugging can execute.

Add a helper class to authenticate to Azure AD

  1. Right-click the ASP.NET project and use the Visual Studio item adding process to add a new class file to the project named AADAuthHelper.cs.
  2. Add the following using statements to the file.
    using Microsoft.IdentityModel.Clients.ActiveDirectory;
    using System.Configuration;
    using System.Web.UI;
    
    
  3. Change the access keyword from public to internal and add the static keyword to the class declaration.
    internal static class AADAuthHelper
    
  4. Add the following fields to the class. These fields store information that your ASP.NET application uses to get access tokens from AAD.
    private static readonly string _authority = ConfigurationManager.AppSettings["Authority"];
    private static readonly string _appRedirectUrl = ConfigurationManager.AppSettings["AppRedirectUrl"];
    private static readonly string _resourceUrl = ConfigurationManager.AppSettings["ResourceUrl"];     
            
    private static readonly ClientCredential _clientCredential = new ClientCredential(
                               ConfigurationManager.AppSettings["ida:ClientID"],
                               ConfigurationManager.AppSettings["ida:ClientKey"]);
    
    private static readonly AuthenticationContext _authenticationContext = 
                new AuthenticationContext("https://login.windows.net/common/" + 
                                          ConfigurationManager.AppSettings["Authority"]);
    
  5. Add the following property to the class. This property holds the URL to the Azure AD login screen.
    private static string AuthorizeUrl
    {
        get
        {
            return string.Format("https://login.windows.net/{0}/oauth2/authorize?response_type=code&redirect_uri={1}&client_id={2}&state={3}",
                _authority,
                _appRedirectUrl,
                _clientCredential.OwnerId,
                Guid.NewGuid().ToString());
        }
    }
    
    
  6. Add the following properties to the class. These cache the access and refresh tokens and check their validity.
    public static Tuple<string, DateTimeOffset> AccessToken
    {
        get {
    return HttpContext.Current.Session["AccessTokenWithExpireTime-" + _resourceUrl] 
           as Tuple<string, DateTimeOffset>;
        }
    
        set { HttpContext.Current.Session["AccessTokenWithExpireTime-" + _resourceUrl] = value; }
    }
    
    private static bool IsAccessTokenValid
    {
       get 
       { 
           return AccessToken != null &&
           !string.IsNullOrEmpty(AccessToken.Item1) &&
           AccessToken.Item2 > DateTimeOffset.UtcNow;
       }
    }
    
    private static string RefreshToken
    {
        get { return HttpContext.Current.Session["RefreshToken-" + _resourceUrl] as string; }
        set { HttpContext.Current.Session["RefreshToken-" + _resourceUrl] = value; }
    }
    
    private static bool IsRefreshTokenValid
    {
        get { return !string.IsNullOrEmpty(RefreshToken); }
    }
    
    
  7. Add the following methods to the class. These are used to check the validity of the authorization code and to obtain an access token from Azure AD by using either an authentication code or a refresh token.
    private static bool IsAuthorizationCodeNotNull(string authCode)
    {
        return !string.IsNullOrEmpty(authCode);
    }
    
    private static Tuple<Tuple<string,DateTimeOffset>,string> AcquireTokensUsingAuthCode(string authCode)
    {
        var authResult = _authenticationContext.AcquireTokenByAuthorizationCode(
                    authCode,
                    new Uri(_appRedirectUrl),
                    _clientCredential,
                    _resourceUrl);
    
        return new Tuple<Tuple<string, DateTimeOffset>, string>(
                    new Tuple<string, DateTimeOffset>(authResult.AccessToken, authResult.ExpiresOn), 
                    authResult.RefreshToken);
    }
    
    private static Tuple<string, DateTimeOffset> RenewAccessTokenUsingRefreshToken()
    {
        var authResult = _authenticationContext.AcquireTokenByRefreshToken(
                             RefreshToken,
                             _clientCredential.OwnerId,
                             _clientCredential,
                             _resourceUrl);
    
        return new Tuple<string, DateTimeOffset>(authResult.AccessToken, authResult.ExpiresOn);
    }
    
    
  8. Add the following method to the class. It is called from the ASP.NET code behind to obtain a valid access token before a call is made to get SAP data via SAP Gateway for Microsoft.
    internal static void EnsureValidAccessToken(Page page)
    {
        if (IsAccessTokenValid) 
        {
            return;
        }
        else if (IsRefreshTokenValid) 
        {
            AccessToken = RenewAccessTokenUsingRefreshToken();
            return;
        }
        else if (IsAuthorizationCodeNotNull(page.Request.QueryString["code"]))
        {
            Tuple<Tuple<string, DateTimeOffset>, string> tokens = null;
            try
            {
                tokens = AcquireTokensUsingAuthCode(page.Request.QueryString["code"]);
            }
            catch 
            {
                page.Response.Redirect(AuthorizeUrl);
            }
            AccessToken = tokens.Item1;
            RefreshToken = tokens.Item2;
            return;
        }
        else
        {
            page.Response.Redirect(AuthorizeUrl);
        }
    }
    
Tip
The AADAuthHelper class has only minimal error handling. For a robust, production quality app for SharePoint, add more error handling as described in this MSDN node: Error Handling in OAuth 2.0.

Create data model classes

  1. Create one or more classes to model the data that your app gets from SAP. In the continuing example, there is just one data model class. Right-click the ASP.NET project and use the Visual Studio item adding process to add a new class file to the project named Automobile.cs.
  2. Add the following code to the body of the class:
    public string Price;
    public string Brand;
    public string Model;
    public int Year;
    public string Engine;
    public int MaxPower;
    public string BodyStyle;
    public string Transmission;
    

Add code behind to get data from SAP via the SAP Gateway for Microsoft

  1. Open the Default.aspx.cs file and add the following using statements.
    using System.Net;
    using Newtonsoft.Json.Linq;
    
  2. Add a const declaration to the Default class whose value is the base URL of the SAP OData endpoint that the app will be accessing. The following is an example:
    private const string SAP_ODATA_URL = @"https://<SAP_gateway_domain>.cloudapp.net:8081/perf/sap/opu/odata/sap/ZCAR_POC_SRV/";
    
  3. The Office Developer Tools for Visual Studio have added a Page_PreInit method and a Page_Load method. Comment out the code inside the Page_Load method and comment out the whole Page_PreInit method. This code is not used in this sample. (If your app for SharePoint is going to access SharePoint, you will restore this code later. See Optionally, add SharePoint access to the ASP.NET application.)
  4. Add the following line to the top of the Page_Load method. This will ease the process of debugging because your ASP.NET application is communicating with SAP Gateway for Microsoft using SSL (HTTPS), but your “localhost:port” server is not configured to trust the certificate of SAP Gateway for Microsoft. Without this line of code, you would get an invalid certificate warning before Default.aspx opens. Some browsers allow you to click past this error, but some will not let you open Default.aspx at all.
    ServicePointManager.ServerCertificateValidationCallback = (s, cert, chain, errors) => true;
    
    Important
    Delete this line when you are ready to deploy the ASP.NET application to staging. See Modify the app and stage it to Azure and Office 365.
  5. Add the following code to the Page_Load method. The string you pass to the GetSAPData method is an OData query.
    if (!IsPostBack)
    {
        GetSAPData("DataCollection?$top=3");
    }
    
    
  6. Add the following method to the Default class. This method first ensures that the cache for the access token has a valid access token in it that has been obtained from Azure AD. It then creates an HTTP GET Request that includes the access token and sends it to the SAP OData endpoint. The result is returned as a JSON object that is converted to a .NET List object. Three properties of the items are used in an array that is bound to the DataListView.
    private void GetSAPData(string oDataQuery)
    {
        AADAuthHelper.EnsureValidAccessToken(this);
    
        using (WebClient client = new WebClient())
        {
            client.Headers[HttpRequestHeader.Accept] = "application/json";
            client.Headers[HttpRequestHeader.Authorization] = "Bearer " + AADAuthHelper.AccessToken.Item1;
            var jsonString = client.DownloadString(SAP_ODATA_URL + oDataQuery);
            var jsonValue = JObject.Parse(jsonString)["d"]["results"];
            var dataCol = jsonValue.ToObject<List<Automobile>>();
    
            var dataList = dataCol.Select((item) => {
                return item.Brand + " " + item.Model + " " + item.Price;
                }).ToArray();
    
            DataListView.DataSource = dataList;
            DataListView.DataBind();
        }
    }
    
    

Create the user interface

  1. Open the Default.aspx file and add the following markup to the form of the page:
    <div>
      <h3>Data from SAP via SAP Gateway for Microsoft</h3>
    
      <asp:ListView runat="server" ID="DataListView">
        <ItemTemplate>
          <tr runat="server">
            <td runat="server">
              <asp:Label ID="DataLabel" runat="server"
                Text="<%# Container.DataItem.ToString()%>" /><br />
            </td>
          </tr>
        </ItemTemplate>
      </asp:ListView>
    </div>
    
  2. Optionally, give the web page the “look ‘n’ feel” of a SharePoint page with the SharePoint Chrome Control and the host SharePoint website’s style sheet.

Test the app with F5 in Visual Studio

  1. Press F5 in Visual Studio.
  2. The first time that you use F5, you may be prompted to login to the Developer Site that you are using. Use the site administrator credentials. In the continuing example, it is Bob@<O365_domain>.onmicrosoft.com.
  3. The first time that you use F5, you are prompted to grant permissions to the app. Click Trust It.
  4. After a brief delay while the access token is being obtained, the Default.aspx page opens. Verify that the SAP data appears.

Optionally, add SharePoint access to the ASP.NET application


Of course, your app for SharePoint doesn’t have to expose only SAP data in a web page launched from SharePoint. It can also create, read, update, and delete (CRUD) SharePoint data. Your code behind can do this using either the SharePoint client object model (CSOM) or the REST APIs of SharePoint. The CSOM is deployed as a pair of assemblies that the Office Developer Tools for Visual Studio automatically included in the ASP.NET project and set to Copy Local in Visual Studio so that they are included in the ASP.NET application package. For information about using CSOM, start with How to: Complete basic operations using SharePoint 2013 client library code. For information about using the REST APIs, start with Understanding and Using the SharePoint 2013 REST Interface. Regardless of whether you use CSOM or the REST APIs to access SharePoint, your ASP.NET application must get an access token to SharePoint, just as it does to SAP Gateway for Microsoft. See Understand authentication and authorization to SAP Gateway for Microsoft and SharePoint above. The procedure below provides some basic guidance about how to do this, but we recommend that you first read the related articles on SharePoint authorization and OAuth tokens.

  1. Open the Default.aspx.cs file and uncomment the Page_PreInit method. Also uncomment the code that the Office Developer Tools for Visual Studio added to the Page_Load method.
  2. If your app for SharePoint is going to access SharePoint data, then you have to cache the SharePoint context token that is POSTed to the Default.aspx page when the app is launched in SharePoint. This is to ensure that the SharePoint context token is not lost when the browser is redirected following the Azure AD authentication. (You have several options for how to cache this context. See OAuth tokens.) The Office Developer Tools for Visual Studio add a SharePointContext.cs file to the ASP.NET project that does most of the work. To use the session cache, you simply add the following code inside the “if (!IsPostBack)” block before the code that calls out to SAP Gateway for Microsoft:
    if (HttpContext.Current.Session["SharePointContext"] == null) 
    {
         HttpContext.Current.Session["SharePointContext"]
            = SharePointContextProvider.Current.GetSharePointContext(Context);
    }
    
  3. The SharePointContext.cs file makes calls to another file that the Office Developer Tools for Visual Studio added to the project: TokenHelper.cs. This file provides most of the code needed to obtain and use access tokens for SharePoint. However, it does not provide any code for renewing an expired access token or an expired refresh token. Nor does it contain any token caching code. For a production quality app for SharePoint, you need to add such code. The caching logic in the preceding step is an example. Your code should also cache the access token and reuse it until it expires. When the access token is expired, your code should use the refresh token to get a new access token. We recommend that you read OAuth tokens.
  4. Add the data calls to SharePoint using either CSOM or REST. The following example is a modification of CSOM code that Office Developer Tools for Visual Studio adds to the Page_Load method. In this example, the code has been moved to a separate method and it begins by retrieving the cached context token.
    private void GetSharePointTitle()
    {
        var spContext = HttpContext.Current.Session["SharePointContext"] as SharePointContext;
        using (var clientContext = spContext.CreateUserClientContextForSPHost())
        {
            clientContext.Load(clientContext.Web, web => web.Title);
            clientContext.ExecuteQuery();
            SharePointTitle.Text = "SharePoint web site title is: " + clientContext.Web.Title;
        }
    }
    
  5. Add UI elements to render the SharePoint data. The following shows the HTML control that is referenced in the preceding method:
    <h3>SharePoint title</h3><asp:Label ID="SharePointTitle" runat="server"></asp:Label><br />
    
Note
While you are debugging the app for SharePoint, the Office Developer Tools for Visual Studio re-register it with Azure ACS each time you press F5 in Visual Studio. When you stage the app for SharePoint, you have to give it a long-term registration. See the section Modify the app and stage it to Azure and Office 365.

Modify the app and stage it to Azure and Office 365


When you have finished debugging the app for SharePoint using F5 in Visual Studio, you need to deploy the ASP.NET application to an actual Azure Web Site.

Create the Azure Web Site

  1. In the Microsoft Azure portal, open WEB SITES on the left navigation bar.
  2. Click NEW at the bottom of the page and on the NEW dialog select WEB SITE | QUICK CREATE.
  3. Enter a domain name and click CREATE WEB SITE. Make a copy of the new site’s URL. It will have the form my_domain.azurewebsites.net.

Modify the code and markup in the application

  1. In Visual Studio, remove the line ServicePointManager.ServerCertificateValidationCallback = (s, cert, chain, errors) => true; from the Default.aspx.cs file.
  2. Open the web.config file of the ASP.NET project and change the domain part of the value of the AppRedirectUrl key in the appSettings section to the domain of the Azure Web Site. For example, change <add key="AppRedirectUrl" value="https://localhost:44322/Pages/Default.aspx" /> to <add key="AppRedirectUrl" value="https://my_domain.azurewebsites.net/Pages/Default.aspx" />.
  3. Right-click the AppManifest.xml file in the app for SharePoint project and select View Code.
  4. In the StartPage value, replace the string ~remoteAppUrl with the full domain of the Azure Web Site including the protocol; for example https://my_domain.azurewebsites.net. The entire StartPage value should now be: https://my_domain.azurewebsites.net/Pages/Default.aspx. (Usually, the StartPage value is exactly the same as the value of the AppRedirectUrl key in the web.config file.)

Modify the AAD registration and register the app with ACS

  1. Log in to the Azure Management portal with your Azure administrator account.
  2. Choose Active Directory on the left side.
  3. Click on your directory.
  4. Choose APPLICATIONS (on the top navigation bar).
  5. Open the application you created. In the continuing example, it is ContosoAutomobileCollection.
  6. For each of the following values, change the “localhost:port” part of the value to the domain of your new Azure Web Site:
    • SIGN-ON URL
    • APP ID URI
    • REPLY URL

    For example, if the APP ID URI is https://localhost:44304/ContosoAutomobileCollection, change it to https://<my_domain>.azurewebsites.net/ContosoAutomobileCollection.

  7. Click SAVE at the bottom of the screen.
  8. Register the app with Azure ACS. This must be done even if the app does not access SharePoint and will not use tokens from ACS, because the same process also registers the app with the App Management Service of the Office 365 subscription, which is a requirement. You perform the registration on the AppRegNew.aspx page of any SharePoint website in the Office 365 subscription. For details, see Guidelines for registering apps for SharePoint 2013. As part of this process you will obtain a new client ID and client secret. Insert these values in the web.config for the ClientId (not ida:ClientID) and ClientSecret keys.
    Caution
    If for any reason you press F5 after making this change, the Office Developer Tools for Visual Studio will overwrite one or both of these values. For that reason, you should keep a record of the values obtained with AppRegNew.aspx and always verify that the values in the web.config are correct just before you publish the ASP.NET application.

Publish the ASP.NET application to Azure and install the app to SharePoint

  1. There are several ways to publish an ASP.NET application to an Azure Web Site. For more information, see How to Deploy an Azure Web Site.
  2. In Visual Studio, right-click the SharePoint app project and select Package. On the Publish your app page that opens, click Package the app. File explorer opens to the folder with the app package.
  3. Login to Office 365 as a global administrator, and navigate to the organization app catalog site collection. (If there isn’t one, create it. See Use the App Catalog to make custom business apps available for your SharePoint Online environment.)
  4. Upload the app package to the app catalog.
  5. Navigate to the Site Contents page of any website in the subscription and click add an app.
  6. On the Your Apps page, scroll to the Apps you can add section and click the icon for your app.
  7. After the app has installed, click its icon on the Site Contents page to launch the app.

For more information about installing apps for SharePoint, see Deploying and installing apps for SharePoint: methods and options.

Deploying the app to production


When you have finished all testing, you can deploy the app in production. This may require some changes.

  1. If the production domain of the ASP.NET application is different from the staging domain, you will have to change the AppRedirectUrl value in the web.config and the StartPage value in the AppManifest.xml file, and repackage the app for SharePoint. See the procedure Modify the code and markup in the application above.
  2. The change in domain also requires that you edit the app’s registration with AAD. See the procedure Modify the AAD registration and register the app with ACS above.
  3. The change in domain also requires that you re-register the app with ACS (and the subscription’s App Management Service) as described in the same procedure. (There is no way to edit an app’s registration with ACS.) However, it is not necessary to generate a new client ID or client secret on the AppRegNew.aspx page. You can copy the original values from the ClientId (not ida:ClientID) and ClientSecret keys of the web.config into the AppRegNew form. If you do generate new ones, be sure to copy the new values to the keys in web.config.

How To : Use Azure BizTalk Services to Integrate with an On-Premises SAP Server


Microsoft Azure BizTalk Services provides a rich set of integration capabilities, enabling organizations to create hybrid solutions in which their customer- or partner-facing applications are hosted on Azure, while the data related to customers or partners is stored on-premises in LOB applications.

To demonstrate how to integrate applications with an on-premises LOB application using BizTalk Services, let us consider a scenario involving two business partners, Fabrikam and Contoso.

Business Scenario

Contoso sends a purchase order (PO) message to Fabrikam in the X12 Electronic Data Interchange (EDI) format using the PO (X12 850) schema. Fabrikam (which uses an SAP Server to manage partner data) accepts POs from its partners using the ORDERS05 IDOC.

To enable Contoso to send a PO directly to Fabrikam’s on-premises SAP Server, Fabrikam decides to use Azure’s integration offering, BizTalk Services, to set up a hybrid integration scenario where the integration layer is hosted on Azure and the SAP Server sits within the organization’s firewall. Fabrikam uses BizTalk Services in the following ways to enable this hybrid integration scenario:

  1. Fabrikam uses the Microsoft Azure BizTalk Services SDK to create a BizTalk Service project. The project includes an XML One-Way Bridge to send messages to a relay endpoint, which in turn sends the message to the on-premises SAP system.
  2. Fabrikam uses the BizTalk Adapter Service component available with BizTalk Services to expose the Send operation on the ORDERS05 IDOC as an operation using a Service Bus relay endpoint. The XML One-Way Bridge sends messages to this relay endpoint. Fabrikam also creates the schema for the Send operation using BizTalk Adapter Service and includes the schema as part of the BizTalk Service project.
    Note
    A Send operation on an IDOC is an operation that is exposed by the BizTalk Adapter Pack on any IDOC to send the IDOC to the SAP Server. BizTalk Adapter Service uses BizTalk Adapter Pack to connect to an SAP Server.
  3. Fabrikam uses the Transform component available with BizTalk Services to create a map to transform the PO message in X12 format into the schema required by the SAP Server to invoke the Send operation on the ORDERS05 IDOC.
  4. Fabrikam uses the Microsoft Azure BizTalk Services Portal available with BizTalk Services to create and deploy an EDI agreement under the BizTalk Services subscription that processes the X12 850 PO message. As part of the message processing, the agreement also does the following:
    1. Receives an X12 850 PO message over FTP.
    2. Transforms the X12 PO message into the schema required by the SAP Server using the transform created earlier.
    3. Routes the transformed message to the XML One-Way Bridge that eventually routes the message to a relay endpoint created for sending a PO message to an SAP Server. Fabrikam earlier exposed (as explained in bullet 2 above) the Send operation on the ORDERS05 IDOC as a relay endpoint, to enable partners to send PO messages using BizTalk Adapter Service.

Once this is set up, Contoso drops an X12 850 PO message to the FTP location. This message is consumed by the EDI receive pipeline, which processes the message, transforms it to an ORDERS05 IDOC, and routes it to the intermediary XML bridge. The bridge then routes the message to the relay endpoint on Service Bus, which is then sent to the on-premises SAP Server. The following illustration represents the same scenario.

SAP integration scenario

How to Use This Article

 

This tutorial is written around a sample, SAPIntegration, which is available as part of the download (SAPIntegration.zip) from the MSDN Code Gallery. You could either use the SAPIntegration sample and go through this tutorial to understand how the sample was built or just use this tutorial to create your own application.

This tutorial is targeted towards the second approach so that you get to understand how this application was built. Also, to be consistent with the sample, the names of artifacts (e.g. schemas, transforms, etc.) used in this tutorial are the same as those in the sample.

The sample available from the MSDN code gallery contains only half the solution, which can be developed at design-time on your computer. The sample cannot include the configuration that you must do on the BizTalk Services Portal on Azure.

For that, you must follow the steps in this tutorial to set up your EDI bridge. Even though Microsoft recommends that you follow the tutorial to best understand the concepts and procedures, if you really wish to use the sample, this is what you should do:

  • Download the SAPIntegration.zip package, extract the SAPIntegration sample and make relevant changes like providing your service namespace, issuer name, issuer key, SAP Server details, etc. After changing the sample, deploy the application to get the endpoint URL at which the XML One-Way Bridge is deployed.
  • Use the BizTalk Services Portal to configure the Receive settings as described at Step 5: Create and Deploy the EDI Receive Pipeline and follow the procedures to route messages from the EDI Receive bridge to the XML One-Way Bridge you already deployed.
  • Drop a test message at the FTP location configured as part of the agreement and verify that the application works as expected.
    • If the message is successfully processed, it will be routed to the SAP Server and you can verify the ORDERS IDOC using the SAP GUI.
    • If the EDI agreement fails to process the message, the failure/error messages are routed to a relay endpoint on Service Bus. To receive such messages, you must set up a relay receiver service that receives any message that comes to that specific relay endpoint. More details on why you need this service and how to use it are available at Step 6: Test the Solution.

Steps in the Solution :

 

  • Step 1: Set up Your Computer
  • Step 2: Expose a Relay Endpoint to Invoke Operations on ORDERS05 IDOC
  • Step 3: Transform the X12 850 PO Message to the ORDERS05 Message
  • Step 4: Create and Deploy the XML Bridge
  • Step 5: Create and Deploy the EDI Receive Pipeline
  • Step 6: Test the Solution

Step 1: Set up Your Computer


This topic provides instructions and pointers for setting up the computer on which you will perform the steps to set up the hybrid integration scenario described in Tutorial: Using Azure BizTalk Services to Integrate with an On-Premises SAP Server. You must do the following to set up your computer:

  • Install Microsoft Azure BizTalk Services SDK. You can download the installer from http://go.microsoft.com/fwlink/?LinkId=235057. You use this SDK to configure and deploy the XML One-Way Bridge that sits between the EDI agreement and the relay endpoint.
  • Install BizTalk Adapter Service. You use this to expose the Send operation on an IDOC as a relay endpoint on Service Bus. You can download the installer from http://go.microsoft.com/fwlink/?LinkId=235057. Refer to the BizTalk Services installation guide at http://go.microsoft.com/fwlink/?LinkId=237092 to install the software prerequisites for BizTalk Services SDK and BizTalk Adapter Service.
  • Install the WCF LOB Adapter SDK. This is required for installing the Adapter Pack on the computer.
  • Install the Adapter Pack. This contains the SAP adapter that is required to establish connectivity to an SAP Server and for exposing SAP artifacts as operations.
  • Install the SAP client libraries. The SAP adapter needs these libraries to connect to an SAP Server. For information on where to install the SAP client libraries from, refer to the Adapter Pack installation guide (BizTalkAdapterPack_InstallationGuide.htm) at http://go.microsoft.com/fwlink/?LinkId=240161.
  • Download and extract the EDI message schemas (MicrosoftEdiXSDTemplates.zip) available at http://go.microsoft.com/fwlink/?LinkId=235057. This contains the X12 850 Purchase Order message schema that is required for the business scenario we use for this article.

After you have finished installing and downloading these components, you are ready to start setting up the business scenario.

Step 2: Expose a Relay Endpoint to Invoke Operations on ORDERS05 IDOC


There are two main steps required to expose an SAP artifact as an operation that can be invoked by sending a message over Service Bus – create an LOB Target and an LOB Relay.

  • An LOB Target defines how an Azure application communicates to the Line-of-Business (LOB) system. The LOB Target controls the LOB system connection URI, the operation to perform, and the connection credentials.
  • An LOB Relay is a WCF service that runs within an organization’s firewall and listens to a relay endpoint on the Service Bus. As the name suggests, the LOB Relay acts as a relay between the Service Bus relay endpoint and the LOB system. It receives the message at the Service Bus relay endpoint and passes it on to the relevant LOB system using the LOB Target configuration.

For more information, see BizTalk Adapter Service Architecture. In this topic, we will create an LOB Target and an LOB Relay to expose the Send operation on the ORDERS05 IDOC.

To create an LOB Target and LOB Relay

  1. Open Visual Studio (as an administrator), create a new BizTalk Service project, and name it SAPIntegration.
  2. You first start with adding a BizTalk Adapter Service server. This is the server where you installed the Runtime component of BizTalk Adapter Service. To add a BizTalk Adapter Service server, from the Server Explorer in Visual Studio, right-click BizTalk Adapter Services, and select Add BizTalk Adapter Service. In the Add BizTalk Adapter Service dialog box, enter the URL of the WCF service that monitors that Service Bus relay service, and then click OK.

    Because you have all the components of BizTalk Adapter Service installed on the same computer, the URL for that service will be http://localhost:8080/BAService/ManagementService.svc/.

    Note
    If you had installed BizTalk Adapter Service Runtime component on a separate computer, you would have replaced ‘localhost’ in the above URL with the name of that computer.
  3. In this tutorial we are creating an application to integrate with SAP, so we must add an SAP target. Expand the newly added server, expand LOB Types, right-click SAP, and select Add SAP Target.

    The Add a Target wizard starts. Perform the following steps to create an LOB Target.

    1. Read the information on the Before You Begin page, and then click Next.
    2. On the Connection Parameters page, specify the details for the SAP Server to connect to and the credentials to use for the connection. Click Next.
    3. On the Operations page, expand the ORDERS05 IDOC category (under \IDOC\ORDERS\). There are several versions of the IDOC available. For this tutorial, we’ll select ORDERS05.V3(700). Expand this IDOC, select Send, and then click the right arrow to add it to the Selected Operations box.

      Add Send operation for IDOC

      Click Next.

    4. On the Runtime Security page, specify the security mechanism used by the LOB Server to authenticate the target resource when a message arrives from a client. For this tutorial, select Fixed Username and specify the credentials to connect to the SAP server.
    5. On the Deployment page, you create an LOB Relay and an LOB Target to provide connectivity to your on-premise LOB applications from the cloud.

      Select the Create new option to create a new relay and provide the following values:

      Namespace: Specify the Service Bus namespace on which the LOB relay endpoint is created.
      Issuer name: Specify the issuer name for the Service Bus namespace.
      Issuer secret: Specify the issuer secret for the Service Bus namespace.
      Relay path: Specify a name for the relay. For this tutorial, enter sapintegration01.
      Target sub-path: Enter a sub-path to make this target unique. For this tutorial, enter orders.

      The Target runtime URL read-only property displays the URL where the relay is deployed on Service Bus. This is the path where you could send a message to be inserted into the on-premises SAP Server. In our scenario, this is where the bridge sends the message.

      Click Next.

    6. On the Summary page, review the values you specified in the previous steps, and then click Create.
    7. When the wizard completes, click Finish.

      In Visual Studio Server Explorer, you now have an entry under the SAP node. This represents the relay endpoint created in Service Bus to relay PO messages coming from the cloud to the on-premises SAP system.

To add schemas

  1. After adding the relay endpoint for the SAP system, you must add the schemas required to send ORDERS05 PO messages to the SAP server. To add the schemas, right-click the relay endpoint and select Add schemas to SAPIntegration. In the dialog box, do the following:
    • Enter a filename prefix that will be included in the name of each schema file that is generated. For this tutorial, specify this as SAPIntegration_.
    • Enter a folder name that will be added to your solution under which all the schemas will be added. For this tutorial, specify the folder name as LOB Schemas.
    • Enter the credentials to connect to an SAP system.

    Add schemas to the project

    Click OK. The schemas are added to the project under the LOB Schemas folder.

To use the LOB Target

  1. Right-click anywhere on the BizTalk Service project design surface, select Properties and update the BizTalk Service URL property to include your BizTalk Services name. This is the name that you provided in Azure Management Portal while provisioning the BizTalk Services.
  2. Set the security property for the relay endpoint.
    1. Right-click the LOB Target in Server Explorer and select Properties.
    2. In the Properties grid, click the ellipsis (…) against the Runtime Security property.
    3. In the Edit Security dialog box, select Fixed Username and specify username and password to connect to the SAP Server.
    4. Click OK.
  3. Drag and drop the LOB Target onto the design surface. Note the Entity Name property of the LOB Target. The default value is Relay-Path_target-sub-path. If using the examples above, it will be sapintegration01_orders.
  4. Open the .config file for the LOB Target, which typically follows the naming convention YourRelayPath_target-sub-path.config. Specify the Service Bus issuer name and issuer secret, as shown below:

      <sharedSecret issuerName="owner" issuerSecret="issuer_secret" />

    Save changes to the config file.

 

Step 3: Transform the X12 850 PO Message to the ORDERS05 Message


Both the X12 850 schema and the ORDERS05 schema are fairly complex, and mapping between them requires functional expertise in the respective domains.

While you already generated the schema for the ORDERS05 IDOC, you can get the schema for the X12 PO message (X12_00401_850.xsd) from the MicrosoftEdiXSDTemplates.zip file that you downloaded and extracted earlier. You must add the X12_00401_850.xsd schema to the SAPIntegration project as well.

Creating a transform between the X12 850 PO and the ORDERS05 PO requires functional domain knowledge of both schemas; only then can one identify which field in the X12 schema maps to which field in the ORDERS05 schema. In this tutorial, we do not get into such details and instead use an existing transform (AzureTransformations.trfm) between these two schemas. This transform is available as part of the SAPIntegration sample that you can download from the MSDN Code Gallery.

To include the transform in the BizTalk Service project, right-click the project name, click Add, click Existing Items, and then navigate to the location where you downloaded the SAPIntegration sample from the MSDN Code Gallery. Select the AzureTransformations.trfm and then click Add.

Step 4: Create and Deploy the XML Bridge


In this topic, you create an XML One-Way Bridge that will act as a connector between the EDI Receive bridge and the relay endpoint for the ORDERS05 IDOC in SAP. After configuring the bridge, you connect it to the SAP relay endpoint, and then deploy the solution.

To configure the XML Bridge

  1. In the SAPIntegration project, from the Solution Explorer, double-click the MessageFlowItinerary.bcs file to open the bridge configuration surface.
  2. Right-click anywhere on the BizTalk Service project design surface, select Properties, and update the BizTalk Service URL property to include your BizTalk Services name. This is the name that you provided in Azure Management Portal while provisioning the BizTalk Services.
  3. From the Toolbox, drag and drop the XML One-Way Bridge component to the bridge design surface.
  4. Right-click the XML One-Way Bridge, select Properties, and change the value for Entity Name and Relative Address properties to B2BConnector. As a result, the complete endpoint URL where the bridge is deployed, which is shown in the Runtime Address property, will resemble https://<mybiztalkservicename>.biztalk.windows.net/default/B2BConnector. This is where the EDI Receive bridge sends the ORDERS05 PO message.
  5. Double-click the XML One-Way Bridge to open the Bridge Configuration design surface. Because this bridge only routes the message from the EDI Receive bridge to the relay endpoint, there is not much configuration required for each stage in the bridge other than specifying the message types of the messages that this bridge routes. To specify the message type, on the XML One-Way Bridge design surface, within the Message Types box, click the add icon to open the Message Type Picker dialog box.
  6. In the Message Type Picker dialog box, from the Available message types box, select the schema for the request message, click the right arrow icon, and then click OK. For this tutorial, select the Send schema (http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ORDERS05//700/Send). The selected schema should now be listed under the Request Message Type box.
  7. Save the bridge configuration.

To connect the bridge to the relay endpoint

  1. In the SAPIntegration project, from the Toolbox, select the Connection component, and connect the XML One-Way Bridge component with the SAP relay endpoint you already added in Step 2: Expose a Relay Endpoint to Invoke Operations on ORDERS05 IDOC.
  2. Set the filter condition on the connection. The routing condition for this scenario is to route all messages to the LOB Target. To do so, select the connecting line, and from the Properties grid, click the ellipsis (…) against the Filter Condition property, and then select Match All. This ensures that all messages that come to the bridge are routed to the relay endpoint.
  3. Set the Route Action property on the connection. Before you set the route action, it helps to understand why it is required. The message sent from the EDI Receive bridge to the relay endpoint must have the Action SOAP header set on it; this header defines what operation must be performed on the SAP system. The message that comes from the EDI receive pipeline does not have this header set. Hence, in this intermediary XML bridge, you set the route action on the message before it is sent to the relay endpoint. As part of the route action, you add the required header to the message (a short code illustration of the resulting SOAP header follows the completed bridge configuration below). Perform the following steps to set the route action.
    1. Find out the value that will be set for the Action SOAP header message. To do so, right-click the SAP relay endpoint from the Server explorer, and from the Properties grid, expand Operations, and copy the value. For this tutorial, the value is http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ORDERS05//700/Send.

      Value for SOAP action

    2. Go back to the bridge configuration surface, select the connection between the bridge and the SAP relay, and from the Properties grid, click the ellipsis (…) against the Route Action property. In the Route Actions dialog box, click Add to open the Add Route Action dialog box. In the Add Route Action dialog box, do the following:
      • Under Property (Read From) section, select Expression and specify the value that you copied earlier.
        ImportantImportant
        Make sure you specify the value for Expression within single quotes.
      • Under Destination (Write-To) section, set the Type to SOAP and the Identifier to Action.

        Set Route Action

      • Click OK in the Add Route Action dialog box to add the route action. Click OK in the Route Actions dialog box and then click Save to save the changes to the project.
  4. Save the project. The final bridge configuration resembles the following:

    Completed bridge configuration
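For illustration only (this is not part of the tutorial's configuration, which is done entirely in the Route Actions dialog), the following C# sketch shows the effect of the route action: the outgoing WCF message carries the SAP operation as its SOAP Action header. The payload file name is a placeholder.

using System;
using System.ServiceModel.Channels;
using System.Xml;

class RouteActionIllustration
{
    static void Main()
    {
        // Placeholder payload; in the tutorial the EDI Receive bridge produces the ORDERS05 XML.
        XmlReader ordersIdoc = XmlReader.Create("ORDERS05_sample.xml");

        // The route action stamps this SOAP Action header on the message; the SAP adapter
        // uses it to select the operation (the Send operation of ORDERS05.V3(700)).
        Message message = Message.CreateMessage(
            MessageVersion.Default,
            "http://Microsoft.LobServices.Sap/2007/03/Idoc/3/ORDERS05//700/Send",
            ordersIdoc);

        Console.WriteLine(message.Headers.Action);
    }
}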

To deploy the solution

  1. In Visual Studio, right click the SAPIntegration solution, and then click Build Solution.
  2. Once the build succeeds, right click the SAPIntegration solution, and then click Deploy Solution.
  3. In the deployment window, the Deployment Endpoint is a read-only property and the value is derived from the BizTalk Service URL/Namespace set in the message flow surface. However, you must provide the ACS Namespace for BizTalk Services, Issuer Name, and Shared Secret.
  4. Click Deploy. The Visual Studio Output pane displays the deployment progress and result. The URL where the bridge is deployed is also displayed in the Output pane. For this tutorial, the bridge is deployed at http://<mybiztalkservicename>.biztalk.windows.net/default/B2BConnector.

 

 

How To : Run SAP Applications on the Microsoft Platform

SAP on SQL General Update for Customers & Partners April 2014

SAP and Microsoft are continuously adding new features and functionalities to the SAP on SQL Server platform. The key objective of the SAP on Windows SQL port is to deliver the best performance and availability at the lowest TCO. This blog includes updates, fixes, enhancements and best practice recommendations collated over recent months.


Key topics in this blog: the SQL Server 2014 First Customer Shipment program is now available, Intel has released the new powerful E7 v2 processors, Windows Server 2012 R2 is now generally available, and customers are recommended to apply a SQL Server 2012 cumulative update to avoid a bug that can impact performance.

1.        SQL Server 2014 Release & Integration with SAP NetWeaver

SQL Server 2014 is now released and publicly available. SAP is proceeding with the certification and integration of SQL Server 2014 with NetWeaver applications and other applications such as Business Objects.

Customers planning to implement SQL Server 2014 are advised to target upgrading to the following support package stacks. Customers who wish to start projects on SQL Server 2014, or who require early production support on SQL Server 2014, should follow the instructions in Note 1966681 – Release planning for Microsoft SQL Server 2014.

Required SAP ABAP NetWeaver Support Package Stacks for SQL Server 2014

SAP Software – Support Package Stack (SPS)
SAP NetWeaver 7.0 – SPS 29; for SAP BW: SPS 30 (7.0 SPS 30 contains SAP_BASIS SP 30 and SAP_BW SP 32)
SAP EHP1 for SAP NetWeaver 7.0 – SPS 15; for SAP BW: SPS 15
SAP EHP2 for SAP NetWeaver 7.0 – SPS 14; for SAP BW: SPS 15
SAP EHP3 for SAP NetWeaver 7.0 – SPS 9
SAP NetWeaver 7.1 – SPS 17
SAP EHP1 for SAP NetWeaver 7.1 – SPS 12; for SAP BW: SPS 13
SAP NetWeaver 7.3 – SPS 10; for SAP BW: SPS 11
SAP EHP1 for SAP NetWeaver 7.3 – SPS 9; for SAP BW: SPS 11
SAP NetWeaver 7.4 – SPS 4; for SAP BW: SPS 06

Java-only systems in general do not require a dedicated support package stack for SQL Server 2014.

Note 1966701 – Setting up Microsoft SQL Server 2014

Note 1966681 – Release planning for Microsoft SQL Server 2014

New features in SQL Server 2014 are documented in a five-part blog series here

Also see key new SQL Server Engine Features and SQL Server Managed Backups to Azure. Free trial Azure accounts can be created for backup testing. To test latency to the nearest Azure Region use this tool http://azurespeedtest.azurewebsites.net/

2.        New Enhancements for Performance and Functionality for SAP BW Systems

SAP BW customers can reduce database size, improve query performance and improve load/process chain performance by implementing SAP Support Packs, Notes and SQL Server Column Store.

All SAP on BW customers are strongly recommended to review these blogs:

SQL Server Column-Store: Updated SAP BW code

Optimizing BW Query Performance

Increasing BW cube compression performance

SQL Server Column-Store with SAP BW Aggregates

Performance of index creation

It is recommended to upgrade SAP BW systems to SQL Server 2012 or SQL Server 2014 and implement Column Store. Customer experiences so far have shown dramatic space savings of 20-45% and very good performance improvements.

BW customers with poor cube compression performance are highly encouraged to implement the fixes in Increasing BW cube compression performance

3.        SAP Note 1612283 Updated – IvyBridge EX E7 v2 Processor Released

Intel has released a new processor for 4-socket and 8-socket servers. The Intel E7 v2 IvyBridge EX processors almost double the performance compared to the previous Westmere EX E7 processors.

Even the largest SAP systems in the world are able to run on standard low cost Intel based hardware. Intel based systems are in fact considerably more powerful than high cost proprietary UNIX systems such as IBM pSeries. In addition to performance enhancements many reliability features have been added onto modern servers and built into the Windows operating system to allow Windows customers to meet the same service levels on Wintel servers that previously required high cost proprietary hardware.

  1. 4-socket HP DL580, Intel E7 v2 = 133,570 SAPS or 24,450 users
  2. 4-socket IBM pSeries Power7+ = 68,380 SAPS or 12,528 users
  3. 8-socket Fujitsu, Intel E7 v2 = 259,680 SAPS or 47,500 users

Source: SAP SD 2 Tier Benchmarks http://global.sap.com/solutions/benchmark/sd2tier.epx   See end of blog for full disclosure

SAP Note 1612283 – Hardware Configuration Standards and Guidance has been updated to include recommendations for new Intel E5 v2, E7 v2 and AMD equivalent based systems. Customers are advised to ensure that new hardware is purchased with sufficient RAM as per the guidance in this SAP Note.

Most SAP systems exhibit a ratio of database SAPS to SAP application server SAPS of about 20% for the database and 80% for the SAP application servers. An 8-socket Intel server can deliver more than 200,000 SAPS; at a 20% database share, that corresponds to roughly 200,000 / 0.2 = 1,000,000 SAPS of total system capacity for a single SAP on SQL Server system. There are few if any single SAP systems in the world that exceed 1,000,000 SAPS, therefore these powerful platforms are recommended as consolidation platforms. This is explained in the attachment to SAP Note 1612283. Many SQL Server databases and SAP systems can be consolidated onto a single powerful Windows & SQL Server infrastructure.

4.        SQL Server 2012 Cumulative Update Recommended

SQL Server 2012 Service Pack 1 CU3, CU4, CU5 and CU6 contain a bug that can impact performance on SAP systems.

It is strongly recommended to update to SQL Server 2012 Service Pack 1 CU9 (the bug is also fixed in CU7 and CU8).

Microsoft will release a Service Pack 2 for SQL Server 2012 in the future.

SQL Service Packs and Cumulative Updates can be found here: http://blogs.msdn.com/b/sqlreleaseservices/

The bug is documented here: http://support.microsoft.com/kb/2895494

5.        Windows Server 2012 R2 – Shared Virtual Hard Disk for SAP ASCS Cluster

Windows Server 2012 R2 is now Generally Available for most NetWeaver applications and includes new features for Hyper-V Virtualization.

Feedback from customers has indicated that the vHBA feature offered in Windows 2012 requires that the OEM device drivers and firmware on the HBA card be up to date.

If the device drivers and firmware are not up to date, the vHBA can hang.

The SAP Central Services Highly Available cluster requires a shared disk to hold the /SAPMNT share. Therefore a shared disk is required inside a Hyper-V Virtual Machine.

There are now three options:

  1. iSCSI – generally only for small customers
  2. vHBA – suitable for customers of any size, but drivers and firmware must be up to date
  3. Shared Virtual Hard Disk – available in Windows Server 2012 R2; simple to set up and configure (RECOMMENDED). This feature is generally recommended for customers wanting to create guest clusters on Hyper-V to protect the SAP Central Services.

    SQL Server can also utilize a Shared Virtual Hard Disk; however, we generally recommend using SQL Server 2012 AlwaysOn to provide high availability.

    Aidan Finn provides a useful blog on configuring Shared Virtual Hard Disk

    For more information about SAP on Hyper-V see this blog series How to Deploy SAP on Microsoft Private Cloud with Hyper-V 3.0 and SAP Note 1409608 – Virtualization on Windows

    6.        Important Notes for SAP BW Migration Customers

    Customers migrating SAP BW systems using R3load must pay particular attention to the SAP System Copy Notes and the supplementary SAP Note 888210 – NW 7.**: System copy (supplementary note)

    SAP BW and some other SAP components have special properties on some tables. These special properties are defined in DBMS specific definition files generated by SMIGR_CREATE_DDL.

    Prior to exporting an SAP BW system, SMIGR_CREATE_DDL must be run. There are some important updates for the program SMIGR_CREATE_DDL that must be applied in the source system before the export. SAP Note 888210 lists all required notes. BW systems running very old support packages must be checked very carefully, and possibly other notes should be applied. The following notes should be implemented:

    1901705 – Long import runtime with certain tables on MSSQL

    1747330 – Missing data base indexes after system copy to MSSQL

    1993315 – SMIGR_CREATE_DDL: double columns in create index statements

    1771177 – SQL Server 2012 column-store support for SAP BW

    Customers migrating to SQL Server should review this blog: SAP OS/DB Migration to SQL Server–FAQ v5.2 April 2014

    7.        SQL Server AlwaysOn – Parameter AlwaysOn HealthCheckTimeout & LeaseTimeout. What are these values?

    SQL Server AlwaysOn leverages Windows Server Failover Cluster (WSFC) technology to determine resource health, quorum and control the status of a SQL Server Availability Group. The WSFC resource DLL of the availability group performs a health check of the primary replica by calling the sp_server_diagnostics stored procedure on the instance of SQL Server that hosts the primary replica. sp_server_diagnostics returns results at an interval that equals 1/3 of the health-check timeout threshold for the availability group. The default health-check timeout threshold is 30 seconds, which causes sp_server_diagnostics to return at a 10-second interval. If sp_server_diagnostics is slow or is not returning information, the resource DLL will wait for the full interval of the health-check timeout threshold before determining that the primary replica is unresponsive. If the primary replica is unresponsive or if the sp_server_diagnostics returns a failure level equal to or in excess of the configured failure level, an automatic failover is initiated.

    In addition to the above there is a further layer of logic to prevent another scenario:

    1. The SQL Server primary replica becomes extremely busy – so busy that the operating system or SQL Server is saturated and cannot reply to the WSFC resource DLL within the configured period of time (default 30 seconds).
    2. Windows Failover Cluster tries to stop SQL Server on the busy node, but is unable to communicate because the server is so busy. WSFC will assume the node has become isolated from the network, start the failover process, and start SQL Server on another node. SQL Server is now running on another host.
    3. Eventually the condition causing the original host to be extremely busy ends, and client connections might continue to be processed on the first node – a very bad thing, because we now have two “Primaries”.
    4. To prevent this, the primary node must connect to the WSFC resource DLL and obtain a lease periodically. This is controlled by the parameter LeaseTimeout. The primary AlwaysOn node must renew this lease, otherwise the primary will take the database offline.

    Therefore there are two important parameters – HealthCheckTimeout and LeaseTimeout.

    Some customers have encountered problems with unexplained AlwaysOn failovers during activities such as initializing new Log Shipping or AlwaysOn Nodes via network or network backup while SQL Server is very busy. It is strongly recommended to use good quality 10G network cards, run Windows 2012 or Windows 2012 R2 and avoid using 3rd party network teaming utilities like HP Network Configuration Utility (NCU). In rare cases increasing both of these parameters may be needed.
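    If an increase is required, the health-check timeout can be changed with a single T-SQL statement run against the primary replica. The following minimal C# sketch (server name and availability group name are placeholders) raises it from the default 30,000 ms to 60,000 ms; the LeaseTimeout is a property of the WSFC cluster resource and is changed separately, for example through Failover Cluster Manager:

    using System.Data.SqlClient;

    class AlwaysOnHealthCheckTimeoutSample
    {
        static void Main()
        {
            // Placeholders: adjust the server and availability group names for your landscape,
            // and run against the instance hosting the primary replica.
            using (var connection = new SqlConnection("Server=PRIMARYNODE;Integrated Security=SSPI"))
            {
                connection.Open();
                using (var command = new SqlCommand(
                    "ALTER AVAILABILITY GROUP [SAP_AG] SET (HEALTH_CHECK_TIMEOUT = 60000);",
                    connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }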

    Additional Blog: SQL Server 2012 AlwaysOn – Part 7 – Details behind an AlwaysOn Availability Group

     

    8.        FusionIO Format Settings

    FusionIO or other in server SSD devices are now very common and are strongly recommended for customers that require high performance SQL Server infrastructure. The use of FusionIO and SSD is further recommended and detailed in Note 1612283 – Hardware Configuration Standards and Guidance and Infrastructure Recommendations for SAP on SQL Server: “in-memory” Performance.

    FusionIO devices are usually used for holding SQL Server transaction logs and tempdb. If the transaction log and tempdb datafiles and log files are placed on FusionIO we recommend:

    1. Format the FusionIO card for maximum WRITE. This will reduce the usable space significantly.
    2. The FusionIO physical-level format should be 4k (not 8k – it is a proprietary format size).
    3. Make sure the server BIOS is set for maximum fan blow-out – FusionIO will slow down if it becomes hot (the device will throttle IO if the temperature increases).
    4. Update the FusionIO firmware and the server BIOS to the latest versions.
    5. Format the Windows disks with an NTFS Allocation Unit Size of 64k.

    9.        VHDX/VHD Format Settings

    Windows NTFS File System Allocation Unit Size defaults to 4096 bytes. Smaller Allocation Unit Sizes (AUS) are more efficient for storing many small files; larger AUS values such as 64k are more efficient for larger files.

    The file system holding the VHDX files for SQL Server virtual machines running on Hyper-V may benefit from a 64 kilobyte NTFS AUS.

    The NTFS AUS of the file system inside the VHDX file must be 64 kilobytes.

    The AUS size can be checked with the command below:

    fsutil fsinfo ntfsinfo <Drive letter>:
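    If you prefer to check the cluster size programmatically rather than parsing the fsutil output, the following minimal C# sketch uses the Win32 GetDiskFreeSpace API (the drive letter is a placeholder):

    using System;
    using System.Runtime.InteropServices;

    class AllocationUnitSizeCheck
    {
        [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
        static extern bool GetDiskFreeSpace(
            string lpRootPathName,
            out uint lpSectorsPerCluster,
            out uint lpBytesPerSector,
            out uint lpNumberOfFreeClusters,
            out uint lpTotalNumberOfClusters);

        static void Main()
        {
            uint sectorsPerCluster, bytesPerSector, freeClusters, totalClusters;

            // "D:\" is a placeholder for the volume holding the VHDX files, or the
            // data/log volume inside the guest virtual machine.
            if (GetDiskFreeSpace(@"D:\", out sectorsPerCluster, out bytesPerSector,
                                 out freeClusters, out totalClusters))
            {
                // The NTFS allocation unit size is sectors-per-cluster times bytes-per-sector.
                Console.WriteLine("Allocation Unit Size: {0} bytes",
                    sectorsPerCluster * bytesPerSector);
            }
        }
    }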

    10.        Do Not Install SAPGUI on SAP Servers

    Windows servers are able to run many desktop PC applications such as SAPGUI and Internet Explorer; however, it is strongly recommended not to install this software on SAP servers, particularly production servers.

    1. To improve the reliability of an operating system it is recommended to install as few software packages as possible. This will not only improve reliability and performance, but will also make debugging any issues considerably simpler.
    2. SAPGUI is, in practice, almost impossible to remove completely. The SAPGUI installation places DLLs in the Windows directory.
    3. “A server is a server, a PC is a PC.” Customers are encouraged to restrict access to production servers by implementing a server hardening procedure. SAP servers should not be used as administration consoles, and there should be no need to directly connect to a server; almost all administration can be done remotely.

     

    Links

    How It Works: SQL Server AlwaysOn Lease Timeout

    http://blogs.msdn.com/b/psssql/archive/2012/09/07/how-it-works-sql-server-alwayson-lease-timeout.aspx

    Flexible Failover Policy for Automatic Failover of an Availability Group (SQL Server)

    http://technet.microsoft.com/en-us/library/hh710061.aspx

    Configure HealthCheckTimeout Property Settings

    http://technet.microsoft.com/en-us/library/ff878665.aspx

    Features from SharePoint 2010 Integration with SAP BusinessObjects BI 4.0

    One of the core concepts of Business Connectivity Services (BCS) for SharePoint 2010 is the external content type. External content types are reusable metadata descriptions of connectivity information and behaviours (stereotyped operations) applied to external data. SharePoint offers developers several ways to create external content types and integrate them into the platform.

     

    The SharePoint Designer 2010, for instance, allows you to create and manage external content types that are stored in supported external systems. Such an external system could be SQL Server, WCF Data Service, or a .NET Assembly Connector.

    This article shows you how to create an external content type for SharePoint named Customer based on given SAP customer data. The definition of the content type will be provided as a .NET assembly, and the data are displayed in an external list in SharePoint.

    The SAP customer data are retrieved from the function module SD_RFC_CUSTOMER_GET. In general, function modules in an SAP R/3 system are comparable to public static C# class methods, and can be accessed from outside of SAP via RFC (Remote Function Call). Fortunately, we do not need to program RFC calls manually. We will use the very handy ERPConnect library from Theobald Software. The library includes a LINQ to SAP provider and designer that makes our lives easier.

    .NET Assembly Connector for SAP

    The first step in providing a custom connector for SAP is to create a SharePoint project with the SharePoint 2010 Developer Tools for Visual Studio 2010. Those tools are part of Visual Studio 2010. We will use the Business Data Connectivity Model project template to create our project:

    After defining the Visual Studio solution name and clicking the OK button, the project wizard will ask what kind of SharePoint 2010 solution you want to create. The solution must be deployed as a farm solution, not as a sandboxed solution. Visual Studio then creates a new SharePoint project with a default BDC model (BdcModel1). You can also create an empty SharePoint project and add a Business Data Connectivity Model project item manually afterwards. This will also generate a new node in the Visual Studio Solution Explorer called BdcModel1. The node contains a couple of project files: the BDC model file (file extension .bdcm), and the Entity1.cs and EntityService.cs class files.

    Next, we add a LINQ to SAP file to handle the SAP data access logic by selecting the LINQ to ERP item from the Add New Item dialog in Visual Studio. This will add a file called LINQtoERP1.erp to our project. The LINQ to SAP provider is internally called LINQ to ERP. Double click LINQtoERP1.erp to open the designer. Now, drag the Function object from the designer toolbox onto the design surface. This will open the SAP connection dialog since no connection data has been defined so far:

    Enter the SAP connection data and your credentials. Click the Test Connection button to test the connectivity. If you could successfully connect to your SAP system, click the OK button to open the function module search dialog. Now search for SD_RFC_CUSTOMER_GET, then select the found item, and click OK to open the RFC Function Module /BAPI dialog:

    [Screenshot: RFC Function Module / BAPI dialog]

    The dialog provides you the option to define the method name and parameters you want to use in your SAP context class. The context class is automatically generated by the LINQ to SAP designer including all SAP objects defined. Those objects are either C# (or VB.NET) class methods and/or additional object classes used by the methods.

    For our project, we need to select the export parameters KUNNR and NAME1 by clicking the checkboxes in the Pass column. These two parameters become our input parameters in the generated context class method named SD_RFC_CUSTOMER_GET. We also need to return the customer list for the given input selection. Therefore, we select the table parameter CUSTOMER_T on the Tables tab and change the structure name to Customer. Then, click the OK button on the dialog, and the new objects get added to the designer surface.

    IMPORTANT: The flag “Create Objects Outside Of Context Class” must be set to TRUE in the property editor of the LINQ designer; otherwise, LINQ to SAP generates the Customer class as a nested class of the SAP context class. This feature and flag are only available in LINQ to SAP for Visual Studio 2010.

    The LINQ designer has also automatically generated a class called Customer within the LINQtoERP1.Designer.cs file. This class will become our BDC model entity or external content type. But first, we need to adjust and rename our BDC model that was created by default from Visual Studio. Currently, the BDC model looks like this:

    Rename the BdcModel1 node and file into CustomerModel. Since we already have an entity class (Customer), delete the file Entity1.cs and rename the EntityService.cs file to CustomerService.cs. Next, open the CustomerModel file and rename the designer object Entity1. Then, change the entity identifier name from Identifier1 to KUNNR. You can also use the BDC Explorer for renaming. The final adjustment result should look as follows:

    [Screenshot: BDC model in the BDC Explorer after renaming]

    The last step we need to do in our Visual Studio project is to change the code in the CustomerService class. The BDC model methods ReadItem and ReadList must be implemented using the automatically generated LINQ to SAP code. First of all, take a look at the code:

    [Screenshot: CustomerService class implementation]

    As you can see, we basically have just a few lines of code. All of the SAP data access logic is encapsulated within the SAP context class (see the LINQtoERP1.Designer.cs file). The CustomerService class just implements a static constructor to set the ERPConnect license key and to initialize the static variable _sc with the SAP credentials as well as the two BDC model methods.

    The ReadItem method, BCS stereotyped operation SpecificFinder, is called by BCS to fetch a specific item defined by the identifier KUNNR. In this case, we just call the SD_RFC_CUSTOMER_GET context method with the passed identifier (variable id) and return the first customer object we get from SAP.

    The ReadList method, BCS stereotyped operation Finder, is called by BCS to return all entities. In this case, we just return all customer objects the SD_RFC_CUSTOMER_GET context method returns. The returned result is already of type IEnumerable<Customer>.
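    The original post shows this code only as a screenshot. A hedged reconstruction, consistent with the description above, might look like the following; the name of the generated context class, its constructor arguments, the ERPConnect license call and the exact signature of the generated SD_RFC_CUSTOMER_GET method are assumptions based on the LINQ to SAP designer output and may differ in your project:

    using System.Collections.Generic;
    using System.Linq;
    using ERPConnect;                  // Theobald Software ERPConnect (namespace assumed)

    public class CustomerService
    {
        // Generated SAP context class from LINQtoERP1.Designer.cs (class name assumed).
        private static readonly LINQtoERP1 _sc;

        static CustomerService()
        {
            // ERPConnect license key and SAP credentials are placeholders.
            LIC.SetLic("YOUR-ERPCONNECT-LICENSE-KEY");
            _sc = new LINQtoERP1("sapserver", "00", "100", "user", "password");
        }

        // SpecificFinder: return the single customer identified by KUNNR.
        public static Customer ReadItem(string id)
        {
            return _sc.SD_RFC_CUSTOMER_GET(id, "*").FirstOrDefault();
        }

        // Finder: return all customers the function module yields.
        public static IEnumerable<Customer> ReadList()
        {
            return _sc.SD_RFC_CUSTOMER_GET("", "*");
        }
    }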

    The final step is to deploy the SharePoint solution. Right-click on the project node in Visual Studio Solution Explorer and select Deploy. This will install and deploy the SharePoint solution on the server. You can also debug your code by just setting a breakpoint in the CustomerService class and executing the project with F5.

    That’s all we have to do!

    Now, start the SharePoint Central Administration panel and follow the link “Manage Service Applications”, or navigate directly to the URL http://<SERVERNAME>/_admin/ServiceApplications.aspx. Click on Business Data Connectivity Service to show all the available external content types:

    On this page, we find our deployed BDC model including the Customer entity. You can click on the name to retrieve more details about the entity. Right now, there is just one issue open. We need to set permissions!

    Mark the checkbox for our entity and click on Set Object Permissions in the Ribbon menu bar. Now, define the permissions for the users you want to allow to access the entity, and click the OK button. In the screen shown above, the user administrator has all the permissions possible.

    In the next and final step, we will create an external list based on our entity. To do this, we open SharePoint Designer 2010 and connect to the SharePoint website.

    Click on External Content Types in the Site Objects panel to display all the content types (see above). Double click on the Customer entity to open the details. The SharePoint Designer is reading all the information available by BCS.

    In order to create an external list for our entity, click on Create Lists & Form on the Ribbon menu bar (see screenshot below) and enter CustomerList as the name for the external list.

    OK, now we are done!

    Open the list, and you should get the following result:

    The external list shows all the fields defined for our entity, even though our Customer class, automatically generated by LINQ to SAP, has more than those four fields. This means you can display just a subset of the information for your entity.

    Another option is to just select those fields required within the LINQ to SAP designer. With the LINQ designer, you can access not just the SAP function modules. You can integrate other SAP objects, like tables, BW cubes, SAP Query, or IDOCs. A demo version of the ERPConnect library can be downloaded from the Theobald Software homepage.

    If you click the associated link of one of the customer numbers in the column KUNNR (see screenshot above), SharePoint will open the details view:

    [Screenshot: Customer details view]

     

     

    How To : A library to create .mht files (available at request)

    There are a number of ways to do this, including hosting Word or Excel on the web server and dealing with COM Interop issues, or purchasing third-party MIME encoding libraries, some of which sell for $250.00 or more. But there is no native .NET solution. So, being the curious soul that I am, I decided to investigate a bit and see what I could come up with. Internet Explorer offers a File / Save As option to save a web page as “Web Archive, single file (*.mht)”.


    What this does is create an RFC-compliant multipart MIME message. Resources such as images are serialized to their Base64 inline encoding representations, and each resource is demarcated with the standard multipart MIME header breaks. Internet Explorer, Word, Excel and most newsreader programs all understand this format. The format, if saved with the file extension “.eml”, will come up as a web page inside Outlook Express; if saved with “.mht”, it will come up in Internet Explorer when the file is double-clicked out of Windows Explorer; and – what many do not know – if saved with a “.doc” extension, it will load in MS Word, each with all the images intact, and in the case of the EML and MHT formats, with all of the hyperlinks fully functioning. The primary advantage of the format is, of course, that all the resources can be consolidated into a single file, making distribution and archiving much easier – including database storage in an NVarchar or NText type field.

     

    System.Web.Mail, which .NET provides as a convenient wrapper around the CDO for Windows COM library, offers only a subset of the functionality exposed by the CDO library, and multipart MIME encoding is not a part of that functionality. However, through the wonders of COM Interop, we can create our own COM reference to CDO in the Visual Studio IDE, allowing it to generate a Runtime Callable Wrapper, and help ourselves to the entire rich set of functionality of CDO as we see fit.

     

    One method in the CDO library that immediately came to my notice was the CreateMHTMLBody method. That’s MHTMLBody, meaning “Multipurpose Internet Mail Extension HTML (MHTML) Body”. Well! When I saw that, my eyes lit up like the LEDs on a 32-way Unisys box! This is a method on the CDO Message class; the method accepts a URI to the requested resource, along with some enumerations, and creates a multipart MIME-encoded email message out of the requested URI responses – including images, css and script – in one fell swoop.

     

    “Ah”, you say, “How convenient”! Yes, and not only that, but we also get a free “multipart COM Interop Baggage” reference to the ADODB.Stream object – and by simply calling the GetStream method on the Message Class, and then using the Stream’s SaveToFile method, we can grab any resource including images, javascript, css and everything else (except video) and save it to a single MHT Web Archive file just as if we chose the “Save As” option out of Internet Explorer.

     

    If we choose not to save the file, but instead want to get back the stream contents, no problem. We just call Stream.ReadText(Stream.Size) and it returns a string containing the entire MHT-encoded content. At that point we can do whatever we want with it – set a content header and Response.Write the content to the browser, for instance – or whatever.

     

    For example, when we get back our “MHT” string, we can write the following code:

    Response.ContentType = "application/msword";
    Response.AddHeader("Content-Disposition", "attachment;filename=NAME.doc");
    Response.Write(myDataString);

     

    — and the browser will dutifully offer to save the file as a Word Document. It will still be Multipart MIME encoded, but the .doc extension on the filename allows Word to load it, and Word is smart enough to be able to parse and render the file very nicely. “Ah”, you are saying, “this is nice, and so is the price!”. Yup!

    And, if you are serving this MIME-encoded file from out of your database, for example, and you would like it to be able to be displayed in the browser, just change the “NAME.doc” to “NAME.MHT”, and don’t set a content-type header. Internet Explorer will prompt the user to either save or open the file. If they choose “open”, it will be saved to the IE Temporary files and open up in the browser just as if they had loaded it from their local file system.

     

    So, to answer a couple of questions that came up recently, yes — you can use this method to MHTML – encode any web page – even one that is dynamically generated as with a report — provided it has a URL, and save the MIME-encoded content as a string in either an NVarchar or NText column in your database. You can then bring this string back out and send it to the browser, images,css, javascript and all.

    Now here is the code for a small, very basic “Converter” class I’ve written to take advantage of the two scenarios specified above. Bear in mind, there is much more available in CDO, but I leave this wondrous trail of ecstatic discovery to your whims of fancy:

    using System;
    using System.Web;
    using CDO;
    using ADODB;
    using System.Text;
    namespace PAB.Web.Utils
    {
     public class MIMEConverter
     {
      //private ctor as our methods are all static here
      private MIMEConverter()
      {
       
      }   
      public static bool SaveWebPageToMHTFile( string url, string filePath)
      {
       bool result=false;
       CDO.Message  msg = new CDO.MessageClass(); 
       ADODB.Stream  stm=null ;
       try
       {
        msg.MimeFormatted =true;   
        msg.CreateMHTMLBody(url,CDO.CdoMHTMLFlags.cdoSuppressNone, "" ,"" );
        stm = msg.GetStream();
        stm.SaveToFile(filePath, ADODB.SaveOptionsEnum.adSaveCreateOverWrite);
        msg = null;
        stm.Close();
        result = true;
       }
       catch
       {
        throw;
       }
       finally
       {
        //cleanup here
       }
       return result;
      }

      public static string ConvertWebPageToMHTString( string url )
      {
       string data = String.Empty;
       CDO.Message msg = new CDO.MessageClass();
       ADODB.Stream stm = null;
       try
       {
        msg.MimeFormatted = true;
        msg.CreateMHTMLBody(url, CDO.CdoMHTMLFlags.cdoSuppressNone, "", "");
        stm = msg.GetStream();
        data = stm.ReadText(stm.Size);
       }
       catch
       {
        throw;
       }
       finally
       {
        //cleanup here
       }
       return data;
      }
     }
    }
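
    For reference, a minimal usage sketch (the URL and file path are placeholders):

    bool saved = PAB.Web.Utils.MIMEConverter.SaveWebPageToMHTFile(
        "http://localhost/SomePage.aspx", @"C:\Temp\SomePage.mht");

    string mht = PAB.Web.Utils.MIMEConverter.ConvertWebPageToMHTString(
        "http://localhost/SomePage.aspx");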

     

    NOTE: When using this type of COM Interop from an ASP.NET web page, it is important to remember that you must set AspCompat="true" in the Page directive or you will be very disappointed with the results! This forces the ASP.NET page to run in the STA threading model, which permits “classic ASP” style COM calls. There is, of course, a significant performance penalty incurred, but realistically, this type of operation would only be performed upon user request and not on every page request.
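    For example, a minimal page directive might look like this (other attributes, such as CodeBehind, depend on your page):

    <%@ Page Language="C#" AspCompat="true" %>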

    The downloadable zip file below contains the entire class library and a web solution that will exercise both methods when you fill in a valid URI with protocol, and a valid file path and filename for saving on the server. Unzip this to a folder that you have named “ConvertToMHT” and then mark the folder as an IIS Application so that a request such as “http://localhost/ConvertToMHT/WebForm1.aspx” will function correctly. You can then load the Solution file and it should work “out of the box”. And, don’t forget – if you have an ASP.NET web application that wants to write a file to the file system on the server, it must be running under an identity that has been granted this permission.

    In Depth Look : Private Cloud Infrastructure as a Service Capabilities


     

    The primary purpose of a Private Cloud Infrastructure as a Service capability is to provide well managed infrastructure services to the Platform and Software Layers. To achieve this, the Infrastructure Layer, highlighted in the Private Cloud Reference Model diagram below, includes five capabilities.


    Figure 1: Private Cloud Reference Model

    This document describes these Infrastructure Layer capabilities and the impact of Private Cloud Infrastructure as a Service (IaaS) patterns on their planning and design. These patterns are defined in the Private Cloud Principles, Concepts, and Patterns document and are summarized here:

    • Resource Pooling: Divides resources into partitions for management purposes.
    • Physical Fault Domain: The group of physical resources dependent on a single point of failure such as an Uninterruptible Power Supply (UPS).
    • Upgrade Domain: A group of resources upgraded as a single unit.
    • Reserve Capacity: Unallocated resources, which take over service in the event of a failed Physical Fault Domain.
    • Scale Unit: A collection of resources treated as a single unit of additional capacity.
    • Capacity Plan: A model that enables a private cloud to deliver the perception of infinite capacity.
    • Health Model: Defines how a service or system may remain healthy.
    • Service Class: Defines services delivered by Infrastructure as a Service.
    • Cost Model: The financial breakdown of a private cloud and its services.

    The Health Model, Service Class, and Scale Unit patterns directly affect Infrastructure and are detailed in the relevant sections later. Conversely, private cloud infrastructure design directly affects Physical Fault Domains, Upgrade Domains, and the Cost Model. These relationships are shown in Figure 2 below.


    Figure 2: Infrastructure Relationship with Patterns

    Background

    The private cloud principles “perception of continuous availability” and “resiliency over redundancy mindset” are designed to make a private cloud architect think differently.

    Traditional solutions rely heavily on redundancy to achieve high availability and avoid failure. But redundancy at the facility (power) and infrastructure (network, server, and storage) layers is very costly. Modern cloud applications are designed with a different, holistic approach to achieving availability. This means shifting focus from building redundancy into the facility and physical infrastructure to engineering the entire solution to handle failures — eliminating them, or at least minimizing their impact.

    This approach to availability relies on resilience as well as redundancy. Resilience means rapid, and ideally automatic, recovery from a failure. Redundancy is typically achieved at the application level. (A non-cloud example is Active Directory®, where redundancy is achieved by providing more domain controllers than are needed to handle the load.)

    Customer interest in cost reduction will help drive adoption of this approach over the medium term. Removing power redundancy from racks or co-location rooms has a big impact on operational expenses, but this typically occurs only when the hosted application doesn’t have to be highly available, or when high availability is achieved through redundancy at the application layer – for example, Active Directory replication, or application layer mirroring such as SQL Server™ mirroring. Combining reductions in physical redundancy with virtualization results in lower capital and operational expenditure compared to a highly redundant infrastructure.

    Applications that depend on a highly available infrastructure will not achieve their Service-Level Agreement (SLA) when placed on the type of infrastructure defined earlier. Customers are therefore likely to develop two environments when designing their private cloud: a standard environment with reduced facility and infrastructure redundancy, and a high-availability environment with traditional levels of redundancy.

    Standard Environment vs. High-Availability Environment:

    • Power: no power redundancy to the rack (for example, one in-rack UPS) vs. redundant power to each server
    • Network: no network redundancy to the servers (redundant core network) vs. redundant network connections to each server
    • Storage: local storage, possibly redundant storage and storage network vs. redundant storage presented to each server
    • Migration: ideally no migration, or possibly quick migration vs. Live Migration

    These two environments allow an architect to differentiate service classifications from a high-availability perspective. The standard environment is appropriate for stateless workloads; stateful workloads will require the high-availability environment. Stateful and stateless machines are managed differently. Statefulness will likely appear as a characteristic of the service classifications.

    Stateless workloads (web servers, for example) are typically redundant at the server level via a load-balanced farm. These servers could easily be hosted in the Standard Environment. If all stateless workloads had an automated build, the Standard Environment could do away with any form of VM migration – and simply deploy another VM after destroying the existing one, thereby saving the cost of shared storage.

    Stateful workloads, on the other hand, require a specific management approach and impose higher costs on the consumer. Unless designed for high availability at the application level, they will require some form of redundancy in the infrastructure. Further, the High-Availability Environment requires Live Migration to enable maintenance of the underlying fabric and load balancing of the VMs.

    Security

    The number one concern of customers considering moving services to the cloud is security. Recent concerns expressed in industry forums are all well-founded and present reasons to think through the end-to-end scenarios and attack surfaces presented when deploying multiple services from various departments in an organization on a private cloud.

    In a cloud-based platform, regardless of whether it is a private or public cloud, customers will be working in an essentially virtualized environment. The platform or software will run on top of a shared physical infrastructure managed internally or by the service provider. The security architecture used by the applications will need to move up from the infrastructure to the platform and application layers. In a private cloud, this provides security in addition to that of the perimeter network.

    Public cloud involves handing over control to a third party, sharing services with unrelated business entities or even competitors, and requires a high degree of trust in the provider's security model and practices. In many ways the security concerns of a private cloud are similar to those of a self-hosted or outsourced datacenter; however, the move to the virtualized, self-service, service-oriented paradigm inherent in private cloud computing introduces some additional security concerns.

    First is the isolation of tenants from each other and the hosting infrastructure at both the compute and network layers. Virtualization is a part of any private cloud strategy and the security of this model is totally dependent on the ability to isolate one tenant from another and prevent the careless or malicious tenant from impacting the stability of the core infrastructure upon which all tenants rely.

    Another concern is Authentication, Authorization and Auditing of access to the cloud services. Self-service implies that tenant administrators can initiate management processes and workflows where previously this was accomplished through IT. Any misconfiguration or excessive permissions granted to these users can impact the stability or security of the cloud solution.

    Many private cloud security concerns are also shared by the traditional datacenter environment, which is not surprising since the private cloud is just an evolution of the traditional datacenter model. These include:

    • Impact to confidentiality, integrity or availability through exploitation of software vulnerabilities.
    • Unauthorized access due to weak or misconfigured access controls.
    • Impact to confidentiality, integrity or availability by malicious code.
    • Impact to confidentiality, integrity or availability of data.
    • Compliance with internal or industry-specific regulations and standards.

    Secure Virtualization Platform

    The biggest risk in running in a multi-tenant virtualized environment is that a tenant running services on the same physical infrastructure as you can break out of its isolating partition and impact the confidentiality, integrity or availability of your workload and data. Therefore the security of the virtualization platform is key to the isolation of, and non-interference between, the individual virtual machines running on the infrastructure.

    Highly Automated Management, Monitoring and Reporting

    Many management tasks involve multiple steps that must be completed in the proper sequence by multiple administrators across multiple systems. Any shortcuts, omissions or errors can leave assets vulnerable to unauthorized access or affect the reliability of components within a solution. Orchestrating discrete management and monitoring tasks into workflows that require proper authorization and approval greatly diminishes the chance of mistakes that affect the security of the solution.

    Authentication, Authorization and Auditing

    Most organizations have a common capability that provides an overarching framework for authentication and access control. A private cloud spans both the hosting infrastructure and the virtual machine workloads that run in that infrastructure. This framework must be designed, and possibly extended, to provide a single point for managing identities and credentials, authentication services and a common security model for access to resources across the private cloud.

    Multi-layer Security

    Moving to a cloud-based platform requires a change in mind-set of developers and IT security professionals. Some of the risks of the public cloud are mitigated by using a private cloud architecture; however, the perimeter security protecting a private cloud should be seen as an addition to public cloud security practices, not an alternative. You cannot apply the traditional defense-in-depth security models directly to cloud computing, but you should still apply the principle of multiple layers of security. By taking a fresh look at security when you move to a cloud-based model, you should aim for a more secure system rather than simply carrying current security levels forward.

    Security Governance

    Enterprise IT systems are now typically well regulated and controlled. The security risks are well documented, and therefore proper processes are put in place to develop new applications and systems, or to provision them from third-party vendors. It is very unlikely that a department manager would be able to purchase and install software without approval from the IT department.

    With public cloud systems and Web browser clients however, it is possible that individual department managers could bypass the IT department and provision public cloud-based software. Indeed, they might use free cloud storage systems as a convenient means to synchronize documents without even considering that they are using public cloud services. Public cloud systems might be appealing to a manager as they could very quickly provision a new system and remove what they might see as unnecessary bureaucracy. They may even be unaware of the security and compliance policies that are in place to protect the organization. In a cloud-based landscape, we must protect corporate systems and data from these unauthorized, untested systems.

    Facilities

    Facilities represent the physical components – buildings, racks, power, cooling, and physical interconnects – that house or support a private cloud. It is beyond the scope of this document to provide detailed guidance on facilities, but the private cloud principles affect facility design.

    The definition of a Scale Unit impacts power, cooling, space, racking, and cabling requirements. The team that defines a Scale Unit should include personnel that design and manage these aspects of the facility in addition to the procurement, Capacity Planning, and Service Delivery teams. The following table lists some ramifications of Scale Unit size choices from a facilities perspective.

    Small Scale Unit

    • Benefit: Lower amount of physical labor needed to add a Scale Unit.
    • Trade-off: Complicates the Resource Pool, Fault Domain, and Reserve Capacity equation; inefficient – stranded (un-utilized) power and un-utilized space.

    Large Scale Unit

    • Benefit: Allocation of full facilities units (for example, UPS, rack, and co-location room) is easy to cost and engineer; reduces under-utilization of power, cooling, and space.
    • Trade-off: Higher amount of labor to commission.

    Knowing how much power, cooling, and space each Scale Unit will consume enables the facilities team to perform effective Capacity Planning and the engineering team to effectively plan resources.

    Compute, Network, and Storage Fabric

    The term Fabric defines a collection of interconnected compute, network, and storage resources.

    The concept of homogenous physical infrastructure, introduced in the Private Cloud Principles, Concepts, and Patterns guide, stipulates that all servers in a Resource Pool should be identical. Homogenizing the compute, storage, and network components in servers allows for predictable scale and performance. In other words, every server in a Resource Pool should have the same processor characteristics such as family (Intel/AMD), number of cores/CPUs, and generation (Xeon 2.6 Gigahertz (GHz)). The homogenized compute concept also stipulates that each server have the same amount of Random Access Memory (RAM) and the same number of connections to Resource Pool storage and networks. With these specifications met, any virtualized service could relocate from one failing or failed physical server to another physical server and continue to function identically.

    Physical Server

    The physical server hosts the hypervisor and provides access to the network and shared storage. In the Standard Environment, the facilities do not provide power redundancy, so the servers do not require dual power supplies.

    Every server will be a member of a single compute Resource Pool and a single Physical Fault Domain. Assuming all servers are homogeneous (as recommended), they will all be members of a single Upgrade Domain.

    Capacity Planning must be done for each server specification, as its size (CPU and RAM specification) will determine how many virtual machines it is able to host. This is covered in greater detail in the Private Cloud Planning Guide for Service Delivery.
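
    As a minimal sketch of that calculation (the overhead and oversubscription figures are illustrative assumptions, not recommendations), the number of VMs a host can support is bounded by both RAM and logical processors once the parent partition's overhead is subtracted:

    using System;

    // Hypothetical sketch: VMs-per-host derived from the server specification.
    static class HostCapacity
    {
        public static int MaxVmsPerHost(
            int hostRamGb, int hostLogicalProcessors,
            int parentPartitionRamGb,        // RAM reserved for the parent partition (assumption)
            int vmRamGb, int vmVirtualCpus,  // size of the service classification being planned
            int vCpuToLogicalProcessorRatio) // e.g. an assumed 8:1 oversubscription ratio
        {
            int byRam = (hostRamGb - parentPartitionRamGb) / vmRamGb;
            int byCpu = (hostLogicalProcessors * vCpuToLogicalProcessorRatio) / vmVirtualCpus;
            return Math.Min(byRam, byCpu);
        }
    }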

    Server specification selection impacts the Scale Unit, Cost Model, and service class. Scale Units have a finite amount of power and cooling, so server efficiency has an impact on a private cloud. It may be that all power in a Scale Unit is consumed before all physical space. The cost of servers impacts the Cost Model irrespective of whether this cost is passed onto the consumer. Selecting only small one-unit servers will limit the architect’s ability to define a range of service classifications. The server needs to accommodate the largest service classification after the parent partition and hypervisor consume their resources.

    Microsoft research shows servers with processors one or two models behind the latest versions offer a better price, performance, and power consumption ratio than the newer processors.

    The Private Cloud Reference Architecture dictates that the “concept of homogenization of physical infrastructure” be adopted for each Resource Pool. Server specifications (CPU, RAM) may vary between Resource Pools, but this complicates Fabric Management (defined in the Private Cloud Planning Guide for Systems Management), which spans Resource Pools and Capacity Planning, and may necessitate different service classes for each pool.

    Delivering IaaS requires that the service is pre-defined and delivered consistently. To achieve consistent performance, the VMs must have equal resources available to them from each server, in other words, the same CPU cycles and RAM. If servers within a Resource Pool do not provide homogeneous performance and RAM, consistent performance cannot be guaranteed.

    Absolute homogenization may be hard to maintain over the long term as server models may be discontinued by the vendor; therefore relationships between Resource Pools, Scale Units, and server model longevity must be considered carefully.

    The following lays out some of the benefits and trade-offs of homogeneous and heterogeneous Resource Pools.

    Homogeneous Physical Infrastructure

    Benefits:
    • Predictable performance within a Resource Pool
    • Guaranteed Live Migration across the fabric

    Trade-offs:
    • Reuse of existing equipment may not be possible

    Heterogeneous Physical Infrastructure

    Benefits:
    • Possible reuse of existing equipment
    • Allows for a broader range of server classes

    Trade-offs:
    • VMs cannot be moved between Resource Pools
    • More upfront work to make sure Live Migration will work appropriately

    In addition, servers should support the following requirements to achieve an automated infrastructure and resiliency:

    Automated Infrastructure

    • Wake On Local Area Network
    • Remote BIOS Upgrades/Configurations
    • Boot from Flash
    • Pre-Boot Execution Environment (PXE) for remote imaging
    • Virtualization Support
      • Data Execution Prevention (DEP)
      • 64 bit CPUs
    • Standard Environment: 2 Network adapters that support TCP offload (TOE)
      • Management x 1
      • Consumer x 1
    • High-Availability Environment: 4 or 6 redundant network adapters that support TOE
      • Management x 2: Could be teamed for redundancy
      • Live Migration x 2: Could be teamed for redundancy
      • Consumer x 2: Could be teamed for resiliency
    • Standard Environment: Storage connections that meet the required service classification
      • For Internet Small Computer System Interface (iSCSI), 1 x Hardware iSCSI initiators: Could use vendor-specific software to achieve resiliency
      • For Fiber Channel, 1 x Fiber Channel host bus adapter (HBA): Could use vendor-specific software to achieve resiliency
    • High-Availability Environment: Redundant storage connections that meet the required service classification
      • For iSCSI, 2 x Hardware iSCSI initiators: Could use vendor-specific software to achieve resiliency
      • For Fiber Channel, 2x Fiber Channel HBAs: Could use vendor-specific software to achieve resiliency

    To dynamically initiate remediation events in case of failure or impending failure of server components, each server is required to display warnings, errors, and state information for the following:

    • CPU
      • State (Busy/Ready)
      • Utilization
      • Heat
      • Fans
    • RAM
      • Utilization
      • Error-Correcting Code (ECC) Errors
    • Storage
      • Read/Write Failures
      • Predictive Failures
    • Network Interface Cards (NICs)
      • Port State
      • Send/Receive Errors
    • Motherboard
      • Server Post Errors
    • Power Supply
      • State
      • Active / Passive
      • Power Output Variations
    • Fans
      • Speed
      • State

    Storage

    To achieve the perception of infinite capacity, proactive Capacity Management must be performed, and storage capacity added ahead of demand. The amount of storage added as a single unit (a Storage Scale Unit) will depend on the rate of storage consumption, hardware vendor lead time, and the level of risk the business wishes to assume (that is, weighing remaining unallocated capacity against the possibility of exhausting all capacity). This is detailed in Private Cloud Planning Guide for Service Delivery.
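
    A back-of-the-envelope sketch of that sizing decision (names and figures below are illustrative, not prescriptive): the Storage Scale Unit must at least cover the storage consumed while a new unit is procured, plus whatever risk buffer the business wants to hold.

    // Hypothetical sketch: sizing a Storage Scale Unit from the consumption rate,
    // the vendor lead time, and a risk buffer chosen by the business.
    static class StorageScaleUnitSizing
    {
        public static double ScaleUnitSizeTb(
            double consumptionTbPerMonth,  // observed rate of storage consumption
            double vendorLeadTimeMonths,   // time to procure and commission new storage
            double riskBufferMonths)       // extra headroom the business wishes to keep
        {
            return consumptionTbPerMonth * (vendorLeadTimeMonths + riskBufferMonths);
        }
    }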

    Storage will be placed in Storage Resource Pools, from which it is automatically allocated to consumers. Though Resource Pools are not a new concept for Storage Area Networks (SANs), allowing the infrastructure to allocate storage on-demand based on policy may be a new approach for many organizations. Further, the SAN must present an application programming interface (API) to Fabric Management to allow automation of allocation and provisioning.
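
    As an illustration only (the interface and operation names below are invented, not a vendor API), the kind of programmatic surface the SAN needs to expose to Fabric Management might look like this:

    // Hypothetical sketch of the storage-automation surface Fabric Management calls.
    // Real arrays expose this through vendor-specific providers; names here are assumptions.
    public interface IStorageResourcePool
    {
        // Allocate a LUN of the requested size and service classification from the pool.
        string AllocateLun(int sizeGb, string storageClassification);

        // Present an existing LUN to every host in a compute Resource Pool,
        // which is required for VM portability between hosts.
        void PresentLunToResourcePool(string lunId, string computeResourcePoolId);

        // Return the LUN to the pool when the consumer de-provisions the service.
        void ReleaseLun(string lunId);

        // Remaining unallocated capacity, consumed by proactive Capacity Management.
        int GetUnallocatedCapacityGb();
    }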

    The storage provided within a private cloud must be consistent in performance and availability. This means the Input/output (I/O) Operations per Second (IOPS) cannot vary significantly. If there is a need to make different levels of storage performance available to users of a private cloud, it can be accomplished through multiple service classifications. A private cloud is intended, however, to provide a limited set of standardized services; therefore, variances should be carefully considered.

    The cost of providing the storage within a private cloud should be clearly defined. This permits metering, and possibly allocation of costs to consumers. If different classes of storage are provided for different levels of performance, their costs should be differentiated. For example, if a SAN is being used in an environment, it is possible to have storage tiers where faster Solid State Drives (SSD) are used for more critical workloads. Less-critical workloads can be placed on Tier 2 Serial Attached SCSI (SAS) drives, and even less-critical workloads on Tier 3 SATA drives.

    The Private Cloud Reference Architecture assumes the storage arrays and the storage network are redundant, with no single point of failure beyond the array itself. In this regard, the storage array can be considered a Fault Domain.

    The design should adopt some form of de-duplication technology to reduce storage consumption.

    As the storage array is a single point of failure, it should display health information to the systems monitoring service to make sure that any outages and their impact are quickly identified. Providing snapshots and mirroring between arrays for continuity is beyond the scope of this guidance.

    Physical Storage Switches

    If an architect follows the recommendation to allow any VM to execute on any server in a Resource Pool, Virtual Hard Disks (VHDs) should reside on a SAN. While it is possible to host VHDs locally, the guidance assumes that they are hosted on a SAN.

    A key decision in private cloud design is whether to use iSCSI or Fiber Channel for storage. If iSCSI is utilized to house virtual workload storage, it is suggested that each virtualization host include iSCSI HBAs instead of standard NICs for performance reasons.

    The purpose of a storage switch is to provide resilient and flexible connectivity between shared storage and physical servers. The storage switch must meet peak storage I/O requirements for the virtual services. In addition, the interconnect speeds between switches should be evaluated to determine the maximum throughput for switch-to-switch communications. This may limit the maximum number of hosts that can be placed on each switch.

    While switch throughput is important, attention should also be paid to the number of available switch ports needed to support the physical virtualization hosts. Refer to the switch hardware vendor to make sure it meets these requirements.

    Physical storage switch requirements include:

    • Dedicated switch port on each switch for each host and storage processor connection. This is needed for redundancy and I/O optimization.
    • iSCSI traffic separated from all other IP traffic, preferably on its own switched infrastructure or logically through a virtual local area network (VLAN) on a shared IP switch. This segregates data access from traditional network communications for host-to-host and workload operations and provides data security.
    • Redundant power supplies and cooling fans increase the number of faults the storage switch can withstand.
    • Programmatic interface to support firmware upgrades and configuration.

    Physical Storage Subsystem

    Stateless workloads can be hosted on Direct-Attached Storage (DAS) instead of SAN, driving down the cost of service. The downside is that Fabric Management has to handle transitioning active user connections between VMs homed on different hosts, as VM migration is impossible. This may mean tighter integration with the network than is specified in this document (in order to know when all connections to a VM have been abandoned or terminated before stopping the VM, for example).

    SAN storage, while more expensive, provides advantages:

    • The VM can be re-homed to other servers.
    • Live Migration can be employed.
    • Backup (of the VM) can occur out of band (for example, taking snapshots).
    • Capacity can be increased almost limitlessly.

    The logical storage configuration (or storage classification) should be designed to meet requirements in the following areas:

    • Capacity: To provide the required storage space for the virtual service data and backups.
    • Performance Delivery: To support the required number of IOPS and throughput.
    • Fault Tolerance: To provide the desired level of protection against hardware failures. If a SAN is used, this may include redundant HBA and switches.
    • Manageability: To provide a high degree of platform self-management. This requires a programmatic interface to provide automated configuration and firmware upgrades.

    Additionally, a private cloud must meet the following requirements to make sure that it is highly available and well-managed:

    • Multiple paths to the disk array for redundancy. Should a disk fail, hot or warm spare disks can provide resiliency in the provisioned storage. Consult the storage vendor for specific recommendations.
    • A storage system with automatic data recovery, to allow an automatic background process to rebuild data onto a spare or replacement disk drive when another disk drive in the array fails.
    • Redundant power supplies and cooling fans, to increase the number of faults the Storage Array can withstand.

    Network

    The Private Cloud Reference Architecture assumes that the network presented to servers is not redundant for the Standard Environment and is redundant for the High-Availability Environment.

    The network is tightly coupled with physical servers. Each Compute Resource Pool includes the network switches necessary for the servers to operate; each Scale Unit includes a pre-defined and fixed number of servers and switches.

    The switches must be monitored to make sure no workloads saturate the network. A private cloud is designed as a general-purpose infrastructure. Workloads that challenge the network with high utilization may not be good candidates for a private cloud unless separate Resource Pools are created specifically to handle these workloads.

    Switches are members of network upgrade domains, but the definition and membership of upgrade domains will likely vary depending on the nature of the upgrade. If switches are not redundant (for example, in the Standard Environment), the whole Resource Pool will need to be taken offline for switch maintenance, which requires switch reboots.

    Network hardware (switches and load balancers) must display an API to Fabric Management that enables automated management of networks such as creation of VLANs, Virtual IP addresses (VIP), and addition or removal of hosts from the VIP.
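
    A minimal sketch of the corresponding network-automation surface (again with hypothetical names, not the API of any particular switch or load balancer):

    // Hypothetical sketch of the API that network hardware exposes to Fabric Management.
    public interface INetworkFabric
    {
        // Create an isolated VLAN for a tenant or workload network.
        void CreateVlan(int vlanId, string name);

        // Create a Virtual IP (VIP) on the load balancer for a consumer-facing service.
        string CreateVip(string serviceName, int port);

        // Add or remove hosts from the VIP as instances are provisioned or retired.
        void AddHostToVip(string vipId, string hostAddress);
        void RemoveHostFromVip(string vipId, string hostAddress);
    }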

    Physical Network

    Some key decisions that should be made about the bandwidth of the physical networks relate to the use of Live Migration, the requirements of port security, and the need for link aggregation. The benefits and trade-offs of using Live Migration are summarized below:

    Use Live Migration

    Benefits:
    • Transparent movement of Stateful applications
    • Transparent infrastructure upgrades

    Trade-offs:
    • Additional network switch ports will be required
    • More network adapters are required per virtualization host
    • Greater Reserve Capacity may be required because of cluster size limitations of 16 nodes

    Do Not Use Live Migration

    Benefits:
    • Fewer switch ports are required
    • Fewer network adapters are required per virtualization host
    • Ideal for stateless applications

    Trade-offs:
    • No transparent movement of Stateful applications
    • For Stateful applications, infrastructure upgrades will need to be coordinated with VM owners

    To support the dynamic characteristics of a private cloud, a network switch should support a remote programmatic interface – for firmware upgrades, and prioritization of traffic for quality of service. These switches should be dedicated for a private cloud to maintain predictable performance and to minimize risks associated with human interaction. As defined earlier, the servers need to be connected to at least two networks, management and consumer, with live migration (if required). The connections should always be the same; for example, network adapter 1 to management, network adapter 2 to consumer, and network adapter 3 to Live Migration.

    If iSCSI is chosen for the storage interconnects, iSCSI traffic should reside in an isolated VLAN in order to maintain security and performance levels. This iSCSI traffic should not share a network adaptor with other traffic, for example the management or consumer network traffic.

    The interconnect speeds between switches should be evaluated to determine the maximum bandwidth for communications. This could affect the maximum number of hosts which can be placed on each switch.
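
    A back-of-the-envelope sketch of that evaluation (numbers and names are placeholders): the switch-to-switch uplink, rather than the edge ports, typically caps the number of hosts per switch.

    using System;

    // Hypothetical sketch: hosts per switch bounded by uplink bandwidth and port count.
    static class SwitchSizing
    {
        public static int MaxHostsPerSwitch(
            double uplinkGbps,       // aggregate switch-to-switch bandwidth
            double peakPerHostGbps,  // expected peak traffic per virtualization host
            int availablePorts,      // ports left after uplinks and management
            int portsPerHost)        // e.g. management + consumer + Live Migration
        {
            int byBandwidth = (int)(uplinkGbps / peakPerHostGbps);
            int byPorts = availablePorts / portsPerHost;
            return Math.Min(byBandwidth, byPorts);
        }
    }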

    When designing network connectivity for a well-managed infrastructure, the virtualization hosts should have the following specific networking requirements:

    • Support for 802.1Q VLAN Tagging: To provide network segmentation for the virtualization hosts, supporting management infrastructure and workloads. This is the preferred method to help secure and isolate data traffic for a private cloud.
    • Remote Out-of-band Management Capability: To monitor and manage servers remotely over the network regardless of whether the server is turned on or off.
    • Support for PXE Version 2 or Later: To facilitate automated server provisioning.

    To dynamically initiate remediation events in response to the failure or impending failure of network switch components, each switch is required to display warnings, errors, and state information for the following:

    • CPU
      • Utilization
      • Temperature
    • Flash Memory
      • Utilization
    • Interface Details
      • Port State
      • Port Errors
      • Bandwidth Utilization
    • Power Supply
      • State
      • Active / Passive
      • Power Output Variations
    • Fans
      • Speed
      • State

    Storage Switch/Subsystem Health Model

    To dynamically initiate remediation events in response to either the failure or impending failure of storage switches and storage subsystem components, each component is required to display warnings, errors, and state information for the following:

    Storage Switch

    • CPU
      • Utilization
      • Temperature
    • Flash Memory
      • Utilization
    • Interface Details
      • Port State
      • Port Errors
      • Bandwidth Utilization
    • Power Supply
      • State
      • Active / Passive
      • Power Output Variation
    • Fans
      • Speed
      • State

    Storage Subsystem

    • CPU
      • Utilization
      • Temperature
    • Flash Memory
      • Utilization
    • Service Processor
      • State
      • Errors
      • IOPS
    • Disks
      • Read / Write Failures
      • Predictive Failures
    • Power Supply
      • State
      • Active / Passive
      • Power Output Variations
    • Fans
      • Speed
      • State

    Hypervisor

    The hypervisor exposes the VM services to consumers. It needs to be configured identically on all hosts in a Resource Pool, and ideally all hosts in the private cloud. Fabric Management will orchestrate the addition of virtual switches, machines, and disks.

    An architect needs to decide whether the private cloud should use CPU Resource Reservations to make sure of predictable performance of VMs. The benefits and trade-offs are listed below:

    Use CPU Resource Reservations

    Benefits:
    • Consistent VM performance for consumers

    Trade-offs:
    • Fixed number of VMs per host might lead to low utilization of resources
    • Toolset may not set resource reservations

    Do Not Use CPU Resource Reservations

    Benefits:
    • Variable number of VMs per host means resource utilization can be maximized

    Trade-offs:
    • Consumers do not experience consistent VM performance
    • One VM can adversely affect the processing performance of others

    The decision is driven by whether efficiency or consistency is more important for the private cloud.

    The architect could elect to provide different classes of services – one which uses resource reservations to deliver predictability, and another which shares the resources. Separate Resource Pools could be deployed accordingly, along with differential pricing to incent the consumers to exhibit desired behavior.
    Resource reservations will not prevent a host from saturating the network and crippling the performance of other hosts. As stated in the Network section earlier, this needs monitoring.

    Parent Partition

    The parent partition provides the hypervisor with access to physical resources such as network and storage. It also hosts the hypervisor management interfaces. The parent partition needs to be configured identically on all servers in a Resource Pool.

    If an architect elects to create a service classification which depends on consuming LUNs directly (not via the parent partition), the parent partition must be configured to present the pass-through for this storage. Further, this storage must be available to all parent partitions in that Resource Pool to enable VM portability between hosts.

    The parent partition displays health information for the server, the parent partition operating system, and the hypervisor. The health monitoring system, in turn, consumes this information to enable Capacity Management and Fabric Management.

    Management Layers

    Task Execution

    Task execution covers the low-level management operations that can be performed on a platform, generally surfaced through the command line or an Application Programming Interface (API). The capability to execute tasks must not only exist; the usage semantics should also be consistent across members of a fault domain so that automation can use a common format. Where semantics differ, the automation layer is forced to compensate for those differences through custom code in the orchestration, or even to use different execution hosts or engines within a fault domain.

    Automation

    The automation layer is made up of the foundational automation technology plus a series of single purpose commands and scripts that perform operations such as starting or stopping a virtual machine, restarting a server, or applying a software update. These atomic units of automation are combined and executed by higher-level management systems. The modularity of this layered approach dramatically simplifies development, debugging, and maintenance.
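
    A sketch of what one such atomic unit might look like when surfaced to the orchestration layer (the interface and class names are illustrative; a real implementation would wrap the platform's own command line or API):

    // Hypothetical sketch of an atomic automation unit that higher-level
    // orchestration composes into workflows.
    public interface IAutomationTask
    {
        string Name { get; }
        TaskResult Execute();   // a single-purpose, ideally idempotent operation
    }

    public enum TaskResult { Succeeded, Failed, Retry }

    public class StartVirtualMachineTask : IAutomationTask
    {
        private readonly string _vmName;

        public StartVirtualMachineTask(string vmName) { _vmName = vmName; }

        public string Name { get { return "Start VM " + _vmName; } }

        public TaskResult Execute()
        {
            // Here the task would call the hypervisor's management API or command
            // line to start the VM; that call is omitted in this sketch.
            return TaskResult.Succeeded;
        }
    }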

    Orchestration

    In much the same way that an enterprise resource planning (ERP) system manages a business process such as order fulfillment and handles exceptions such as inventory shortages, the orchestration layer provides an engine for IT-process automation and workflow. The orchestration layer is the critical interface between the IT organization and its infrastructure and transforms intent into workflow and automation.

    Ideally, the orchestration layer provides a graphical user interface in which complex workflows that consist of events and activities across multiple management-system components can be combined, to form an end-to-end IT business process such as automated patch management or automatic power management. The orchestration layer must provide the ability to design, test, implement, and monitor these IT workflows.

    Service Management

    Service management provides the means for automating and adapting IT service management best practices, such as those found in the IT Infrastructure Library (ITIL), to provide built-in processes for incident resolution, problem resolution, and change control.

    Self Service

    Self-service capability is a characteristic of private cloud computing and must be present in any implementation. The intent is to permit users to approach a self-service interface and be presented with the options available for provisioning in the organization. The capability may be basic, offering only the provisioning of a virtual machine with a pre-defined configuration, or more advanced, allowing configuration options on top of the base configuration and extending up to a platform capability or service.

    Self-service capability is a critical business driver that enables members of an organization to become more agile in responding to business needs, using IT capabilities that align and conform with internal business IT requirements and governance.

    This means the interface between IT and the business is abstracted to a simple, well-defined, and approved set of service options that are presented as a menu in a portal or available from the command line. The business selects these services from the catalog, begins the provisioning process, and is notified upon completion; the business is then charged only for what it actually uses.

    This is analogous to the capabilities available on public cloud platforms.

    The entities that consume self-service capabilities in an organization are individual business units, project teams, or any other departments that need to provision IT resources. These entities are referred to as tenants. In a private cloud, tenants are granted the ability to provision compute and storage resources as they need them to run their workloads. Connectivity to these resources is managed behind the scenes by the fabric management layer of the private cloud.

    Tenant administrators are granted access to a self-service portal where they can initiate workflows to provision virtualized services in the appropriate configuration and capacity. For example, compute resources may be available in small, medium, or large instance capacities, along with storage of the appropriate size and performance characteristics. Resources are provisioned without any intervention from infrastructure personnel in IT, and overall progress is tracked and reported by the fabric management layer through the portal.

    A chargeback model defines how tenants are charged for using cloud resources. The charge is typically the number and size of the resources provisioned multiplied by the amount of time they are provisioned for. This information, along with cost reporting, is available to tenant administrators through the self-service portal.
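
    A minimal sketch of that chargeback arithmetic (instance sizes and rates below are invented for illustration):

    using System;

    // Hypothetical chargeback sketch: charge = rate for the instance size x hours provisioned.
    static class Chargeback
    {
        public static decimal ChargeFor(string instanceSize, double hoursProvisioned)
        {
            decimal ratePerHour;
            switch (instanceSize)
            {
                case "Small":  ratePerHour = 0.05m; break;  // illustrative rates only
                case "Medium": ratePerHour = 0.10m; break;
                case "Large":  ratePerHour = 0.20m; break;
                default: throw new ArgumentException("Unknown instance size", "instanceSize");
            }
            return ratePerHour * (decimal)hoursProvisioned;
        }
    }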

    Tenants are granted the ability to manage, monitor and report on the resources that they have provisioned.

    How To : Use JSON and SAP NetWeaver together

    Background

    In this example, SAP is used as the backend data source, and the NetWeaver Gateway (NWGW) adapter exposes it as OData so that it can be consumed from a .NET client.

    Since the NWGW component is hosted on premises and our .NET client is hosted in Azure, we consume this data from Azure through the Service Bus relay. While transferring data from on premises to Azure over the Service Bus relay, we faced performance issues for a single user with large volumes of data, as well as with relatively small data sets under concurrent user load. So I did a proof of concept (POC) to improve performance by consuming the OData service in JSON format.

    What I Did?

    I’ve created a simple WCF Data Service which has no underlying data source connectivity. In this service, when the context is initialized, a list of text messages is generated and exposed as OData.

    Here is that simple service code:

    [Serializable]
    public class Message
    {
        public int ID { get; set; }
        public string MessageText { get; set; }
    }

    public class MessageService
    {
        List<Message> _messages = new List<Message>();

        public MessageService()
        {
            for (int i = 0; i < 100; i++)
            {
                Message msg = new Message
                {
                    ID = i,
                    MessageText = string.Format("My Message No. {0}", i)
                };
                _messages.Add(msg);
            }
        }

        public IQueryable<Message> Messages
        {
            get
            {
                return _messages.AsQueryable<Message>();
            }
        }
    }

    [ServiceBehavior(IncludeExceptionDetailInFaults = true)]
    public class WcfDataService1 : DataService<MessageService>
    {
        // This method is called only once to initialize service-wide policies.
        public static void InitializeService(DataServiceConfiguration config)
        {
            // Set rules to indicate which entity sets
            // and service operations are visible, updatable, and so on.
            config.SetEntitySetAccessRule("Messages", EntitySetRights.AllRead);
            config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
        }
    }
    I then expose one endpoint to the Azure Service Bus so that the client can consume this service through the relay endpoint. After hosting the service, I’m able to fetch data with a simple OData query from the browser.

    I’m also able to fetch the data in JSON format.

    After that, I create a console client application and consume the service from there.

    Sample Client Code

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Linq;
    using System.Threading;

    class Program
    {
        static void Main(string[] args)
        {
            List<Thread> lst = new List<Thread>();

            for (int i = 0; i < 100; i++)
            {
                Thread person = new Thread(new ThreadStart(MyClass.JsonInvokation));
                person.Name = string.Format("person{0}", i);
                lst.Add(person);
                Console.WriteLine("before start of {0}", person.Name);
                person.Start();
                //Console.WriteLine("{0} started", person.Name);
            }
            Console.ReadKey();
            foreach (var item in lst)
            {
                item.Abort();
            }
        }
    }

    public class MyClass
    {
        // Note: the svc_SendingRequest event handler is attached below, but its body
        // is not shown in this excerpt.
        public static void JsonInvokation()
        {
            string personName = Thread.CurrentThread.Name;
            Stopwatch watch = new Stopwatch();
            watch.Start();
            try
            {
                SimpleService.MessageService svcJson =
                    new SimpleService.MessageService(new Uri(
                        "https://abc.servicebus.windows.net/SimpleService/WcfDataService1"));
                svcJson.SendingRequest += svc_SendingRequest;
                svcJson.Format.UseJson();
                var jdata = svcJson.Messages.ToList();

                watch.Stop();
                Console.WriteLine("Person: {0} - JsonTime First Call time: {1}",
                    personName, watch.ElapsedMilliseconds);

                for (int i = 1; i <= 10; i++)
                {
                    watch.Reset(); watch.Start();
                    jdata = svcJson.Messages.ToList();
                    watch.Stop();
                    Console.WriteLine("Person: {0} - Json Call {1} time: {2}",
                        personName, 1 + i, watch.ElapsedMilliseconds);
                }

                Console.WriteLine(jdata.Count);
            }
            catch (Exception ex)
            {
                Console.WriteLine(personName + ": " + ex.Message);
            }
            Thread.Sleep(100);
        }

        public static void AtomInvokation()
        {
            string personName = Thread.CurrentThread.Name;

            try
            {
                Stopwatch watch = new Stopwatch();
                watch.Start();
                SimpleService.MessageService svc =
                    new SimpleService.MessageService(new Uri(
                        "https://abc.servicebus.windows.net/SimpleService/WcfDataService1"));
                svc.SendingRequest += svc_SendingRequest;
                var data = svc.Messages.ToList();

                watch.Stop();
                Console.WriteLine("Person: {0} - XmlTime First Call time: {1}",
                    personName, watch.ElapsedMilliseconds);

                for (int i = 1; i <= 10; i++)
                {
                    watch.Reset(); watch.Start();
                    data = svc.Messages.ToList();
                    watch.Stop();
                    Console.WriteLine("Person: {0} - Xml Call {1} time: {2}",
                        personName, 1 + i, watch.ElapsedMilliseconds);
                }

                Console.WriteLine(data.Count);
            }
            catch (Exception ex)
            {
                Console.WriteLine(personName + ": " + ex.Message);
            }
            Thread.Sleep(100);
        }
    }

     

    What I Test After That
    I tested two separate scenarios:

    Scenario I: Single user with small and large volumes of data
    I measured the data transfer time periodically, first in XML format and then in JSON format. You might notice that I’ve printed the first call separately in each screenshot, as it takes additional time to connect to the Service Bus endpoint; the secret key authentication happens during that first call.

    Small data set (array size 10): consume in XML format.

     

    Consume in JSON format:

     

    For a small set of data, the JSON and XML response times over the Service Bus relay are almost the same.

    Consuming Large volume of data (Array Size 100)

     

    Here the XML message size is around 51 KB. Now I’m going to consume the same list of data (Array size 100) in JSON format.

     

    So from the above test scenario, it is very clear that the JSON response time is much faster than the XML response time, and the reason is message size: in this test, when I retrieve the list of 100 records in XML format, the message size is 51.2 KB, but the JSON message size is only 4.4 KB.

    Scenario II: 100 concurrent users with a large volume of data (array size 100)
    In this concurrent user load test, I haven’t done any service throttling or maximum concurrent connection configuration.

     

    In the above screenshot, you will find some timeout errors in the XML responses, which happen because of the high response time over the relay. But when I execute the same test with JSON responses, I find the response times are quite stable and faster than XML, and I don’t get any timeouts.

     

    How Easy to Use UseJson()
    If you are using WCF Data Services 5.3 or above and Visual Studio 2012 Update 3, then to consume the JSON structure from the client you just instantiate the proxy/context and call .Format.UseJson().

    Here you don’t need to load the Edmx structure separately by writing any custom code. .NET CodeGen will generate that code when you add the service reference.

     

    But if that code is not generated in your environment, then you have to write a few lines of code to load the EDMX and use it as .Format.UseJson(LoadEdmx());

    Sample Code for Loading Edmx

    public static IEdmModel LoadEdmx(string srvName)
    {
        string executionPath = Directory.GetCurrentDirectory();
        DirectoryInfo di = new DirectoryInfo(executionPath).Parent;
        var parent1 = di.Parent;
        var srv = parent1.GetDirectories("Service References\\" +
            srvName)[0].GetFiles("service.edmx")[0].FullName;

        XmlDocument doc = new XmlDocument();
        doc.Load(srv);
        var xmlreader = XmlReader.Create(new StringReader(doc.DocumentElement.OuterXml));

        IEdmModel edmModel = EdmxReader.Parse(xmlreader);
        return edmModel;
    }

    Microsoft BI and the new PowerQuery for Excel – How we empower users

    Introduction to Microsoft Power Query for Excel

    Microsoft Power Query for Excel enhances self-service business intelligence (BI) for Excel with an intuitive and consistent experience for discovering, combining, and refining data across a wide variety of sources including relational, structured and semi-structured, OData, Web, Hadoop, Azure Marketplace, and more. Power Query also provides you with the ability to search for public data from sources such as Wikipedia.

    With Power Query 2.10, you can share and manage queries as well as search data within your organization. Users in the enterprise can find and use these shared queries (if they are shared with them) to use the underlying data for their data analysis and reporting. For more information about how to share queries, see Share Queries.

    With Power Query, you can

    • Find and connect data across a wide variety of sources.
    • Merge and shape data sources to match your data analysis requirements or prepare it for further analysis and modeling by tools such as Power Pivot and Power View.
    • Create custom views over data.
    • Use the JSON parser to create data visualizations over Big Data and Azure HDInsight.
    • Perform data cleansing operations.
    • Import data from multiple log files.
    • Perform Online Search for data from a large collection of public data sources including Wikipedia tables, a subset of Microsoft Azure Marketplace, and a subset of Data.gov.
    • Create a query from your Facebook likes that renders an Excel chart.
    • Pull data into Power Pivot from new data sources, such as XML, Facebook, and File Folders as refreshable connections.
    • With Power Query 2.10, you can share and manage queries as well as search data within your organization.

    New updates for Power Query

    The Power Query team has been busy adding a number of exciting new features to Power Query. You can download the update from this page.

    New features for Power Query include the following; please read the rest of this blog post for specific details on each.

    • New Data Sources
      • Updated “Preview” functionality of the SAP BusinessObjects BI Universe connectivity
      • Access tables and named ranges in a workbook
    • Improvements to Query Load Settings
      • Customizable Defaults for Load Settings in the Options dialog
      • Automatic suggestion to load a query to the Data Model when it goes beyond the worksheet limit
      • Preserve data in the Data Model when you modify the Load to Worksheet setting of a query that is loaded to the Data Model
    • Improvements to Query Refresh behaviors in Excel
      • Preserve Custom Columns, Conditional Formatting and other customizations of worksheet tables
      • Preserve results from a previous query refresh when a new refresh attempt fails
    • New Transformations available in the Query Editor
      • Remove bottom rows
      • Fill up
      • New statistic operations in the Insert tab
    • Other Usability Improvements
      • Ability to reorder queries in the Workbook Queries pane
      • More discoverable way to cancel a preview refresh in the Query Editor
      • Keyboard support for navigation and rename in the Steps pane
      • Ability to view and copy errors in the Filter Column dropdown menu
      • Remove items directly from the Selection Well in the Navigator
      • Send a Frown for Service errors

    Connect to SAP BusinessObjects BI Universe (Preview)

    This connectivity has been a separate Preview feature for the last month or so. In this release, we are incorporating the SAP BusinessObjects BI Universe connector Preview capabilities as part of the main Power Query download for ease of access. With Microsoft Power Query for Excel, you can easily connect to an SAP BusinessObjects BI Universe to explore and analyze data across your enterprise.

    Access tables and named ranges in an Excel workbook

    With From Excel Workbook, you can now connect to tables and named ranges in your external workbook sheets. This simplifies the process of selecting useful data from an external workbook, which used to be limited to sheets and users had to “manually” scrape the data (using Query transform operations).

     

    Customizable Defaults for Load Settings in the Options dialog

    You can override the Power Query default Load Settings in the Options dialog. This will set the default Load Settings behavior for new queries in areas where Load Settings are not exposed directly to the user, such as in Online Search results and the Navigator task pane in single-table import mode. In addition, this will set the default state for Load Settings where these settings are available including the Query Editor, and Navigator in multi-table import mode.

               

    Preserve Custom Columns, Conditional Formatting and other customizations of worksheet tables

    With this Power Query Update, Custom Columns, conditional formatting in Excel, and other customizations of worksheet tables are preserved after you refresh a query. Power Query will preserve worksheet customizations such as Data Bars, Color Scales, Icon Sets or other value-based rules across refresh operations and after query edits.

    Preserve results from a previous query refresh when a new refresh attempt fails

    After a refresh fails, Power Query will now preserve the previous query results. This allows you to work with slightly older data in the worksheet or Data Model and lets you refresh the query results after fixing the cause of errors.

    Automatic suggestion to load a query to the Data Model when it goes beyond the worksheet limits

    When you are working with large volumes of data in your workbook, you could reach the limits of Excel’s worksheet size. When this occurs, Power Query will automatically recommend loading your query results to the Data Model. The Data Model can store very large data sets.

    Preserve data in the Data Model when modifying the Load to Worksheet setting of a query that is loaded to the Data Model

    With Power Query, data and annotations on the Data Model are preserved when modifying the Load to Worksheet setting of a query. Previously, Power Query would reset the query results in both the worksheet and the Data Model when modifying either one of the two load settings.      

    Remove Bottom Rows

    A very common scenario, especially when importing data from the Web and other semi-structured sources, is having to remove the last few rows of data because the contents do not belong to the data set. For instance, it’s common to remove links to previous/next pages or comments. Previously, this was possible only by using a composition of custom formulas in Power Query. This transformation is now much easier by adding a library function called Table.RemoveLastN(), and a button for this transformation in the Home tab of the Query Editor ribbon.

     

    Fill Up

    Power Query already supports the ability to fill down values in a column to neighboring empty cells. Starting with this update, you can now fill values up within a column as well. This new transformation is available as a new library function called Table.FillUp(), and a button on the Home tab of the Query Editor ribbon.

    New Statistics operations in the Insert tab

    The Insert tab provides various ways to insert new columns in queries, based on custom formulas or by deriving values based on other columns. You can now apply Statistics operations based on values from different columns, row by row, in your table.

     

    Ability to reorder queries in the Workbook Queries pane

    With the latest Power Query update, you can move queries up or down in the Workbook Queries pane. You can right-click on a query and select Move Up or Move Down to reorder queries.

    More discoverable way of cancelling refresh of a preview in the Query Editor

    The Cancel option is now much more discoverable inside the Query Editor dialog. In addition to the Refresh dropdown menu in the ribbon, this option can now be found in the status bar at the bottom right corner of the Query Editor, next to the download status information.

      

    Keyboard support for navigation and rename in the Steps pane

    You can now use the Up/Down Arrow keys to navigate between steps in your query. Also, press the F2 key to rename the current step.

    Ability to view and copy errors in the Filter Column dropdown menu

    You can easily view and copy error details inside the Filter Column menu. This is very useful to troubleshoot errors while retrieving filter values.

    Remove items directly from the Selection Well in the Navigator

    You can remove items directly from the Selection Well instead of having to find the original item in the Navigator tree to deselect it.

     

    Send a Frown for Service errors

    We try as hard as possible to improve the quality of Power Query and all of its features. Even then, there are cases in which errors can happen. You can now send a frown directly from experiences where a service error happened, for instance, an error retrieving a Search result preview or downloading a query from the Data Catalog. This will give us enough information about the service request that failed and the client state to troubleshoot the issue.

    That’s all for this update! We hope that you enjoy these new Power Query features. Please don’t hesitate to contact us via the Power Query Forum or send us a smile/frown with any questions or feedback that you may have.

    You can also find more resources about Power Query and Power BI online.

    XI/PI: Understanding the RFC Adapter

    SAP XI provides different ways for SAP systems to communicate via SAP XI. You have three options namely IDoc Adapters, RFC Adapters and Proxies. In one of the earlier posts that spoke about your first XI scenario, we learned configuring the IDoc receiver adapter. And in the coming articles, I shall throw light on different adapters. This article specifically deals with understanding basics of RFC adapter on sender and the receiver side.
     

    SAP XI RFC Sender Adapter

    The RFC Adapter converts incoming RFC calls to XML and XML messages to outgoing RFC calls. We can have both synchronous (sRFC) and asynchronous (tRFC) communication with SAP systems. The former works with the Best Effort QoS (Quality of Service) while the latter works with Exactly Once (EO).

    Unlike IDoc adapter, RFC Adapter is installed on the J2EE Adapter Engine and can be monitored via Adapter Monitoring and Communication Channel Monitoring in the Runtime Workbench.

    Now let us understand the configuration needed to set up RFC communication.

    RFC Sender Adapter

    In this case, Sender SAP system requests XI Integration Engine to process RFC calls. This could either be synchronous or asynchronous.

    On the source SAP system, go to transaction SM59 and create a new RFC connection of type ‘T’ (TCP/IP Connection). On the Technical Settings tab, select “Registered Server Program” radio button and specify an arbitrary Program ID. Note that the same program ID must be specified in the configuration of the sender adapter communication channel. Also note that this program ID is case-sensitive.

    When using the RFC call in your ABAP program you should specify the RFC destination created above. For example,

    CALL FUNCTION '<NAME_OF_THE_RFC_FUNCTION_MODULE>'
      DESTINATION '<RFC_DESTINATION_NAME>'.

    Also, in case you are setting up asynchronous interface, the RFC should be called in the background. For example,

    CALL FUNCTION '<NAME_OF_THE_RFC_FUNCTION_MODULE>'
      IN BACKGROUND TASK
      DESTINATION '<RFC_DESTINATION_NAME>'.

    SAP XI RFC Receiver Adapter

    Now, create the relevant communication channel in the XI Integration Directory. Select the Adapter Type as RFC Sender (please see the figure above). Specify the Application server and Gateway service of the sender SAP system. Specify the program ID; it must be exactly the same program ID that you provided while creating the RFC destination in the SAP system, and it is case-sensitive. Provide the Application server details and logon credentials in the RFC metadata repository parameter. Save and activate the channel. Note that the RFC definition that you import in the Integration Repository is used only at design time; at runtime, XI loads the metadata from the sender SAP system by using the credentials provided here.

    RFC Receiver Adapter

    In this case, XI sends the data in the RFC format (after conversion from XML format by the receiver adapter) to the target system where the RFC is executed.

    Configuring the receiver adapter is even simpler. Create a communication channel in ID of type RFC Receiver (Please see the figure above on the left). Specify the RFC Client parameters like the Application server details, logon credentials etc and activate the channel.

    Testing the Connectivity

    Sometimes, especially when new SAP environments are setup, you may want to test their RFC connectivity to SAP XI before you create your actual RFC based interfaces/scenarios. There is a quick and easy way to accomplish this.

    STFC_CONNECTION Input

    Create an RFC destination of type ‘T’ in the SAP system as described previously. Then, go to the XI Integration Repository and import the RFC Function Module STFC_CONNECTION from the SAP system. Activate your change list.

    Configure sender and receiver communication channels in ID by specifying the relevant parameters of the SAP system as discussed previously. Remember that the Program ID in sender communication channel and RFC destination in SAP system must match (case-sensitive).

    STFC_CONNECTION Output

    Accordingly, complete the remaining ID configuration objects like the Sender Agreement, Receiver Determination, Interface Determination, and Receiver Agreement. No Interface Mapping is necessary. Activate your change list.

    Now, go back to the SAP system and execute the function module STFC_CONNECTION using transaction SE37. Specify the above RFC destination in ‘RFC target sys’ input box. You can specify any arbitrary input as REQUTEXT. If everything works fine, you should receive the same text as a response. You can also see two corresponding messages in SXMB_MONI transaction in SAP XI. This verifies the connection between SAP system and SAP XI.

    SAP Weekend : Part 2 – Using the Microsoft BizTalk Server for B2B Integration with SharePoint

    This is Part 2 of my past weekend’s activities with SharePoint and SAP Integration methods.

     

    In this post I am looking at how to use the BizTalk Adapter with SharePoint

     

    Topics

    • Abstract
    • Goal
    • Business Scenario
    • Environment
    • Document Flow
    • Integration Steps
    • .NET Support
    • Summary

     

    Abstract

    In the past few years, the whole perspective of doing business has moved towards implementing Enterprise Resource Planning (ERP) systems for key areas like marketing, sales, and manufacturing operations. Today most of the large organizations that deal with all major world markets rely heavily on such systems.

    The operational picture of any organization comes from its worldwide network of marketing teams as well as from its manufacturing and distribution operations. In order to provide customers with realistic information, each of these systems needs to be integrated as part of the larger enterprise.

    This ultimately results in a more efficient enterprise overall, providing more reliable information and better customer service. This paper addresses the integration of BizTalk Server and an Enterprise Resource Planning system, the need for their integration, and their role in the current e-business scenario.

     

    Goal

    There are several key business drivers, like customers and partners, that need to communicate on different fronts for a successful business relationship. To achieve this communication, various systems need to be integrated, which leads to evaluating and developing a B2B integration capability and e-business strategy. This improves the quality of business information at the organization’s disposal, helping to improve delivery times and costs and to offer customers a higher level of overall service.

    To provide B2B capabilities, there is a need to give access to the business application data, providing partners with the ability to execute global business transactions. Facing internal integration and business-to-business (B2B) challenges on a global scale, an organization needs to look for a suitable solution.

    To integrate the worldwide marketing, manufacturing, and distribution facilities based on a core ERP with a variety of information systems, an organization needs to come up with a strategic deployment of integration technology products and integration service capabilities.

     

    Business Scenario

    Now take the example of ABC Manufacturing Company, whose success rests on the strength of its Europe-wide trading relationships. The company recognizes the need to strengthen these relationships by processing orders faster and more efficiently than ever before.

    The company needed a new platform that could integrate orders from several countries, accepting payments in multiple currencies and translating measurements according to each country’s standards. The bottom line for ABC’s e-strategy was to accelerate order processing; to achieve this, the basic necessity was to eliminate multiple collections of data and the use of invalid data.

    By using less paper, ABC would cut processing costs and speed up the information flow. Keeping this long term goal in mind, ABC Manufacturing Company can now think of integrating its four key countries into a new business-to-business (B2B) platform.

     

    Here is another example: XYZ Marketing Company. Users visit this company’s website to explore a variety of products; the company serves thousands of customers all over the world. Now, this company always understood that it could offer greater benefits to customers if it could more efficiently integrate its customers’ back-end systems. With such integration, customers could enjoy the advantages of highly efficient e-commerce sites, where a visitor on the Web could place an order that would flow smoothly from the website to the customer’s order entry system.

     

    Some of those back-end order entry systems are built on the latest, most sophisticated enterprise resource planning (ERP) systems on the market, while others are built on legacy systems that have never been upgraded. Different customers require information formatted in different ways, but XYZ has no elegant way to transform the information coming out of the website to meet customer needs. With the traditional approach:

    For each new e-commerce customer on the site, XYZ’s staff needs to work for significant amounts of time creating a transformation application that would facilitate the exchange of information. With a better approach, XYZ needs a robust messaging solution that would provide the flexibility and agility to meet a range of customer needs quickly and effectively. Now XYZ, too, can think of integrating its customers’ back-end systems with the help of a business-to-business (B2B) platform.

     

    Environment

    Many large scale organizations maintain a centralized SAP environment as its core enterprise resource planning (ERP) system. The SAP system is used for the management and processing of all global business processes and practices. B2B integration mainly relies on the asynchronous messaging, Electronic Data Interchange (EDI) and XML document transformation mechanisms to facilitate the transformation and exchange of information between any ERP System and other applications including legacy systems.

    For business document routing, transformation, and tracking, the existing SAP-XML/EDI technology road map needs an XML service engine. This will allow development of a complex set of mappings from and to SAP to meet internal and external XML/EDI technology and business strategy. Microsoft BizTalk Server is the best choice to handle the data interchange and mapping requirements. BizTalk Server has the most comprehensive development and management support among business-to-business platforms. Microsoft BizTalk Server and BizTalk XML Framework version 2.0 with Simple Object Access Protocol (SOAP) version 1.1 provide precisely the kind of messaging solution that is needed to facilitate integration in a cost-effective manner.

     

    Document Flow

    Friends, now let’s look at the actual flow of a document from the source system to the customer target system using BizTalk Server. When a document is created, it is sent to a TCP/IP-based Application Link Enabling (ALE) port, a BizTalk-based receive function that is used for XML conversion. Then the document passes the XML to a processing script (VBScript) that is running as a BizTalk Application Integration Component (AIC). The following figure shows how BizTalk Server acts as a hub between applications that reside in two different organizations:

    The data is serialized to the customer/vendor XML format using the Extensible Stylesheet Language Transformations (XSLT) generated from the BizTalk Mapper using a BizTalk channel. The XML document is sent using synchronous Hypertext Transfer Protocol Secure (HTTPS) or another requested transport protocol such as the Simple Mail Transfer Protocol (SMTP), as specified by the customer.

    The following figure shows steps for XML document transformation:

    The total serialized XML result is passed back to the processing script that is running as a BizTalk AIC. An XML “receipt” document then is created and submitted to another BizTalk channel that serializes the XML status document into a SAP IDOC status message. Finally, a Remote Function Call (RFC) is triggered to the SAP instance/client using a compiled C++/VB program to update the SAP IDOC status record. A complete loop of document reconciliation is achieved. If the status is not successful, an e-mail message is created and sent to one of the Support Teams that own the customer/vendor business XML/EDI transactions so that the conflict can be resolved. All of this happens instantaneously in a completely event-driven infrastructure between SAP and BizTalk.

    Integration Steps

    Let’s talk about a very popular Order Entry and tracking scenario while discussing integration hereafter. The following sections describe the high-level steps required to transmit order information from Order Processing pipeline Component into the SAP/R3 application, and to receive order status update information from the SAP/R3 application.

    The integration of AFS purchase order reception with SAP is achieved using the BizTalk Adapter for SAP (BTS-SAP). The IDOC handler is used by the BizTalk Adapter to provide the transactional support for bridging tRFC (Transactional Remote Function Calls) to MSMQ DTC (Distributed Transaction Coordinator). The IDOC handler is a COM object that processes IDOC documents sent from SAP through the Com4ABAP service, and ensures their successful arrival at the appropriate MSMQ destination. The handler supports the methods defined by the SAP tRFC protocol. When integrating purchase order reception with the SAP/R3 application, BizTalk Server (BTS) provides the transformation and messaging functionality, and the BizTalk Adapter for SAP provides the transport and routing functionality.

    The following two sequential steps indicate how the whole integration takes place:

    • Purchase order reception integration
    • Order Status Update Integration

    Purchase Order Reception Integration

    1. Suppose a new pipeline component is added to the Order Processing pipeline. This component creates an XML document that is equivalent to the OrderForm object that is passed through the pipeline. This XML purchase order is in Commerce Server Order XML v1.0 format and, once created, is sent to a special Microsoft Message Queuing (MSMQ) queue created specifically for this purpose. Writing the order from the pipeline to MSMQ:

      The first step in sending order data to the SAP/R3 application involves building a new pipeline component to run within the Order Processing pipeline. This component must perform the following two tasks:

      A] Make an XML-formatted copy of the OrderForm object that is passing through the order processing pipeline. The GenerateXMLForDictionaryUsingSchema method of the DictionaryXMLTransforms object is used to create the copy.

      Private Function IPipelineComponent_Execute(ByVal objOrderForm As Object, _
          ByVal objContext As Object, ByVal lFlags As Long) As Long
      
      On Error GoTo ERROR_Execute
      
      Dim oXMLTransforms As Object
      Dim oXMLSchema As Object
      Dim oOrderFormXML As Object
      
      ' Return 1 for Success.
      IPipelineComponent_Execute = 1
      
      ' Create a DictionaryXMLTransforms object.
      Set oXMLTransforms = CreateObject("Commerce.DictionaryXMLTransforms")
      
      ' Create a PO schema object.
      Set oXMLSchema = oXMLTransforms.GetXMLFromFile(sSchemaLocation)
      
      ' Create an XML version of the order form.
      Set oOrderFormXML = oXMLTransforms.GenerateXMLForDictionaryUsingSchema _
          (objOrderForm, oXMLSchema)
      
      WritePO2MSMQ sQueueName, oOrderFormXML.xml, PO_TO_ERP_QUEUE_LABEL, _
          sBTSServerName, AFS_PO_MAXTIMETOREACHQUEUE
      
      Exit Function
      
      ERROR_Execute:
      App.LogEvent "QueuePO.CQueuePO -> Execute Error: " & _
      vbCrLf & Err.Description, vbLogEventTypeError
      
      ' Set warning level.
      IPipelineComponent_Execute = 2
      Resume Next
      
      End Function

      B] Send the newly created XML order document to the MSMQ queue defined for this purpose.

      Option Explicit
      
      ' MSMQ constants.
      
      ' Access modes.
      Const MQ_RECEIVE_ACCESS = 1
      Const MQ_SEND_ACCESS = 2
      Const MQ_PEEK_ACCESS = 32
      
      ' Sharing modes.
      Const MQ_DENY_NONE = 0
      Const MQ_DENY_RECEIVE_SHARE = 1
      
      ' Transaction options.
      Const MQ_NO_TRANSACTION = 0
      Const MQ_MTS_TRANSACTION = 1
      Const MQ_XA_TRANSACTION = 2
      Const MQ_SINGLE_MESSAGE = 3
      
      ' Error codes.
      Const MQ_ERROR_QUEUE_NOT_EXIST = -1072824317
      
      ' MQ message acknowledgement options.
      Const MQMSG_ACKNOWLEDGMENT_FULL_REACH_QUEUE = 5
      Const MQMSG_ACKNOWLEDGMENT_FULL_RECEIVE = 14
      
      ' Default time (in seconds) a message may take to reach the queue.
      Const DEFAULT_MAX_TIME_TO_REACH_QUEUE = 20
      
      Function WritePO2MSMQ(sQueueName As String, sMsgBody As String, _
          sMsgLabel As String, sServerName As String, _
          Optional MaxTimeToReachQueue As Variant) As Long
      
      Dim lMaxTime As Long
      
      If IsMissing(MaxTimeToReachQueue) Then
      lMaxTime = DEFAULT_MAX_TIME_TO_REACH_QUEUE
      Else
      lMaxTime = MaxTimeToReachQueue
      End If
      
      Dim objQueueInfo As MSMQ.MSMQQueueInfo
      Dim objQueue As MSMQ.MSMQQueue, objAdminQueue As MSMQ.MSMQQueue
      Dim objQueueMsg As MSMQ.MSMQMessage
      
      On Error GoTo MSMQ_Error
      
      Set objQueueInfo = New MSMQ.MSMQQueueInfo
      objQueueInfo.FormatName = "DIRECT=OS:" & sServerName & "\PRIVATE$\" & sQueueName
      
      Set objQueue = objQueueInfo.Open(MQ_SEND_ACCESS, MQ_DENY_NONE)
      
      Set objQueueMsg = New MSMQ.MSMQMessage
      
      objQueueMsg.Label = sMsgLabel ' Set the message label property
      objQueueMsg.Body = sMsgBody ' Set the message body property
      objQueueMsg.Ack = MQMSG_ACKNOWLEDGMENT_FULL_REACH_QUEUE
      objQueueMsg.MaxTimeToReachQueue = lMaxTime
      
      objQueueMsg.send objQueue, MQ_SINGLE_MESSAGE
      
      objQueue.Close
      
      On Error Resume Next
      Set objQueueMsg = Nothing
      Set objQueue = Nothing
      Set objQueueInfo = Nothing
      
      Exit Function
      
      MSMQ_Error:
      App.LogEvent "Error in WritePO2MSMQ: " & Error
      Resume Next
      
      End Function
      
    2. A BTS MSMQ receive function picks up the document from the MSMQ queue and sends it to a BTS channel that has been configured for this purpose. Receiving the XML order from MSMQ: The second step in sending order data to the SAP/R3 application involves BTS receiving the order data from the MSMQ queue into which it was placed at the end of the first step. You must configure a BTS MSMQ receive function to monitor the MSMQ queue to which the XML order was sent in the previous step. This receive function forwards the XML message to the configured BTS channel for transformation.
    3. The third step in sending order data to the SAP/R3 application involves BTS transforming the order data from Commerce Server Order XML v1.0 format into ORDERS01 IDOC format. A BTS channel must be configured to perform this transformation. After the transformation is complete, the BTS channel sends the resulting ORDERS01 IDOC message to the corresponding BTS messaging port. The BTS messaging port is configured to send the transformed message to an MSMQ queue called the 840 Queue. Once the message is placed in this queue, the BizTalk Adapter for SAP is responsible for further processing. 
    4. BizTalk Adapter for SAP sends the ORDERS01 document to the DCOM Connector (get more information on the DCOM Connector from www.sap.com/bapi), which writes the order to the SAP/R3 application. The DCOM Connector is an SAP software product that provides a mechanism to send data to, and receive data from, an SAP system. When an IDOC message is placed in the 840 Queue, the DCOM Connector retrieves the message and sends it to SAP for processing. Although this processing is in the domain of the BizTalk Adapter for SAP, the steps involved are reviewed here as background information:
      • Determine the version of the IDOC schema in use and generate a BizTalk Server document specification.
      • Create a routing key from the contents of the Control Record of the IDOC schema.
      • Request a SAP Destination from the Manager Data Store given the constructed routing key.
      • Submit the IDOC message to the SAP System using the DCOM Connector 4.6D Submit functionality.

    Order Status Update Integration

    Order status update integration can be achieved by providing a mechanism for sending information about updates made within the SAP/R3 application back to the Commerce Server order system.

    The following sequence of steps describes such a mechanism:

    1. BizTalk Adapter for SAP processing:
      After a user has updated a purchase order using the SAP client, and the IDOC has been submitted to the appropriate tRFC port, the BizTalk Adapter for SAP uses the DCOM connector to send the resulting information to the 840 Queue, packaged as an ORDERS01 IDOC message. The 840 Queue is an MSMQ queue into which the BizTalk Adapter for SAP places IDOC messages so that they can be retrieved and processed by interested parties. This process is within the domain of the BizTalk Adapter for SAP, and is used by this solution to achieve the order update integration.
    2. Receiving the ORDERS01 IDOC message from MSMQ:
      The second step in updating order status from the SAP/R3 application involves BTS receiving ORDERS01 IDOC message from the MSMQ queue (840 Queue) into which it was placed at the end of the first step. You must configure a BTS MSMQ receive function to monitor the 840 Queue into which the XML order status message was placed. This receive function must be configured to forward the XML message to the configured BTS channel for transformation.
    3. Transforming the order update from IDOC format:
      Using a BTS MSMQ receive function, the document is retrieved and passed to a BTS transformation channel. The BTS channel transforms the ORDERS01 IDOC message into Commerce Server Order XML v1.0 format, and then forwards it to the corresponding BTS messaging port. You must configure a BTS channel to perform this transformation. The following BizTalk Server (BTS) map, used in the prototyping of this solution, transforms an SAP ORDERS01 IDOC message into an XML document in Commerce Server Order XML v1.0 format. It allows a change to an order in the SAP/R3 application to be reflected in the Commerce Server orders database.

      The map used in the prototype maps only the order ID, demonstrating how the order in the SAP/R3 application can be synchronized with the order in the Commerce Server orders database. The mapping of other fields is specific to a particular implementation and was not done for the prototype.

    <xsl:stylesheet xmlns:xsl='http://www.w3.org/1999/XSL/Transform' 
    xmlns:msxsl='urn:schemas-microsoft-com:xslt' xmlns:var='urn:var' 
    xmlns:user='urn:user' exclude-result-prefixes='msxsl var user' 
    version='1.0'>
    <xsl:output method='xml' omit-xml-declaration='yes' />
    <xsl:template match='/'>
    <xsl:apply-templates select='ORDERS01'/>
    </xsl:template>
    <xsl:template match='ORDERS01'>
    <orderform>
    
    <!-- Connection from source node "BELNR" to destination node "OrderID" -->
    
    <xsl:if test='E2EDK02/@BELNR'>
    <xsl:attribute name='OrderID'>
    <xsl:value-of select='E2EDK02/@BELNR'/>
    </xsl:attribute>
    </xsl:if>
    </orderform>
    </xsl:template>
    </xsl:stylesheet>

    The BTS message port posts the transformed order update document to the configured ASP page for further processing. The configured ASP page retrieves the message posted to it and uses the Commerce Server OrderGroupManager and OrderGroup objects to update the order status information in the Commerce Server orders database.

    4. Updating the Commerce Server order system:
      The fourth step in updating order status from the SAP/R3 application involves updating the Commerce Server order system to reflect the change in status. This is accomplished by adding the page _OrderStatusUpdate.asp to the AFS Solution Site and configuring the BTS messaging port to post the transformed XML document to that page. The update is performed using the Commerce Server OrderGroupManager and OrderGroup objects.

    The routine ProcessOrderStatus is the primary routine in the page. It uses the DOM and XPath to extract enough information to find the appropriate order using the OrderGroupManager object. Once the correct order is located, it is loaded into an OrderGroup object so that any of the entries in the OrderGroup object can be updated as needed.

    The following code implements page _OrderStatusUpdate.asp:

    <%@ Language="VBScript" %>
    
    <%
    const TEMPORARY_FOLDER = 2
    
    call Main()
    
    Sub Main()
    call ProcessOrderStatus( ParseRequestForm() )
    End Sub
    
    Sub ProcessOrderStatus(sDocument)
    
    Dim oOrderGroupMgr 
    Dim oOrderGroup 
    Dim rs
    Dim sPONum
    Dim oAttr 
    Dim vResult
    Dim vTracking 
    Dim oXML
    Dim dictConfig
    Dim oElement
    
    Set oOrderGroupMgr = Server.CreateObject("CS_Req.OrderGroupManager")
    Set oOrderGroup = Server.CreateObject("CS_Req.OrderGroup")
    
    Set oXML = Server.CreateObject("MSXML.DOMDocument")
    oXML.async = False
    
    If oXML.loadXML (sDocument) Then
    
    ' Get the orderform element.
    Set oElement = oXML.selectSingleNode("/orderform")
    
    ' Get the poNum.
    sPONum = oElement.getAttribute("OrderID")
    
    Set dictConfig = Application("MSCSAppConfig").GetOptionsDictionary("")
    
    ' Use ordergroupmgr to find the order by OrderID.
    oOrderGroupMgr.Initialize (dictConfig.s_CatalogConnectionString)
    Set rs = oOrderGroupMgr.Find(Array("order_requisition_number='" & sPONum & "'"), _
        Array(""), Array(""))
    
    If rs.EOF And rs.BOF Then
    'Create a new one. - Not implemented in this version.
    Else
    ' Edit the current one.
    oOrderGroup.Initialize dictConfig.s_CatalogConnectionString, rs("User_ID")
    
    ' Load the found order.
    oOrderGroup.LoadOrder rs("ordergroup_id")
    
    ' For the purposes of prototype, we only update the status
    oOrderGroup.Value.order_status_code = 2 ' 2 = Saved order
    
    ' Save it
    vResult = oOrderGroup.SaveAsOrder(vTracking)
    
    End If
    Else
    WriteError "Unable to load received XML into DOM."
    End If
    
    End Sub
    
    Function ParseRequestForm()
    
    Dim PostedDocument
    Dim ContentType
    Dim CharSet
    Dim EntityBody
    Dim Stream
    Dim StartPos
    Dim EndPos
    
    ContentType = Request.ServerVariables( "CONTENT_TYPE" )
    
    ' Determine request entity body character set (default to us-ascii).
    CharSet = "us-ascii"
    StartPos = InStr( 1, ContentType, "CharSet=""", 1)
    If (StartPos > 0 ) then
    StartPos = StartPos + Len("CharSet=""")
    EndPos = InStr( StartPos, ContentType, """",1 )
    CharSet = Mid (ContentType, StartPos, EndPos - StartPos )
    End If
    
    ' Check for multipart MIME message.
    PostedDocument = ""
    
    if ( ContentType = "" or Request.TotalBytes = 0) then
    
    ' Content-Type is required as well as an entity body.
    Response.Status = "406 Not Acceptable"
    Response.Write "Content-type or Entity body is missing" & VbCrlf
    Response.Write "Message headers follow below:" & VbCrlf
    Response.Write Request.ServerVariables("ALL_RAW") & VbCrlf
    Response.End
    Else
    If ( InStr( 1, ContentType, "multipart/" ) > 0 ) Then
    ' Multipart MIME messages are not handled by this page.
    Response.Status = "406 Not Acceptable"
    Response.Write "Multipart MIME content is not supported" & VbCrlf
    Response.End
    Else
    ' The remainder of this routine is an assumed, minimal completion: read the
    ' entity body and convert it to a string using the detected character set.
    EntityBody = Request.BinaryRead( Request.TotalBytes )
    Set Stream = Server.CreateObject("ADODB.Stream")
    Stream.Type = 1 ' adTypeBinary
    Stream.Open
    Stream.Write EntityBody
    Stream.Position = 0
    Stream.Type = 2 ' adTypeText
    Stream.Charset = CharSet
    PostedDocument = Stream.ReadText
    Stream.Close
    Set Stream = Nothing
    End If
    End If
    
    ParseRequestForm = PostedDocument
    
    End Function
    %>

    .NET Support

    This Multi-Tier Application Environment can be implemented successfully with the help of Web portal which utilizes the Microsoft .NET Enterprise Server model. The Microsoft BizTalk Server Toolkit for Microsoft .NET provides the ability to leverage the power of XML Web services and Visual Studio .NET to build dynamic, transaction-based, fault-tolerant systems with full access to existing applications.

    Summary

    Microsoft BizTalk Server can help organizations quickly establish and manage Internet relationships with other organizations. It makes it possible for them to automate document interchange with any other organization, regardless of the conversion requirements and data formats used. This provides a cost-effective approach for integrating business processes across large Enterprise Resource Planning systems. The integration process is designed to facilitate collaborative e-commerce business processes. It includes a document interchange engine, a business process execution engine, and a set of business document and server management tools. In addition, business document editor and mapper tools are provided for managing trading partner relationships, administering server clusters, and tracking transactions.


    All my Web Parts and Apps are now making use of Knockout.JS !! Template also available at very low price!!

    After completing the development of my latest Web Part, the “List Search” Web Part, I decided to update all my Web Parts and Apps to use Knockout.JS, starting with the “List Search” Web Part.

    This topic came up when I looked at some of my older products that include generic list and library web parts which display a few common fields like ID, Title, Description, File Url etc. Prior to this we solved similar requirements with OOB list and library web parts and custom XSLT, by creating a Visual Studio web part for branding purposes only, or by using the Imtech content query web part (which is an XSLT solution by design).

    In the end, clients hated the XSLT solutions and I hated creating a new web part for every new list or library. That’s where Knockout popped up: why not use Knockout for templates instead of XSLT?

    I’ll assume that whoever reads this article knows about creating a web part for SharePoint, SharePoint modules, JavaScript and HTML, so I will not go into those details.

    Background

    A bit about Knockout

    From Knockout web site: “Knockout is a JavaScript library that helps you to create rich, responsive display and editor user interfaces with a clean underlying data model. “

    From Wikipedia:

    Knockout is a standalone JavaScript implementation of the Model-View-ViewModel pattern with templates. The underlying principles are therefore:

    • a clear separation between domain data, view components and data to be displayed
    • the presence of a clearly defined layer of specialized code to manage the relationships between the view components

    Knockout includes the following features:

    • Declarative bindings
    • Automatic UI refresh (when the data model’s state changes, the UI updates automatically)
    • Dependency tracking
    • Templating (using a native template engine although other templating engines can be used, such as jquery.tmpl)

    So what’s the deal?

    First you have your view model:

     var myViewModel = {
         personName: 'Bob',
         personAge: 123
    };

    Then you have a view:

    The name is <span data-bind="text:personName"></span>

    At the end just bind your view to model

     ko.applyBindings(myViewModel);

    We’ll talk about model later.

    Using the code

    Proof of concept

    I’ve created an HTML mock of our web part. This is useful because we can prepare the JavaScript, css files, models and views in advance and test them without SharePoint and Visual Studio.

    You can download the proof of concept as a separate download from the link above.

    References

    There are only two file references.

    One is the Knockout library itself:

    <script type='text/javascript' src="http://knockoutjs.com/downloads/knockout-3.0.0.js"></script>

    and the other is the css file I’ve added to this project:

    <link href="css/controls.css" rel="stylesheet" type="text/css" />

    Model 

    I’ve designed the model as an Item class. Here it is:

    // Item class definition
    var Item = function (id, title, datecreated,url,description,thumbnail) {
       this.id = id;
       this.title = title;
       this.datecreated = datecreated;
       this.url=url;
       this.description=description;
       this.thumbnail=thumbnail;
    }

    It’s called Item and it has 6 properties:

    1. id – ID of the item
    2. title – Title of the item
    3. datecreated – Creation date of the item
    4. url – Url of the item
    5. description – Description of the item
    6. thumbnail – Thumbnail of the item

     

    View model

    Here is the view model

    function viewModel1 (){
        var self = this;
        self.items = [
            new Item(2, 'News1 title','21.10.2013','javascript:OpenDialog(2);',
                     'Description News 1','img/pic1.jpg'),
            new Item(1, 'News 2 title','21.02.2013','javascript:OpenDialog(1);',
                     'Description News 2','img/pic2.jpg')
        ];
    }

    The view model has a property items, which is in fact a collection of Item objects. For mocking purposes we’ve added two Item objects to this collection (News 1 and News 2).

     

    View

    Here is the view:

    <div class="glwp glwp-central" id="k1">
      <div class="glwpLine"></div>
      <h5><img src="PublishingImages/siteIcon.png" 
              width="28" height="28" align="absmiddle" />
          News</h5>
      <div class="glwpLineGrey"></div>
        <ul data-bind="foreach:items">
          <li>
           <div class="glwpDate"><span data-bind="text: datecreated" ></span>
           <img class="glwpImage" data-bind="attr: { src: thumbnail }" />         
           </div>
           <div class="glwpText glwpText-central" >
            <a data-bind="attr: { href: url, title: title }" style="min-height:70px;">
             <span class="glwpTextTTL" data-bind="text:title"></span><br />
             <span data-bind="text: description"></span>
            </a>
           </div>
           <div class="glwpSep"></div>
          </li>
        </ul>
    </div>

    What we have here:

    It’s pretty simple. We have an unordered list bound to our model. One <li> element is created for every item of our items collection (data-bind="foreach: items").

     

     

    Property binding: 

    • <span data-bind="text: datecreated"></span> – This is the simplest data binding. It writes the datecreated property of the Item object as the text of the span element (like: <span>11/11/2013</span>).
    • <img class="glwpImage" data-bind="attr: { src: thumbnail }" /> – This is a slightly more complicated binding. It takes the thumbnail property of the item object and writes it to the src attribute of the img element.
    • <a data-bind="attr: { href: url, title: title }" style="min-height:70px;"> – This takes the url property and writes it as the href attribute of the a element, and the title property as the title attribute.
    • <span class="glwpTextTTL" data-bind="text:title"></span> – The title property is written as the text of the span element.
    • <span data-bind="text: description"></span> – The description property is written as the text of the span element.

    So anyone with a little knowledge of HTML and css can customize this template any way (s)he likes, as long as (s)he provides the required properties.

     

    Binding

    ko.applyBindings(viewModel1,document.getElementById('k1'));

    Note the second parameter in the applyBindings method: document.getElementById('k1'). The same id is on the first div in our view (id="k1"). This is helpful if you want to have more than one view model on one page. It tells Knockout to bind this specific model (viewModel1) to a specific template on our page (k1).

     

    What do we get from this? We are going to create a web part from this code, and one of the web part features is that you can put the same web part several times on the same page. So it would be possible to put one web part on a SharePoint page to display news and another one to display projects or documents, and they will coexist.

    If you look at the source you will notice that we have two view models (viewModel1 and viewModel2) and two templates (k1 and k2), and two bindings of course. One binding is for news (with images and descriptions) and one binding is for files (no images and no descriptions). The templates are slightly different.

    Final result

    Here is the final result

    SharePoint Part

    As I said, I will assume that you have some experience with SharePoint development, so I will not explain how to create the project and add project items. The project type is the standard Visual Studio 2010 SharePoint Empty Project template.

    The SharePoint part consists of the following items:

    • Web part item – KnockoutWp. A standard SharePoint Visual Web Part project item.
    • Assets module. A SharePoint module project item. We are going to use it for deploying images and css files (0.png – an empty container for images, and controls.css – the css file for our projects).
    • Layouts mapped folder. We’ll put the editor page for the template here.

    And here is the solution explorer for the project:

    Assets

    We are going to deploy 2 files:

    • 0.png – 1×1 pixel transparent image aka placeholder
    • Controls.css – css file for our template

    Both of these items are going to be deployed to the Style Library of the SharePoint site collection, so content editors may change them later without the need for solution redeployment.

    Here is the elements.xml file:

    So our assets will end up in the http://oursitecollectionurl/Style Library/wp folder.

    KnockoutWp

    This is Visual Studio 2010 Visual Web part.

    It consists of 4 items:

    • KnockoutWp.cs – web part class
    • KnockoutWpUserControl – User control of our web part
    • KnockoutWp.webpart – web part xml file
    • Elements.xml – manifest file

    Properties

    The web part has the following properties (a sketch of how such a property can be declared follows the list):

    • ListUrl (string, required) – URL of the list we are displaying.
    • TitleField (string, optional) – display name of the field that is displayed as the Title. If it’s blank, the Title field is used.
    • DateField (string, optional) – display name of the field that is displayed as the date. If it’s blank, the Created field is used.
    • DescriptionField (string, optional) – display name of the field that is displayed as the Description. If it’s blank, it is omitted.
    • ImageField (string, optional) – display name of the field that is displayed as the Thumbnail picture. If it’s blank, it is omitted.
    • NoOfItems (int) – how many items from the list are displayed.
    • ItemTemplate (string) – html template of the web part. Defines the look of our web part.
    • WpPosition (enum) – used for three-column layouts. The web part has styles for three zones: right, central and left. The difference is in width, padding and margin. Everything is set in css so you can adapt it to your environment.
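
    Such properties are usually exposed as attributed public properties on the web part class. The following is a minimal sketch, assuming the standard ASP.NET web part attributes; the attribute choices and the category text are illustrative and not taken from the actual source of the web part.

    // Minimal sketch (assumed attributes, not the actual web part code) of how one of
    // the listed properties, ListUrl, can be exposed so that it shows up in the web
    // part edit panel.
    using System.ComponentModel;
    using System.Web.UI.WebControls.WebParts;
    
    public class KnockoutWp : WebPart
    {
        [WebBrowsable(true)]
        [WebDisplayName("List Url")]
        [WebDescription("URL of the list that the web part displays")]
        [Personalizable(PersonalizationScope.Shared)]
        [Category("Knockout Settings")]
        public string ListUrl { get; set; }
    }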

    On the picture below you can see the mapping between the Field properties of the web part and the list item fields.

     

    EditorPart

    I’ve added one more thing to this web part: an EditorPart class, GenericListPartEditorPart. I’m not going to go deep into editor parts, but here is some quick info. When you create a public property for a web part it is automatically displayed in the web part edit panel.

    And that is a great concept when you need simple properties such as strings, numbers and short lists. If you want a more complicated scenario (as we do here for our web part) it’s not enough.

    What I wanted here is a template editor. It could be reasonably large, so the idea was to have a button in the web part edit panel that would open a large dialog window with the editor. The user would work with the template, click Apply, and thereby change the ItemTemplate web part property.
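
    The GenericListPartEditorPart code is not listed here, so the following is only a minimal sketch of the idea under assumptions: the class and property names come from the text above, while the control layout, the dialog wiring and the method bodies are illustrative only.

    // Minimal sketch (assumed implementation details): an editor part that lets the
    // user edit the template in a large dialog and writes the result back into the
    // web part's ItemTemplate property when Apply is clicked.
    using System.Web.UI.WebControls;
    using System.Web.UI.WebControls.WebParts;
    
    public class GenericListPartEditorPart : EditorPart
    {
        private TextBox templateBox;
    
        protected override void CreateChildControls()
        {
            // Holds the template text; the dialog (portal_openTemplateEditor, shown in
            // the markup below) is expected to write its result back into this control.
            templateBox = new TextBox { TextMode = TextBoxMode.MultiLine, Rows = 10 };
            Controls.Add(templateBox);
    
            var openEditor = new Button { Text = "Edit template" };
            openEditor.OnClientClick =
                "portal_openTemplateEditor('" + WebPartToEdit.ID + "'); return false;";
            Controls.Add(openEditor);
        }
    
        public override void SyncChanges()
        {
            // Copy the current template from the web part into the editor UI.
            EnsureChildControls();
            var wp = WebPartToEdit as KnockoutWp;
            if (wp != null) templateBox.Text = wp.ItemTemplate;
        }
    
        public override bool ApplyChanges()
        {
            // Copy the edited template back into the web part property.
            EnsureChildControls();
            var wp = WebPartToEdit as KnockoutWp;
            if (wp != null) wp.ItemTemplate = templateBox.Text;
            return true;
        }
    }
    
    In a full implementation the KnockoutWp web part would return this editor part from its CreateEditorParts override.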

    Template editor KnockoutWpUserControl

    This is the user control created by Visual Studio when we added the Visual Web Part project item to the project. It consists of a markup .ascx file and a code-behind .ascx.cs file. We will put our markup and our C# code here.

    Markup

    Here is the complete markup:

    <script type='text/javascript' src="http://knockoutjs.com/downloads/knockout-3.0.0.js">
    </script>
    <style type="text/css">  @import url("/Style Library/wp/controls.css");  </style>  
    <div class="glwp glwp-<%=PositionClass %>" id="k<%=WpId %>">
      <div class="glwpLine"></div>      
      <h5><img src="<%=Icon %>" width="28" 
        height="28" align="absmiddle"><%=Title %></h5>
        <div class="glwpLineGrey"></div>      
      <asp:Literal ID="LitLayout" runat="server"></asp:Literal>
    </div>  
    
    <script type="text/javascript">    
      function OpenDialog(Url) {
        var options = SP.UI.$create_DialogOptions();        
        options.resizable = 1;        
        options.scroll = 1;        
        options.url = Url;
        SP.UI.ModalDialog.showModalDialog(options);    
    }         
    // Item class         
      var Item = function (id, title, datecreated,url,description,thumbnail) {            
         this.id = id;            
         this.title = title;
         this.datecreated = datecreated;
         this.url=url;
         this.description=description;
         this.thumbnail=thumbnail;
      }         
     //ViewModel goes here (It's created on server)        
     <asp:Literal runat="server" ID="LitItems"></asp:Literal>
     
    //Function that opens Template editor. Used only in edit mode of web part       
     function portal_openTemplateEditor(wpid) {       
      var val="";              
      var options = SP.UI.$create_DialogOptions();              
      options.width = 600;             
      options.height = 500;                
      options.url = "/_layouts/KnockoutTemplate/TemplateEditor.aspx?c="+wpid;//"";
      options.dialogReturnValueCallback =
               Function.createDelegate(null,portal_openTemplateEditorClosedCallback);
      SP.UI.ModalDialog.showModalDialog(options);
    }
    </script>

    The first section of the markup (picture below) has the script reference (Knockout, on a remote server) and the style reference (controls.css in the local Style Library). Below that is the html markup that defines the container of the web part (top and bottom borders, width, icon and title). The markup is not the cleanest because I was a little lazy and left some public properties in it. Note <%=PositionClass %>, <%=WpId %> and so on.

    These are all public properties of the user control and they are used for presentation:

    • PositionClass – depending on the WpPosition web part property (right, central or left), adds the appropriate css class to the markup and in that way defines the width, padding and margin of the web part.
    • WpId – the guid of the web part. It is used to uniquely identify the web part, because we can put several web parts of the same type on a page and everything would crash without this identifier.
    • Icon – the url of the icon displayed on the web part. The web part property Title Icon Image URL is used here (this is an OOB property).
    • Title – the title text of the web part, i.e. the text that was entered in the title area of the web part. The web part property Title is used here (this is an OOB property).

    The last interesting thing here is the Literal control LitLayout. This control holds our ItemTemplate property (the html template of our web part).

    The second section is a JavaScript function that opens a list item in a dialog window. It is used when the underlying list is not a document library.

    The third section consists of the Knockout view model (JavaScript). The Item class definition is self-explanatory (it defines the 6 properties only). The rest of the model is created on the server side, so for now there is only the LitItems Literal control here.

    The fourth section is just a JavaScript function that is used when editing web part properties. This function opens the template editor in a dialog window.

    Code

    Properties:

    • Properties from web part
      • Icon – url to the icon
      • Title – title of the web part
      • ListUrl – url to the list
      • TitleField – Title field in the list
      • DateField – Date field in the list
      • ImageField – Image field in the list
      • DescriptionField – Description field in the list
      • NoOfItems – number of items to return
      • Position – position of the web part (right, left or central)
      • ItemTemplate – html template of the web part
      • WpId – guid id of the web part
    • UC’s properties
      • PositionClass – css class based on position
      • ColumnMap – dictionary that holds internal names of the list item fields.

    Methods: The file has only one method, Page_Load. The code executes with elevated privileges.

    In that method we:

    1. Resolve the list by the supplied URL (ListUrl property): SPList annList = annWeb.GetList(ListUrl);
    2. Get the internal names of the list columns by their display names: SpHelper.GetFieldsInternals(annWeb, annList.Title, TitleField, DateField, DescriptionField, ImageField, columnMap);
    3. Create the CAML query: SpHelper.GetGenericQuery(annList, q, NoOfItems);
    4. Execute it
    5. Iterate over the SPListItemCollection (coll) and create the required JavaScript (see the sketch below)
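
    The full code-behind is not reproduced here, so this is only a minimal sketch of what such a Page_Load could look like. The SpHelper calls follow the signatures shown in the list above; everything else (the GetFieldValue signature, the use of item.Url, the exact JavaScript that is emitted, the omitted LitLayout/ItemTemplate handling) is an assumption for illustration only, and using directives for Microsoft.SharePoint and System.Text are assumed.

    protected void Page_Load(object sender, EventArgs e)
    {
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite site = new SPSite(SPContext.Current.Site.ID))
            using (SPWeb annWeb = site.OpenWeb(SPContext.Current.Web.ID))
            {
                // 1. Resolve the list by the supplied URL.
                SPList annList = annWeb.GetList(ListUrl);
    
                // 2. Map the configured display names to internal names (fills columnMap).
                SpHelper.GetFieldsInternals(annWeb, annList.Title, TitleField, DateField,
                    DescriptionField, ImageField, columnMap);
    
                // 3. + 4. Build and execute the CAML query.
                SPQuery q = new SPQuery();
                SpHelper.GetGenericQuery(annList, q, NoOfItems);
                SPListItemCollection coll = annList.GetItems(q);
    
                // 5. Emit the Knockout view model as JavaScript into the LitItems literal.
                //    (Assumes the optional field properties are set; real code should
                //    also JavaScript-encode the field values.)
                StringBuilder sb = new StringBuilder();
                sb.Append("function viewModel1(){ var self = this; self.items = [");
                foreach (SPListItem item in coll)
                {
                    sb.AppendFormat("new Item({0},'{1}','{2}','{3}','{4}','{5}'),",
                        item.ID,
                        SpHelper.GetFieldValue(item, columnMap[TitleField]),
                        SpHelper.GetFieldValue(item, columnMap[DateField]),
                        item.Url,
                        SpHelper.GetFieldValue(item, columnMap[DescriptionField]),
                        SpHelper.GetFieldValue(item, columnMap[ImageField]));
                }
                sb.Append("]; }");
                LitItems.Text = "<script type='text/javascript'>" + sb + "</script>";
            }
        });
    }
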
    Helper class

    SPHelper is a helper class; you can find it in the Helpers directory.

    It has 3 responsibilities:

    1. To retrieve the list columns’ internal names based on the supplied list column display names (the WP properties TitleField, DateField, ImageField and DescriptionField) – GetFieldsInternals method (see the sketch below)
    2. To create the CAML query for retrieving list items – GetGenericQuery method
    3. To retrieve values from SharePoint columns based on their types – GetFieldValue method
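
    As referenced in responsibility 1, here is a minimal sketch of what the display-name-to-internal-name lookup could look like. The method name and parameter order follow the call shown earlier; the body is an assumption and expects using directives for System.Collections.Generic and Microsoft.SharePoint.

    // Minimal sketch, not the actual helper implementation: resolve the internal names
    // of the configured display-name properties and store them in columnMap, keyed by
    // display name. (Blank/optional properties and error handling are glossed over.)
    public static void GetFieldsInternals(SPWeb web, string listTitle,
        string titleField, string dateField, string descriptionField,
        string imageField, Dictionary<string, string> columnMap)
    {
        SPList list = web.Lists[listTitle];
    
        foreach (string displayName in new[] { titleField, dateField, descriptionField, imageField })
        {
            if (string.IsNullOrEmpty(displayName)) continue;
    
            // SPFieldCollection.GetField accepts the display name and returns the field,
            // whose InternalName is what CAML queries and item indexers expect.
            columnMap[displayName] = list.Fields.GetField(displayName).InternalName;
        }
    }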

     

    SAP Weekend : Part 1 – ERPConnect Services for SharePoint 2010 (ECS)

    This weekend was spent completing my new “List Search Web Part” and also 2 free Web Parts that are included in the “List Web Part Pack” – more about this in a future blog post.


    In between, the “SAP bug” bit me again and I decided to write a blog post series on the various adapters I have used in SharePoint and SAP integration projects, and to give you a basic “run down” of how, and with which technologies, each adapter connects the two systems.

    ERPConnect was 1st on the list. ….

     

    Yes, I can hear the grumblings of those of us who have worked with SAP and SharePoint  Integration and the ERPConnect adapter before 🙂

    For starters, you need to have a SAP developer key to be allowed to use the SAP web service wizard, and you also need the required SAP authorizations. In other cases IT operations may not allow any modification to the SAP environment, even if it’s limited to the fully automatic generation and activation of the BAPI web service(s).

    Another reason, from a system architecture viewpoint, is that single BAPI and/or RFC calls may be of too low granularity. You actually want to perform a ‘business transaction’, consisting of multiple method invocations which must be treated as a Logical Unit of Work (LUW). SAP has introduced the concept of SAP Enterprise Services for this, and has delivered a first set of them. This is by far not complete yet, and SAP will augment it in the coming years.

    SharePoint 2010 provides developers with the capability to integrate external data sources like SAP business data via Business Connectivity Services (BCS) into the SharePoint system. The concept of BCS is based on entities and associated stereotyped operations. This is a perfect fit for flat, simply structured data sets like SAP tables.

    Another, and far more flexible, option to use SAP data in SharePoint is the ERPConnect Services for SharePoint 2010 (ECS). The product suite consists of three components: the ERPConnect Services runtime, the BCS Connector application and the Xtract PPS for PerformancePoint Services.

    The runtime provides a Service Application that integrates itself with the new service architecture of SharePoint 2010. The runtime offers a secure middle-tier layer to integrate different kinds of SAP objects into your SharePoint applications, such as tables and function modules.

    The BCS Connector application allows developers to create BDC models for the BCS Services, without programming knowledge. You may export the BDC models created by the BCS Connector to Visual Studio 2010 for further customizing.

     

    The Xtract PPS component offers a SAP data source provider for the PerformancePoint Services of SharePoint 2010.

    This article gives you an overview of the ERPConnect Services runtime and shows how you can create and incorporate business data from SAP in different SharePoint application types, like Web Parts, Application Pages or Silverlight modules.

    This article does not introduce the other components.

    Background

    This section gives you a short explanation and background of the SAP objects that can be used with ERPConnect Services. The most important objects are SAP tables and function modules. A function module is basically similar to a normal procedure in conventional programming languages. Function modules are written in ABAP, the SAP programming language, and are accessible from any other program within an SAP system. They accept import and export parameters as well as other kinds of special parameters.

     

    In addition, BAPIs (Business APIs) are special function modules that are organized within the SAP Business Object Repository. In order to use function modules with the runtime they must be marked as remote-enabled (RFC). SAP table data can also be retrieved. Tables in SAP are basically relational database tables. Other SAP objects like BW cubes or SAP Queries can be accessed via the XtractQL query language (see below).

     

    Installation & Configuration

    Installing ERPConnect Services on a SharePoint 2010 server is done by an installer and is straightforward. The SharePoint Administration service must be running on the local server (see Windows Services).

    For more information see the product documentation. After the installation has completed successfully, navigate to the Service Applications screen within the Central Administration (CA) of SharePoint:

     

    Before creating your first Service Application a Secure Store must be created, where ERPConnect Services will save SAP user credentials. In the settings page for the “Secure Store Service” create a new Target Application and name the application “ERPConnect Services”. Click on the button “Next” to define the store fields as follows:

     

    Finish the creation process by clicking on “Next” and define application administrators. Then, mark the application, click “Set Credentials” and enter the SAP user credentials:

     

    Let’s go on and create a new ERPConnect Service Application!

    Click the “ERPConnect Service Application” link in the “New” menu of the Service Applications page (see also first screenshot above). This opens a dialog to define the name of the service application, the SAP connection data and the IIS application pool:

     

    Click “Create” after entering all data and you will see the following entries in the Service Applications screen:

     

    That’s it! You are now done setting up your first ERPConnect Service Application.

    Development

    The runtime functionality covers different programming demands such as generically retrievable interface functions. The service applications are managed by the Central Administration of SharePoint. The following service and function areas are provided:

    1. Executing and retrieving data directly from SAP tables
    2. Executing SAP function modules / BAPIs
    3. Executing XtractQL query statements

    The next sections show how to use these service and function areas and access different SAP objects from within your custom SharePoint applications using the ERPConnect Services. The runtime can be used in applications within the SharePoint context like Web Parts or Application Pages.

    In order to do so, you need to reference the assembly ERPConnectServices.Server.Common.dll in the project. Before you can access data from the SAP system you must create an instance of the ERPConnectServiceClient class. This is the gateway to all SAP objects and the generic API of the runtime overall. In the SharePoint context there are two options to create a client object instance:

    // Option #1
    ERPConnectServiceClient client = new ERPConnectServiceClient();
    
    // Option #2
    ERPConnectServiceApplicationProxy proxy = SPServiceContext.Current.GetDefaultProxy(
       typeof(ERPConnectServiceApplicationProxy)) as ERPConnectServiceApplicationProxy;
    ERPConnectServiceClient client = proxy.GetClient();

    For more details on using ECS in Silverlight or desktop applications see the specific sections below.

    Querying Tables

    Querying and retrieving table data is a common task for developers. The runtime allows retrieving data directly from SAP tables. The ERPConnectServiceClient class provides a method called ExecuteTableQuery with two overrides which query SAP tables in a simple way.

    The method also supports a way to pass miscellaneous parameters like row count and skip, custom function, where clause definition and a returning field list. These parameters can be defined by using the ExecuteTableQuerySettings class instance.

    DataTable dt = client.ExecuteTableQuery("T001");
    
    …
        
    ExecuteTableQuerySettings settings = new ExecuteTableQuerySettings {
      RowCount = 100,
      WhereClause = "ORT01 = 'Paris' AND LAND1 = 'FR'",
      Fields = new ERPCollection<string> { "BUKRS", "BUTXT", "ORT01", "LAND1" }
    };
    
    DataTable dt = client.ExecuteTableQuery("T001", settings);
    
    …
    
    // Sample 2
    DataTable dt = client.ExecuteTableQuery("MAKT",
                new ExecuteTableQuerySettings {
                    RowCount = 10,
                    WhereClause = "MATNR = '60-100C'",
                    OrderClause = "SPRAS DESC"
                });

    The first query reads all records from the SAP table T001 where the field ORT01 equals Paris and LAND1 equals FR (France). The query returns the top 100 records and the result set contains only the fields BUKRS, BUTXT, ORT01 and LAND1.

    The second query returns the top ten records of the SAP table MAKT, where the field MATNR equals the material number 60-100C. The result set is ordered by the field SPRAS.

    Executing Function Modules

    In addition to querying SAP tables, the runtime API can execute SAP function modules (BAPIs). Function modules must be marked as remote-enabled (RFC) within SAP.

    The ERPConnectServiceClient class provides a method called CreateFunction to create a structure of metadata for the function module. The method returns an instance of the data structure ERPFunction. This object instance contains all parameter types (import, export, changing and tables) that can be used with function modules.

    In the sample below we call the function SD_RFC_CUSTOMER_GET and pass a name pattern (T*) for the export parameter NAME1. Then we call the Execute method on the ERPFunction instance. Once the method has been executed the data structure is updated. The function returns all matching customers in the table CUSTOMER_T.

    ERPFunction function = client.CreateFunction("SD_RFC_CUSTOMER_GET");
    function.Exports["NAME1"].ParamValue = "T*";
    function.Execute();
    
    foreach(ERPStructure row in function.Tables["CUSTOMER_T"])
      Console.WriteLine(row["NAME1"] + ", " + row["ORT01"]);

    The following code shows an additional sample. Before we can execute this function module we need to define a table with HR data as an input parameter.

    The parameters you need, and what values the function module returns, depend on the implementation of the function module.

    ERPFunction function = client.CreateFunction("BAPI_CATIMESHEETMGR_INSERT");
    function.Exports["PROFILE"].ParamValue = "TEST";
    function.Exports["TESTRUN"].ParamValue = "X";
    
    ERPTable records = function.Tables["CATSRECORDS_IN"];
    ERPStructure r1 = records.AddRow();
    r1["EMPLOYEENUMBER"] = "100096";
    r1["WORKDATE"] = "20110704";
    r1["ABS_ATT_TYPE"] = "0001";
    r1["CATSHOURS"] = (decimal)8.0;
    r1["UNIT"] = "H";
    
    function.Execute();
    
    ERPTable ret = function.Tables["RETURN"]; 
    
    foreach(var i in ret)
      Console.WriteLine("{0} - {1}", i["TYPE"], i["MESSAGE"]);

    Executing XtractQL Query Statements

    The ECS runtime is offering a SAP query language called XtractQL. The XtractQL query language, also known as XQL, consists of ABAP and SQL syntax elements. XtractQL allows querying SAP tables, BW-Cubes, SAP Queries and executing function modules.

    It’s possible to return metadata for the objects, and MDX statements can also be executed with XQL. All XQL queries return a data table object as the result set. In case of the execution of function modules the caller must define the returning table (see sample below – INTO @RETVAL). XQL is very useful in situations where you need to handle dynamic statements. The following examples show some typical XQL statements.

    SELECT TOP 5 * FROM T001W WHERE FABKL = 'US'

    This query selects the top 5 records of the SAP table T001W where the field FABKL equals the value US.

    SELECT * FROM MARA WITH-OPTIONS(CUSTOMFUNCTIONNAME = 'Z_XTRACT_IS_TABLE')

    This query selects all records of the SAP table MARA and executes the read through the custom function module Z_XTRACT_IS_TABLE (defined via the WITH-OPTIONS clause).

    SELECT MAKTX AS [ShortDesc], MANDT, SPRAS AS Language FROM MAKT

    This query selects all records of the SAP table MAKT. The result set will contain three fields named ShortDesc, MANDT and Language.

     

    EXECUTE FUNCTION 'SD_RFC_CUSTOMER_GET'
       EXPORTS KUNNR='0000003340'
       TABLES CUSTOMER_T INTO @RETVAL;

    This query executes the SAP function module SD_RFC_CUSTOMER_GET and returns as result the table CUSTOMER_T (defined as @RETVAL).

    DESCRIBE FUNCTION 'SD_RFC_CUSTOMER_GET' GET EXPORTS

    This query returns metadata about the export parameters of the function module SD_RFC_CUSTOMER_GET.

    SELECT TOP 30 LIPS-LFIMG, LIPS-MATNR, TEXT_LIKP_KUNNR AS CustomerID
       FROM QUERY 'S|ZTHEO02|ZLIKP'
       WHERE SP$00002 BT '0080011000'AND '0080011999'

    This statement executes the SAP Query “S|ZTHEO02|ZLIKP” (name includes the workspace, user group and the query name). As you can see XtractQL extends the SQL syntax with ABAP or SAP specific syntax elements. This way you can define fields using the LIPS-MATNR format and SAP-like where clauses like “SP$00002 BT ‘0080011000’AND ‘0080011999’”.

    ERPConnect Services provides a little helper tool, the XtractQL Explorer (see screenshot below), to learn more about the query language and to test XQL queries. You can use this tool independently of SharePoint, but you need access to a SAP system.

    To find out more about all XtractQL language syntax see the product manual.

    Silverlight And Desktop Applications

    So far all samples are using the assembly ERPConnectServices.Server.Common.dll as project reference and all code snippets shown run within the SharePoint context, e.g. Web Part.

    ERPConnect Services also provides client libraries for Silverlight and desktop applications:

    ERPConnectServices.Client.dll for Desktop applications
    ERPConnectServices.Client.Silverlight.dll for Silverlight applications

    You need to add the reference depending on which kind of project you are implementing.

    In Silverlight the implementation and design pattern are a little bit more complicated, since all web services are called asynchronously. It’s also not possible to use the DataTable class; it’s just not implemented for Silverlight.

    The runtime provides a similar class called ERPDataTable, which is used in these cases by the API. The ERPConnectServiceClient class for Silverlight provides the method ExecuteTableQueryAsync and an event called ExecuteTableQueryCompleted as the callback delegate.

    public event EventHandler<ExecuteTableQueryCompletedEventArgs> ExecuteTableQueryCompleted;
    
    public void ExecuteTableQueryAsync(string tableName)
    public void ExecuteTableQueryAsync(string tableName, ExecuteTableQuerySettings settings)

    The following code sample shows a simple query of the SAP table T001 within a Silverlight client.

    First of all, an instance of the ERPConnectServiceClient is created using the URI of ERPConnectService.svc, then a delegate is defined to handle the completion callback. Next, the query is executed with a RowCount setting to limit the number of records in the result set.

    Once the result is returned the data set will be attached to a DataGrid control (see screenshot below) within the callback method.

     

    void OnGetTableDataButtonClick(object sender, RoutedEventArgs e)
    {
      ERPConnectServiceClient client = new ERPConnectServiceClient(
        new Uri("http://<SERVERNAME>/_vti_bin/ERPConnectService.svc"));
    
      client.ExecuteTableQueryCompleted += OnExecuteTableQueryCompleted;
      client.ExecuteTableQueryAsync("T001", 
        new ExecuteTableQuerySettings { RowCount = 150 });
    }
    
    void OnExecuteTableQueryCompleted(object sender, ExecuteTableQueryCompletedEventArgs e)
    {
      if(e.Error != null)
        MessageBox.Show(e.Error.Message);
      else
      {
        e.Table.View.GroupDescriptions.Add(new PropertyGroupDescription("ORT01"));
        TableGrid.ItemsSource = e.Table.View;
      }
    }

    The screenshot below shows the XAML of the Silverlight page:

    The final result can be seen below:

    ECS Designer

    ERPConnect Services includes a Visual Studio 2010 plugin, the ECS Designer, that allows developers to visually design SAP interfaces. It works similarly to the LINQ to SAP Designer I have written about a while ago; see the article at CodeProject: LINQ to SAP.

    The ECS Designer is not automatically installed when you install the product. You need to run its installation program manually. The setup adds a new project item type to Visual Studio 2010 with the file extension .ecs and links it to the designer. The needed references are added automatically after adding an ECS project item.

    The designer generates source code to integrate with the ERPConnect Services runtime after the project item is saved. The generated context class contains methods and sub-classes that represent the defined SAP objects (see screenshots below).

     

    Before you access the SAP system for the first time you will be asked to enter the connection data. You may also load the connection data from the SharePoint system. The designer GUI is shown in the screenshots below:

     

    The screenshot above for instance shows the tables dialog. After clicking the Add (+) button in the main designer screen and searching a SAP table in the search dialog, the designer opens the tables dialog.

     

    In this dialog you can change the name of the generated class, the class modifier and all needed properties (fields) the final class should contain.

     

    To preview your selection press the Preview button. The next screenshot shows the automatically generated classes in the file named EC1.Designer.cs:

     

    Using the generated code is simple. The project type we are using for this sample is a standard console application, therefore the designer is referencing the ERPConnectServices.Client.dll for desktop applications.

    Since we are not within the SharePoint context, we have to define the URI of the SharePoint system by passing this value into the constructor of the ERPConnectServicesContext class.

    The designer has generated a class MAKT and an access property MAKTList on the context class for the table MAKT. The type of this property, MAKTList, is ERPTableQuery<MAKT>, which is a LINQ queryable data type.

     

    This means you can use LINQ statements to define the underlying query. Internally, the ERPTableQuery<T> type will translate your LINQ query into a call to ExecuteTableQuery. A short example of such a query follows.
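
    This is only a minimal sketch under assumptions: the context class name, its URI constructor and the MAKTList property come from the text above, while the generated property names (MATNR, MAKTX, SPRAS) and the exact query shape are illustrative only.

    // Minimal sketch (assumed property names on the generated MAKT class): query the
    // MAKT material short texts through the generated LINQ-enabled context.
    using System;
    using System.Linq;
    
    class Program
    {
        static void Main()
        {
            // The URI of the SharePoint system hosting the ERPConnect Service Application.
            var context = new ERPConnectServicesContext("http://<SERVERNAME>");
    
            var shortTexts =
                from m in context.MAKTList
                where m.SPRAS == "E" && m.MATNR == "60-100C"
                select m;
    
            // The LINQ query is translated into an ExecuteTableQuery call when enumerated.
            foreach (var m in shortTexts)
                Console.WriteLine("{0}: {1}", m.MATNR, m.MAKTX);
        }
    }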

     

    That’s it!

     

    Advanced Techniques

    There are situations when you have to use the exact same SAP connection while calling a series of function modules in order to receive the correct result. Let’s take the following code:

    ERPConnectServiceClient client = new ERPConnectServiceClient();
    
    using(client.BeginConnectionScope())
    {
      ERPFunction f = client.CreateFunction("BAPI_GOODSMVT_CREATE");
    
      ERPStructure s = f.Exports["GOODSMVT_HEADER"].ToStructure();
      s["PSTNG_DATE"] = "20110609"; // Posting Date in the Document
      s["PR_UNAME"] = "BAEURLE";    // UserName
      s["HEADER_TXT"] = "XXX";      // HeaderText
      s["DOC_DATE"] = "20110609";   // Document Date in Document
    
      f.Exports["GOODSMVT_CODE"].ToStructure()["GM_CODE"] = "01";
    
      ERPStructure r = f.Tables["GOODSMVT_ITEM"].AddRow();
      r["PLANT"] = "1000";          // Plant
      r["PO_NUMBER"] = "4500017210"; // Purchase Order Number
      r["PO_ITEM"] = "010";      // Item Number of Purchasing Document 
      r["ENTRY_QNT"] = 1;          // Quantity in Unit of Entry
      r["MOVE_TYPE"] = "101";        // Movement Type
      r["MVT_IND"] = "B";            // Movement Indicator
      r["STGE_LOC"] = "0001";        // Storage Location
    
      f.Execute();
    
      string matDocument = f.Imports["MATERIALDOCUMENT"].ParamValue as string;
      string matDocumentYear = f.Imports["MATDOCUMENTYEAR"].ParamValue as string;
    
      ERPTable ret = f.Tables["RETURN"]; //.ToADOTable();
    
      foreach(var i in ret)
        Console.WriteLine("{0} - {1}", i["TYPE"], i["MESSAGE"]);
    
      ERPFunction fCommit = client.CreateFunction("BAPI_TRANSACTION_COMMIT");
      fCommit.Exports["WAIT"].ParamValue = "X";
      fCommit.Execute();
    }

    In this sample we create a goods receipt for a goods movement with BAPI_GOODSMVT_CREATE. The final call to BAPI_TRANSACTION_COMMIT will only work, if the system under the hood is using the same connection object.

     

    The runtime does not provide direct access to the underlying SAP connection, but the library offers a mechanism called connection scoping. You may create a new connection scope with the client library and tell ECS to use the same SAP connection until you close the connection scope. Within the connection scope every library call will use the same SAP connection.

    In order to create a new connection scope you need to call the BeginConnectionScope method of the class ERPConnectServiceClient.

    The method returns an IDisposable object, which can be used in conjunction with the using statement of C# to end the connection scope.

    Alternatively, you may call the EndConnectionScope method. It’s also possible to use function modules with nested structures as parameters.

    This is a special construct in SAP. The goods receipt sample above uses a nested structure for the export parameter GOODSMVT_CODE. For more detailed information about nested structures and tables see the product documentation.

    Power BI connectivity to SAP BusinessObjects BI

    Microsoft and SAP are jointly delivering business intelligence (BI) interoperability in Microsoft Excel, Microsoft Power BI for Office 365, and SAP BusinessObjects BI.

     

    Microsoft Power Query for Excel seamlessly connects to SAP BusinessObjects BI Universes enabling users to access and analyze data across the enterprise and share their data and insights through Power BI.

    This connectivity drives a single version of truth, instant productivity, and optimized business performance for your organization.


    Download Microsoft Power Query Preview for Excel

    Preview contains SAP BusinessObjects BI Universe connectivity.

    Details

    Microsoft Power Query Preview for Excel, providing SAP BusinessObjects BI Universe connectivity, is an add-in that provides a seamless experience for data discovery, data transformation and enrichment for Information Workers, BI professionals and other Excel users. This preview provides an early look into the upcoming SAP BusinessObjects BI Universe connectivity feature. As with most previews, this feature may appear differently in the final product.

    System Requirements

    Supported operating systems


    • Windows Vista (requires .NET 3.5SP1)
    • Windows Server 2008 (requires .NET 3.5 SP1)
    • Windows 7
    • Windows 8
    • Windows 8.1

    The following Office versions are supported:

    • Microsoft Office 2010 Professional Plus with Software Assurance
    • Microsoft Office 2013 Professional Plus, Office 365 ProPlus or Excel 2013 Standalone

    Microsoft Power Query Preview for Excel requires Internet Explorer 9 or greater.

    Microsoft Power Query Preview for Excel is available for 32-bit (x86) and 64-bit (x64) platforms; your selection must match the architecture of the installed version of Office.

    Installation Instructions

    Download the version of the Power Query add-in that matches the architecture (x86 or x64) of your Office installation. Run the MSI installer and follow the setup steps.

    Access and analyze your trusted enterprise data

    Learn how Microsoft and SAP deliver a combination of trusted enterprise data and familiar market leading tools.

    Single version of truth

    Deliver the latest, accurate and trusted data from across the enterprise, such as from SAP applications, directly into the hands of users in Microsoft Excel. They no longer need to constantly copy and paste or import data using a manual process leading to inaccuracy. Users can instead focus on leveraging their knowledge to analyze data from within and outside your organization. They can get answers and uncover new insights to better deal with the challenges facing your organization, eliminating costly decisions based on inaccurate data.

    Instant productivity

    Users can continue to work in their familiar Microsoft Excel environment with access to business friendly terms from SAP BusinessObjects BI Universes at their fingertips, allowing for deeper analysis on their own. Using familiar tools enables them to easily integrate data and insights into existing workflows without the need to learn new complex tools and skills. Any uncovered data and insights can be kept up to date with no hassle refreshing from on-premises and the cloud, increasing productivity.

    Optimized business performance

    Leveraging existing investments from both companies together enables your organization to unlock insights faster and react accordingly. Your organization can identify patterns, cost drivers, and opportunities for savings in an agile, accurate, and visual manner. Specific trends and goals can be measured while having visually attractive and up to date dashboards. Relying on trusted enterprise data reduces costs and increases profitability by allowing faster, better, and timelier decisions. All of this drives broader BI adoption to create an information driven culture across your organization.

    Unlocked data and insights

    Drive trusted enterprise data and insights from an SAP BusinessObjects BI Universe throughout your organization by sharing and collaborating with Power BI from anywhere. Anyone can create a collaborative BI site to share data and insights relying on the latest data from either on-premises or the cloud using scheduled refreshing. Users no longer have to struggle to think up and answer every single question in advance, instead interactively investigating data when they need it through natural language Q&A. Better yet, they can stay connected with mobile access to data and insights generating a deeper understanding of the business and communicating more effectively from anywhere.

     

     

    SAP NetWeaver and Hyper-Threading on Windows Servers : To be or Not to be

    Here are the key messages related to SAP NetWeaver and Hyper-Threading ( HT ), now called
    Simultaneous MultiThreading ( SMT ), derived from different sources as well as from internal lab tests
    done for the WS2012 SAP First Customer Shipment program. In addition I incorporated very
    valuable input from Juergen Thomas, who has published many blogs and papers about SAP on the
    Microsoft platform :

    1. Always keep in mind : a CPU thread is NOT equal to a core !  ( also see walk-through section at the end )
    2. Sizing based on SAPS, which is done by hardware vendors for certain server models, usually
      includes SMT. Looking at the latest published SAP SD benchmarks one will realize immediately
      that SMT was turned on according to the CPU information ( # processors / # cores / # threads ).
      The goal is to achieve the maximum amount of SD workload. Including SMT in the sizing implicitly
      means that customers will turn it on

    3. Using Hyper-V on a server with more than 64 logical processors ( e.g. 40 cores and SMT turned on )
      requires Windows Server 2012. Hyper-V in Windows Server 2008 ( R2 ) has a limit of 64 logical CPUs,
      while Windows Server 2008 R2 itself can address 256 CPU threads in a bare-metal deployment

    4. When using the latest OS and application releases the general suggestion regarding SMT is to
      always turn it on. It either helps or won’t hurt. Turning SMT on or off requires a reboot as it’s a
      BIOS setting on the physical host

    5. How much SMT will help depends on the application workload. While there is a proven benefit in
      SAP SD Benchmarks as well as in many other benchmarks, we know customer tests where there
      was no difference between running a virtualized SAP application Server on WS2012 Hyper-V with
      or without SMT. Conclusion is that the effect/impact of SMT is pretty much dependent on the
      individual customer scenario
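
    A quick way to verify on a given host how threads relate to cores ( item 1 above ) and whether SMT is
    currently enabled is to compare physical cores with logical processors. A minimal PowerShell sketch,
    assuming a recent Windows Server with the CIM cmdlets available :

      # If NumberOfLogicalProcessors is double NumberOfCores, SMT / Hyper-Threading is on.
      Get-CimInstance Win32_Processor |
          Select-Object DeviceID, NumberOfCores, NumberOfLogicalProcessors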
       

    General Information about Hyper-Threading

    Being around for many years Hyper-Threading ( HT ) now called Simultaneous MultiThreading ( SMT )
    is a well-known feature of Intel processors. Looking on the Internet one can easily find a lot of information
    and technical descriptions about it like this one :

    http://software.intel.com/en-us/articles/performance-insights-to-intel-hyper-threading-technology/
    While there have been some issues in the early days, the recommendation for more recent OS or
    application releases is usually to have SMT turned on, as it does increase the throughput achievable
    with an application on a single server. Nevertheless the question about the impact of SMT on SAP NetWeaver
    shows up again when looking at the execution speed of a single request handled by a single CPU thread.
    Especially as we are pushing WS2012 Hyper-V, customers wonder what the performance characteristics
    are with a combination of SMT and virtualization.

    In this blog I try to summarize the status quo and give some guidance based on all the statements and
    test results and experiences which are around.

    Single-Thread Performance

    I personally would like to separate the SMT discussion from the single-thread performance discussion.
    When people talk about the latter, it is usually about the trend in processor technology to increase
    the number of cores instead of increasing the clock rate. In these discussions the often unspoken
    assumption is one thread per core. Sure, there are very good reasons for it. But as one can read under
    the following links, only applications which are able to use parallelism will fully benefit from the multi-core
    design. As a consequence, certain SAP batch jobs which depend on high single-thread performance
    might not improve a lot, or at all, when the underlying hardware gets upgraded to the next processor
    version with more cores per CPU. A customer example of this effect can be found here :
    http://blogs.msdn.com/b/saponsqlserver/archive/2010/01/24/performance-what-do-we-mean-in-regards-to-sap-workload.aspx

    SAP NetWeaver is not a multi-threaded application. In SAP it is of course possible to try to
    parallelize processing on a business process level; just think about payroll parallelism in SAP.

    Some basic articles around single-threaded CPU performance and multi-core processing
    can be found here :

    http://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance

    http://en.wikipedia.org/wiki/Multi-core_processor

    http://iet-journals.org/archive/2012/may_vol_2_no_5/846361133715321.pdf

    Could SMT hurt in some cases ?

    This is in principle the wrong question. The correct question should be : What is the effect and impact
    of SMT with different applications and different configurations or scenarios ?
    Understanding these effects and impacts will make it possible to adapt and get the maximum out
    of an investment in a specific hardware model.
    There might be some outdated messages or opinions around, based on experiences from the early
    days, which are no longer valid. In other cases I personally wouldn't call it an issue of SMT but
    rather one of wrong sizing or of overlooking some documented restrictions. First let's look again at a general statement :

    http://software.intel.com/en-us/articles/performance-insights-to-intel-hyper-threading-technology/

    “Ideal scheduling would be to place active threads on cores before scheduling on threads on
    the same core when maximum performance is the goal. This is best left to the operating system.
    All multi-threaded operating systems support Intel HT Technology, while later versions have more
    support for scheduling threads in the most ideal manner to maximize performance gains”

     

    Here are some more details :

    1. single-thread / single-core performance

    In SAP note 1612283 section “1.1 Clock Speed” you will find the following statement :
    “If you need to speed up a single transaction or report you might try to switch off
    Hyperthreading”

    Based on some testing I would like to differentiate this a little further. As long as the
    number of running processes / threads is <= the number of cores, the OS / hypervisor should
    be smart enough to distribute the workload over all the cores. In this case there shouldn't be
    any effect/impact from SMT being on or off. Based on the basics of SMT as described
    in the Intel article named above, the expectation is that once the number of running processes
    / threads exceeds the # of cores, the performance/throughput of a single CPU thread,
    dependent on the load, decreases.

    2. parallelism
      From an OS perspective one shouldn't see any major issues with SMT anymore. It looks different
      though when it comes to the application. One potential issue could arise if an application doesn't realize
      that the available logical CPUs are mapped to SMT threads and not cores. This could lead to wrong
      assumptions. Here is an example from SQL Server :

    http://support.microsoft.com/kb/2023536

    “For servers that have hyper-threading enabled, the max degree of parallelism value should not
    exceed the number of physical processors”
    ( the term “processors” is used in this article to refer to physical cores )

    This is related to the two items above. Parallelism on a SQL statement level means that the SQL Server
    optimizer expects all logical CPUs to be of the same type. The important question is whether this is merely
    not as fast as having as many cores as logical CPUs, or whether it becomes in fact slower than without
    SMT.
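
    To translate the KB guidance above into a setting, the sketch below caps 'max degree of parallelism'
    at the number of physical cores. This is a minimal example, assuming the SqlServer / SQLPS PowerShell
    module is loaded and sysadmin rights are available; the instance name is a placeholder.

      # Sum the physical cores of the host (not the logical processors).
      $cores = (Get-CimInstance Win32_Processor | Measure-Object NumberOfCores -Sum).Sum
      $sqlInstance = "SAPDBSERVER"   # placeholder SQL Server instance name
      $query = "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; " +
               "EXEC sp_configure 'max degree of parallelism', $cores; RECONFIGURE;"
      Invoke-Sqlcmd -ServerInstance $sqlInstance -Query $query
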
    3. virtualization

    Another topic is running VMs on Hyper-V with SMT turned on on the underlying physical host. It’s
    again not different from the items mentioned before. Inside a VM an application might not be aware
    of the nature of a virtual CPU. It’s not just SMT. Depending on the configuration ( e.g. over-
    commitment ) and the capabilities of an OS/hypervisor a Virtual Processor will correspond only to a
    “fraction” of a real Physical CPU.

    Internal lab tests on WS2012 Hyper-V have proven the statement I quoted at the beginning of this
    section. As long as there are enough cores available the workload will be optimally distributed.
    A perfect way to show this is to increase the number of virtual CPUs inside a VM step by step
    while monitoring the CPU workload on the host.

    The CPU load screenshots further down were taken from perfmon on a WS2012 host where SMT
    was turned on. The server had 8 cores and due to SMT 16 logical processors. The server also had
    two NUMA nodes  ->  4 cores / 8 threads each. The SAP test running inside a VM ( guest OS was
    Windows 2008 R2 ) was absolutely CPU-bound. The scenario looked like this :

    a, the test started with two Virtual processors ( VP ) and the workload was increased until both VPs
    were 100% busy

    b, then the number of VPs was increased to four to see if it was possible to double the
    workload. Scalability was very good in this case because the workload could still be
    distributed over all four cores in one single NUMA node of the host server

    c, but going from 4 VPs to 6 VPs changed the picture. Hyper-V has improved NUMA
    support and by default sets the max VPs per NUMA node to the # of Logical Processors
    ( LP ) of the NUMA node ( 8 on the test hardware ).
    As 6 VPs is still < 8, the whole workload of the VM still ended up on one single NUMA
    node. On the other hand, due to SMT it was no longer possible to achieve an almost 1:1
    VP-to-physical-core mapping. This is a situation where you will still see an improved
    throughput compared to four VPs, but it is far from the linear scalability we achieved when
    going from 2 VPs to 4 VPs.

    Keep in mind that it’s NOT possible to configure processor affinity on Hyper-V to achieve
    a fixed VP-physical-core mapping. But setting the “reserve” value in WS2012 Hyper-V
    Manager to 100 has basically the same effect. See also the blog from Ben Armstrong :

    http://blogs.msdn.com/b/virtual_pc_guy/archive/2009/09/21/processor-affinity-and-why-you-don-t-need-it-on-hyper-v.aspx

    d, the next step was to adapt the setting for the VM in Hyper-V Manager, which allows you to define the
    max VPs per NUMA node. Setting this value to four forced Hyper-V to distribute the workload
    over two NUMA nodes when using 6 VPs in the VM. Now it was again possible to achieve
    basically a 1:1 VP-to-physical-core mapping. Scalability looked fine and, because the test was
    totally CPU-bound, the disadvantage of potentially slower memory access didn't matter.
    The advantage of getting more CPU power outweighed the memory access penalty by far.
    ( both settings can also be scripted, see the PowerShell sketch right after this list )
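
    Both of the knobs mentioned above ( the processor "reserve" and the max VPs per NUMA node ) can
    also be set from PowerShell. A minimal sketch, assuming a WS2012 Hyper-V host and a VM named
    SAPAPP1 ( placeholder name ) :

      # Reserve 100% of each virtual processor to approximate a fixed VP-to-core mapping.
      Set-VMProcessor -VMName "SAPAPP1" -Reserve 100

      # Allow at most 4 virtual processors per NUMA node so a 6-VP VM spans two nodes.
      Set-VMProcessor -VMName "SAPAPP1" -MaximumCountPerNumaNode 4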

    The following pictures and screenshots will visualize the four items above :

     

    Figure 1 : as long as SMT is turned off and a VM won’t span multiple numa nodes the virtual processors
    will be mapped to the cores of one single numa node on the physical host

     

    Figure 2 : once SMT is turned on the Virtual Processors of a VM will be mapped to Logical Processors
    on the physical host. By Default WS2012 Hyper-V Manager will set the maximum # of Virtual
    Processors per numa node according to the hardware layout. In this example it means a max
    of 8 Virtual Processors per numa node

    Figure 3 : configuring 4 Virtual Processors in a VM while SMT is turned on means that these 4 VPs
    will be mapped to 8 Logical Processors on the physical host. This allows the OS/Hypervisor
    to make sure that the workload will be distributed over all 4 physical cores in an optimal way
    similar to having SMT turned off


    Figure 4 : perfmon showed that the workload which kept four virtual processors busy inside a VM was
    distributed over eight Logical processors on four cores within one NUMA node on the
    physical host

     

    Figure 5 : what happens when adding two additional Virtual Processors inside the
    single VM ? Because 6 VPs is still less than the default setting of a max
    of 8 VPs per single numa node the whole workload will be still mapped
    to 8 Logical Processors which correspond to 4 cores within one numa
    node


    Figure 6 : increasing the workload the same way as before ( as when going from two virtual processors
    to four VPs ) by configuring six VPs inside the VM caused a super-busy single NUMA node, with
    almost all of its threads close to 100%. This means each physical core of this NUMA node had
    to engage the two SMT threads it represents to the OS/hypervisor much more heavily. Therefore
    the scalability of going from four VPs to six VPs did not look great compared to going from two
    VPs to four VPs.

    Figure 7 : the default setting regarding # of virtual processors per numa node in WS2012
    Hyper-V Manager can be changed. Setting the number low enough will force
    the Hypervisor to use more than one numa node

     

     

     


    Figure 8 : changing the “max VPs per NUMA node” setting in Hyper-V Manager ( 2012 ) to four
    forced Hyper-V to use the second NUMA node. This allowed again basically a 1:1
    VP-to-physical-core mapping and scalability looked fine again

    Conclusion :

    The references as well as the experiences shown above make it obvious that the effects of SMT have
    to be considered in the sizing and configuration of SAP deployments, and that expectations and SLAs
    should be set accordingly.

    It is proven that with SMT configured on a Hyper-V host or on an SAP bare-metal deployment,
    the overall throughput of a specific server increases, as does the power/throughput ratio. Both
    are usually the goals we follow when specifying hardware configurations for SAP deployments.

    In terms of using SMT for Hyper-V hosts, one clearly needs to define the goals of deploying SAP
    components in VMs. If the goal is again to maximize the available capacity of the servers, then having
    SMT enabled is the way to go. This means one would deploy as many VPs as there are Logical Processors
    on the host server and accept that there might be performance variations dependent on the load across
    all VMs, or on the fact that a VM has more VPs than the # of physical cores in one NUMA node of the host
    server. SLAs towards the business units would then take such variations into account.

    Walk-through TTHS – Tray Table Hyper-Seating

     

    One thing which is repeated again and again in all the articles about SMT is the fact that a
    CPU thread is NOT equal to a core. To visualize this specific point and to make it easy to remember,
    I would like to compare SMT with TTHS – Tray Table Hyper-Seating – as shown in the following
    six pictures :

     

    Figure 1 :  you have four comfortable seats and four passengers. Everyone is happy.

     

    Figure 2 :  now you want to get more than four passengers into the car, and the idea is to introduce
    TTHS – Tray Table Hyper Seating. This allows two passengers to share one seat. But
    it's pretty obvious that it's not as comfortable anymore; in particular, one of the two
    passengers cannot enjoy the cozy seat surface.
     

    Figure 3 :  therefore the driver should be smart enough to let passengers enjoy the cozy seat surface
    as long as seats are available despite the fact that TTHS is turned on

    Figure 4 :   at some point though, when you want to put six passengers into the 4-seat car, two of them
    have to get on the TTHS spots. This is when issues might arise

     

    Figure 5 :  of course one could turn TTHS off again and share two seats the traditional way. While this
    might work too it’s very obvious that it’s not perfect

     

    Figure 6 :  conclusion :  if it's a hard requirement that every passenger has his own seat to fully enjoy
    the cozy seat surface, then there is no other way than to take a different car with an
    appropriate number of seats

    How authentication works in Duet Enterprise 2.0

    Duet Enterprise 2.0 stores all business processes and data in the SAP system while letting SharePoint users access the processes and data from SharePoint websites and Outlook 2013. Because SAP and SharePoint authenticate users differently, Duet provides a single sign-on authentication model that authenticates each user individually.

    It’s helpful to understand the following things before you look at the overall authentication process.

    • A user logs on to SharePoint by using their SharePoint user identity. This can be either forms-based authentication or credentials stored in Active Directory Domain Services (AD DS), but is typically associated with a user account stored in AD DS.
    • The SAP environment can’t authenticate a user’s SharePoint identity. Instead, a Duet Enterprise component installed on the SharePoint Server 2013 farm swaps the user’s Windows credentials for a user certificate that SAP NetWeaver uses to authenticate the user. When Duet Enterprise 2.0 is installed, the SAP administrator creates a trust relationship with the DuetRoot Certificate (an X.509 Root Authority certificate), which is stored in the SharePoint Secure Store Service. This certificate is used to create a certificate for each individual user on the fly.
    • Information in an SAP environment can’t be secured with Windows credentials or SharePoint credentials (which in this case would be the user’s SharePoint identity). Instead, information is secured in SAP using SAP user accounts. When deploying Duet Enterprise 2.0, an SAP administrator maps each SharePoint user account to a unique SAP user. This way, a user who logs into a SharePoint website can access data that’s stored in SAP without getting an extra login prompt.

    The following picture shows a high-level view of authentication flow in a Duet Enterprise 2.0 environment. It shows the steps that occur when a SharePoint user accesses SAP information from a SharePoint site.

    Tip:
    To see this picture and the following list that describes the process without having to scroll, download the Authentication flow in Duet Enterprise 2.0 poster.

     

    Figure: Duet Enterprise 2.0 authentication

    The following list describes the steps shown in the preceding picture. The picture assumes that a SharePoint user has requested data that's stored in the SAP environment.

    A.   A user logs on to a Duet Enterprise 2.0-enabled SharePoint website using his SharePoint user identity. Because the website contains an external list or Web Part that surfaces SAP data, the request is sent to the Business Connectivity Services runtime in the SharePoint farm.

    B.   The Business Connectivity Services runtime invokes the Duet Enterprise 2.0 OData Extension Provider.

    C.   The Duet Enterprise 2.0 OData Extension Provider gets the DuetRoot Certificate from the Secure Store.

    D.   The Duet Enterprise 2.0 OData Extension Provider uses the DuetRoot Certificate to create an X.509 user certificate and sends the certificate to the Business Connectivity Services runtime.

    E.   The Business Connectivity Services runtime sends the request with the user certificate to the SAP NetWeaver Gateway component of SAP NetWeaver in a request packet.

    Tip:
    SAP NetWeaver with the SAP NetWeaver Gateway component installed is also known as SAP NetWeaver Gateway.

     

    F.   Because SAP NetWeaver trusts the DuetRoot Certificate that was used to create the user certificate, SAP NetWeaver can authenticate the user and look up the SAP user who is mapped to the SharePoint user who is identified by the certificate.

    G.   The SAP user account that’s mapped to the SharePoint user is returned to SAP NetWeaver.

    H.   SAP NetWeaver uses the SAP user account to request access to the requested information in the SAP system and, if the user is authorized to access the information, the requested information is sent to SAP NetWeaver Gateway.

    I.   SAP NetWeaver Gateway sends the reply as a response packet to the Business Connectivity Services runtime on the on-premises SharePoint farm.

    J.   The Business Connectivity Services runtime passes the information to the SharePoint user. In this case, to the website from which the user has requested the information.

    Note:
    The two-way connection between the SharePoint Server farm and SAP NetWeaver is secured by using two Secure Sockets Layer (SSL) certificates. One certificate is bound to a SharePoint web application and trusted by the SAP administrator. The other certificate is bound to SAP NetWeaver and trusted by the SharePoint administrator.

     

    Using SAP roles to access SharePoint objects

    In the enterprise, the tasks that a user does are usually related to that user’s role. Because of this, it’s handy to grant permissions to resources, such as list items, websites, and documents, based on SAP roles. Conceptually, SAP roles are like SharePoint groups except that they’re created and managed in SAP.

    In SAP NetWeaver, users are assigned one or more roles, such as Sales Representative, Project Manager, Executive, and Human Resources Specialist. SAP roles can be broad, such as All Sales Managers, or narrow, such as Sales Managers Eastern Region.

    In Duet Enterprise 2.0, these SAP roles can be used to grant permissions in SharePoint Server. Anything you can set permissions on in SharePoint Server can be assigned permissions using SAP roles.

    This includes objects directly related to Duet Enterprise 2.0, such as SAP reports, external lists, actions on external content types, and any general and securable SharePoint Server objects, such as websites or document libraries.

    After a role is granted permissions to an object, any user who is assigned that role will then have permissions to use that object.

    If you remember nothing else about RoleSync, remember that SAP NetWeaver admins assign SAP users to roles and they also assign SharePoint users to SAP users. This effectively assigns one or more SAP roles to SharePoint users.

    Duet Enterprise 2.0 uses the Duet Enterprise Profile Synchronization Timer Job feature to bring the user role assignments from the SAP system into the SharePoint user profile store. Duet Enterprise 2.0 also uses the Duet Enterprise Claims Provider to help manage the role-based permissions to securable objects in SharePoint Server.

    Note:
    Think of role synchronization as a one-way street. Users’ roles that are defined in the SAP system are brought into the SharePoint user profile store. No properties in the SharePoint user profiles are sent from SharePoint back to SAP.

     

    During role synchronization, the set of SAP users is imported into the SharePoint user profile store by using Business Connectivity Services. For each SAP user who has a related user profile in SharePoint, all of the SAP roles assigned to that user are listed in the user profile store.

    Role synchronization connects from SharePoint Server to an external system on the SAP side named “SAPUsersService.” This external system sends the user-to-roles mappings to the SharePoint user profile store.

    After role synchronization is completed, you'll see a new field, called SAP Roles, at the bottom of the User Profile page. The SAP roles assigned to the user are separated by semicolons.

    SAP Roles as seen on a User Profile page in SharePoint.

    Role synchronization is typically run on a schedule by using the Duet Enterprise Profile Synchronization Timer Job. You decide how often to synchronize roles and how many users to import at a time.

    After roles are synchronized with the SharePoint user profile store, users and administrators can grant access to SharePoint securable objects by using SAP roles. Before this capability is available, a SharePoint farm administrator needs to activate the Duet Enterprise SAP Roles Claims Provider feature at the farm level, which makes the claims provider available.
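
    A minimal sketch of that activation step from the SharePoint Management Shell. The wildcard lookup is an assumption to avoid guessing the exact feature name; review the matched feature(s) before enabling them.

      # Find the farm-scoped Duet Enterprise feature(s) and activate them.
      Get-SPFeature | Where-Object { $_.DisplayName -like "*Duet*" -and $_.Scope -eq "Farm" } |
          ForEach-Object { Enable-SPFeature -Identity $_.Id }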

    Note:
    When a user’s role is changed in the SAP system, the change can take some time (up to 10 hours) to be propagated to the SharePoint system. This might temporarily prevent users from being authorized if their roles have changed since the last sync job.

    Business Connectivity Services (BCS) client side logging for Office 2010 when working with Duet Enterprise

    Use tracing on the client (SharePoint Server 2010) 
    http://technet.microsoft.com/en-us/library/ff700209.aspx


    Here are the condensed steps that I use on the client when troubleshooting issues related to taking Duet Enterprise lists offline. 
    NOTE: You must be an Administrator on the computer to use the tracing.

    Create the Data Collector Set:

    1. Launch Perfmon (Start –> Run –> Perfmon) 
    2. Expand Data Collector Sets 
    3. Right-click on “User Defined” and choose “New Data Collector set” 
    4. Give it a name; I use "BCS" 
    5. Choose the “Create Manually” option and click Next 
    6. Check the box labeled “Event trace data” and click Next 
    7. Next to the Providers box, click the “Add…” button and wait for the list to load. 
    8. Select the item named “Microsoft-Office-Business Connectivity Services” and click “OK” 
    9. Leave everything at the default values and click “Finish” 
    10. You should now see your “BCS” Data Collector Set listed under “User Defined” in Perfmon.

     

    Start collecting the trace information:

    1. Select the “BCS” data collector set you created previously. 
    2. Click the “Start the Data Collector Set” button. 
    3. Reproduce the issue. 
    4. Click the “Stop the Data Collector Set” button to stop the trace. 
    5. A trace file with a .etl extension should be created in a path like this: 
        C:\PerfLogs\Admin\BCS\MACHINENAME_20110824-000001\DataCollector01.etl

    View the trace:

    1. Launch Event Viewer (Start –> Run –> eventvwr.msc) 
    2. Action menu  –> “Open Saved Log…” 
    3. In the “Open Saved Log” dialog, navigate to the .etl trace file you created earlier and choose Open. 
    4. When prompted to convert to the new event log format choose Yes. 
    5. Change the display name if you would like, then click OK 
    6. You should see the trace information in event viewer.
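
    A condensed command-line alternative to the Perfmon steps above (run from an elevated prompt); the provider name is the one selected in step 8 of the data collector setup:

      # Create, start and stop an ETW trace for the BCS provider, then open the .etl in Event Viewer.
      logman create trace BCS -p "Microsoft-Office-Business Connectivity Services" -o C:\PerfLogs\BCS.etl
      logman start BCS
      # ...reproduce the issue, then:
      logman stop BCS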

    How To : Customize the Duet Workflow Task form in InfoPath 2013

    Contents

    • Introduction
    • Displaying the SAP business properties
    • Adding and deleting controls on the form
    • Adding heading images to the form and applying a theme


    Introduction

    After a task site is published through SharePoint Designer, the task form ApprovalProcess.xsn is generated. The form has a default layout. But we may also want to display the SAP business properties, add some more controls relevant to the use of the form, or delete some irrelevant controls. We may also want to give a nice look and feel to the form. We can do these customizations easily with the help of InfoPath 2013.

    Scenario

    We have published a task site of the task type TestTask using SharePoint Designer 2013. The task form has the default layout shown below. We want to customize the form to include a SAP business property, add a control, remove a control, add a heading image, and apply a theme.

    Figure 1. TestTask task form with default layout

    Displaying the SAP business properties

    Prerequisite: We can display the SAP business properties in the workflow task form provided that we have included the properties in the Extended Business Properties text box while creating the task site.

    Steps:

    1. In SharePoint Designer, click ApprovalProcess.xsn.

    Figure 2. ApprovalProcess.xsn in SharePoint Designer

    InfoPath Designer opens with an auto-generated layout of the form.

    Figure 3. InfoPath Designer with auto-generated layout of the TestTask form

    2. Insert a new row, wherever you want, for the business property LeaveDaysUsedTillToday that you want to display in the form.

    a. In the first column, enter the field name as you want to see it displayed in the form, for example, Leaves Used Till Today.

    Figure 4. Entering field name in the first column

    b. In the second column, we need to get the value for the business property LeavesUsedTillToday from the Workflow Business Document Library. Thus, we need to create a secondary data connection with the Workflow Business Document Library.

    3. Create a secondary data connection with the Workflow Business Document Library as follows: 

    a. Under Actions, click Manage Data Connections.

    Figure 5. Clicking Manage Data Connections in InfoPath

    The Data Connections dialog box appears.

    Figure 6. Data Connections dialog box

    b. In the Data Connections dialog box, select Context  from the list of Data Connections for the form template, and click Add. The Data Connection Wizard starts.

    Figure 7. Data Connection Wizard

    c. Click Next without changing any settings. The wizard now asks for the source of data. Select SharePoint library or list as the source of data.

    Figure 8. Selecting SharePoint library or list in the Data Connection Wizard

    d. Click Next. The wizard now prompts you to enter the location of the SharePoint site.

    e. Enter the URL of the task site, and click Next.

    Figure 9. Entering task site location in the Data Connection Wizard

    f. Select the Workflow Business Data Document Library for the data connection, and click Next.

    Figure 10. Selecting Workflow Business Data Document Library for the data connection

    g. Select the Title and LeavesUsedTillToday fields, and click Next. The Title field will help us filter the data corresponding to a task from the Workflow Business Data Document Library. The Title field is the concatenation of the Related Content field in the main data connection and the string “.xml“.

    Figure 11. Selecting the Title field and LeaveDaysUsedTillToday field

    h. Click Next.

    Figure 12. Data Connection Wizard

    i. Click Finish.

    Figure 13. Finishing the Data Connection Wizard

    j. Close the Data Connections dialog box that now has the data connection to the Workflow Business Data Document Library.

    Figure 14. Data Connections dialog box with the new data connection

    4. Click in the second column of the new row that we inserted in step 2. On the Home tab in the ribbon, click the Calculated Value (fx) button in the Controls pane.

    Figure 15. Choosing Calculated Value in InfoPath

    The Insert Calculated Value dialog box appears.

    5. Click the fx button next to the XPath text box.

    Figure 16. Insert Calculated Value dialog box

    The Insert Formula dialog box appears as shown in the following figure.

    6. Click Insert Field or Group.

    Figure 17. Insert Field or Group button

    The Select a Field or Group dialog box appears, as shown in the following figure.

    7. Click Show advanced view.

    Figure 18. Show advanced view link

    Now, we have the option to select the data connection also. 

    Figure 19. Select a Field or Group dialog box

    8. Select the secondary data connection to the Workflow Business Document Library from the drop-down list.

    Figure 20. Choosing a secondary data connection

    9. Expand the dataFields tree structure until you see LeaveDaysUsedTillToday. Select LeaveDaysUsedTillToday. Since we want to get only the business property for the corresponding task, we need to filter the data received from the data connection. Click Filter Data.

    Figure 21. Filter Data button

    10. The Filter Data dialog box appears. Click Add.

    Figure 22. Add button in the Filter Data dialog box

    The Specify Filter Conditions dialog box appears.

    Figure 23. Specify Filter Conditions dialog box

    11. Specify the filter conditions as follows:

    a. In the first drop-down list, choose Select a field or group.

    Figure 24. Choosing Select a field or group

    b. Choose Workflow Business Data Document Library as the data source.

    c. Expand the dataFields tree structure until you see Title. Select Title, and then click OK.

    Figure 25. Title in the Select a Field or Group dialog box

    d. In the second drop-down list in the Specify Filter Conditions dialog box, select is equal to.

    e. In the third drop-down list in the Specify Filter Conditions dialog box, select Use a formula.

    Figure 26. Selecting Use a formula in the list

    f. The Insert Formula dialog box opens. Click Insert Function.

    Figure 27. Insert Function button

    The Insert Function dialog box opens.

    Figure 28. Insert Function dialog box

    g. Select Text in the Categories list, and then select concat in the Functions list. Click OK.

    Figure 29. Choosing category and function

    The formula corresponding to the selection appears in the Insert Formula dialog box.

    Figure 30. Concat Formula prototype (skeleton) in the Insert Formula dialog box

    h. Double-click the first argument in the concat function. The Select a Field or Group dialog box opens. Under the Main data connection, expand the dataFields tree structure till you see Related Content. Select the subfield :Description under Related Content. Click OK.

    Figure 31. :Description subfield under Related Content

    i. Write the string “.xml” as the second argument in the concat function. Delete the comma following the second argument and the third argument.

    The updated formula is as shown in the following figure. Click OK.

    Figure 32. Updated concat formula (with provided arguments) in Insert Formula dialog box

    j. Click OK in the dialog boxes in the order: Specify Filter Conditions, Filter Data, Select a Field or Group.

    The final overall formula appears in the Insert Formula dialog box. (This dialog box was opened in step 5 and is still open)

    Figure 33. Final formula in the Insert Formula dialog box

    12. Click OK in the Insert Formula dialog box (shown above) to return to the Insert Calculated Value dialog box, where the XPath corresponding to our selections has been updated.

    Figure 34. Updated XPath in the Insert Calculated Value dialog box

    Click OK.

    13. Click the File tab on the ribbon. Click Quick Publish.

    Figure 35. Publish your form

    14. The Save As dialog box opens.

    Figure 36. Saving the form template

    15. Click Save. The Microsoft InfoPath dialog box stating the successful publishing of the form template appears. Click OK.

    Figure 37. Form template published successfully

    16. Open the task site and look up any of the tasks. The task appears as shown in the following figure. The SAP business property LeavesUsedTillToday has the value 10 in this task.

    Figure 38. Task on the task site with LeavesUsedTillToday business property

    Adding and deleting controls on a form

    Suppose we want to add a control—for example, ID—from the main data connection to the workflow task form, and delete the control Consolidated Comments from the form.

    1. Insert a new row for the ID field.

    Figure 39. Inserting a new row for the ID field

    2.  Drag the ID field from the Fields task pane onto the canvas. The label for the control appears automatically in the left column when you drag the field into the right column of the table. However, this is true only if you highlight both columns when you release the mouse.

    Figure 40. ID field in right column

    3. Delete the row containing the control for Consolidated Comments.

    Figure 41. Consolidated Comments row deleted

    4. Click the File tab on the ribbon. Click Quick Publish. The Microsoft InfoPath dialog box stating the successful publishing of the form template appears.

    5. Click OK.

    6. Open the task site and look up any of the tasks. The task appears as shown in the following figure. The task has the ID field with value 1 and no Consolidated Comments control.

    Figure 42. Task on the task site with ID control and without Consolidated Comments control

    Adding heading images to the form and applying a theme

    1. Place your cursor in the title area of the page layout. Add a title—for example, MyTask—in the required format and font.

    Figure 43. Title area of the page layout

    2. Add the heading image to the form by inserting a picture from the Insert tab on the ribbon.

    3. On the Page Design tab, apply the Professional – Standard theme. The easiest way to select the theme is to expand the Themes gallery by clicking the arrow at the lower-right corner. Professional – Standard is the first theme in the Professional section.

    Figure 44. Page Design tab

    The title, heading image, and page design should now resemble the following figure.

    Figure 45. New title, heading image, and page design

    4. Click the File tab on the ribbon. Click Quick Publish. The Microsoft InfoPath dialog box stating the successful publishing of the form template appears.

    5. Click OK.

    6. Open the task site and look up any of the tasks. The task appears as shown in the following figure. The task has the desired heading image and theme.

    Figure 46. Task on the task site with desired heading image and theme

    2 Great Books to start with for Developers who want to learn SAP NetWeaver

     

    In this blog I want to show you some good books which I recommend for developers looking to specialise in SAP NetWeaver


    SAP NetWeaver: The Official Guide


    Link to SAP Press

    Everything you want to know about SAP NetWeaver you can read in this book; it is a must for beginners. Four detailed customer examples show which technologies can be included in NetWeaver. The book is mostly theoretical, with no difficult program code included. If you want to program for SAP NetWeaver, first read this book to understand what NetWeaver can do.

    Developer’s Guide to SAP NetWeaver Portal Applications


    Link to SAP Press

    Developing components for SAP NetWeaver Portal is not an easy task. If you want to know how to do this, I recommend this book. For a quick start, head to chapter 10; there are some good examples with many screenshots and good instructions. You learn to work with the SAP NetWeaver Developer Studio, create components, and deploy them to the SAP NetWeaver Portal.

    Using SharePoint FAST to unlock SAP data and make it accessible to your entire business

    An important new mantra is search-driven applications. In fact, “search” is the new way of navigating through your information. In many organizations an important part of the business data is stored in SAP business suites.
    A frequently asked need is to navigate through the business data stored in SAP, via a user-friendly and intuitive application context.
    For many organizations (78% according to Microsoft numbers), SharePoint is the basis for the integrated employee environment. Starting with SharePoint 2010, FAST Enterprise Search Platform (FAST ESP) is part of the SharePoint platform.
    All analyst firms assess FAST ESP as a leader in their scorecards for Enterprise Search technology. For organizations that have SAP and Microsoft SharePoint administrations in their infrastructure, the FAST search engine provides opportunities that one should not miss.

    SharePoint Search

    Search is one of the supporting pillars in SharePoint. And an extremely important one, for realizing the SharePoint proposition of an information hub plus collaboration workplace. It is essential that information you put into SharePoint is easy to find again.

    By yourself of course, but especially by your colleagues. However, from the context of ‘central information hub’, more is needed. You must also be able to find and review, via the SharePoint workplace, the data that is administered outside SharePoint. Examples are the business data stored in Lines-of-Business systems [SAP, Oracle, Microsoft Dynamics], but also data stored on network shares.
    With the purchase of FAST ESP, the search power of Microsoft's SharePoint platform sharply increased. All analyst firms consider FAST, along with competitors Autonomy and Google Search Appliance, as ‘best in class’ for enterprise search technology.
    For example, Gartner positioned FAST as leader in the Magic Quadrant for Enterprise Search, just above Autonomy. In SharePoint 2010 context FAST is introduced as a standalone extension to the Enterprise Edition, parallel to SharePoint Enterprise Search.
    In SharePoint 2013, Microsoft has simplified the architecture. FAST and Enterprise Search are merged, and FAST is integrated into the standard Enterprise edition and license.

    SharePoint FAST Search architecture

    The logical SharePoint FAST search architecture covers two main responsibilities:

    1. Build the search index administration: in bulk, automatically index all data and information which you want to search later. Depending on the environmental context, the data sources include SharePoint itself, administrative systems (SAP, Oracle, custom), file shares, …
    2. Execute search queries against the accumulated index administration, and expose the search results to the user.

    In the indexation step, SharePoint FAST must thus retrieve the data from each of the linked systems. FAST Search supports this via the connector framework. There are standard connectors for (web) service invocation and for database queries. It is also supported to custom-build a .NET connector for other ways of unlocking an external system, and then plug this connector into the search indexation pipeline. Examples are connecting to SAP via RFC, or ‘quick-and-dirty’ integration access into an internally built system.
    In this context of searching (or better: finding) SAP data, SharePoint FAST supports the indexation process via Business Connectivity Services, which connects to the SAP business system from the SharePoint environment and retrieves the business data. What still needs to be arranged is the runtime interoperability with the SAP landscape: authentication, authorization and monitoring.
    An option is to build these typical plumbing aspects into a custom .NET connector. But this is not an easy matter. More significantly, it is something that end-user organizations nowadays no longer aim to do themselves, due to the development and maintenance costs involved.
    An alternative is to apply Duet Enterprise for the plumbing aspects listed. Combined with SharePoint FAST, Duet Enterprise plays a role in two ways: (1) First upon content indexing, for the connectivity to the SAP system to retrieve the data.
    The SAP data is then available within the SharePoint environment (stored in the FAST index files). Search query execution next happens outside of (without a link into) SAP. (2) Optionally you go from the SharePoint application back to SAP if the use case requires that more detail be exposed per SAP entity selected from the search result. An example is a situation where it is absolutely necessary to show the actual status, such as for a product in the warehouse: how many orders have been placed?
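
    As an illustration of the indexing side, the sketch below registers a Business Connectivity Services content source for crawling in the SharePoint search service application. The content source name and the LOBSystemSet values are placeholders; the actual names come from the imported BDC model of your Duet Enterprise / SAP connection.

      # Hook the BDC (Duet Enterprise) external system into the search crawl configuration.
      $ssa = Get-SPEnterpriseSearchServiceApplication
      New-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
          -Name "SAP via Duet Enterprise" -Type Business `
          -BDCApplicationProxyGroup (Get-SPServiceApplicationProxyGroup -Default) `
          -LOBSystemSet "SAPSystem","SAPSystemInstance"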

    Security trimmed: Applying the SAP permissions on the data

    Duet Enterprise retrieves data under the SAP account of the individual SharePoint user. This ensures that also from the SharePoint application you can only view those SAP data entities to which you have rights according to the SAP authorization model. The retrieval of detail data is thus only allowed if you are allowed to see that data in the SAP system itself.

    Due to the FAST architecture, matters are different with search query execution. As mentioned, the SAP data has by then already been brought into the SharePoint context; there is no runtime link into the SAP system needed to execute the query. The consequence is that Duet Enterprise is not applied by default in this context.
    In many cases this is fine (for instance in the customer example described below); in other cases it is absolutely mandatory to also respect the specific SAP permissions at the moment of query execution.
    The FAST search architecture provides support for this by enabling you to augment the indexed SAP data with the SAP authorizations as metadata.
    To do this, you extend the scope of the FAST indexing process with retrieval of the SAP permissions per data entity. This meta-information is used to compile ACL lists per data entity. FAST query execution processes this ACL meta-information, and checks for each item in the search result whether it is allowed to be exposed to this SharePoint [SAP] user.
    This approach of assembling the ACL information is a static snapshot of the SAP authorizations at the time of executing the FAST indexing process. In case the SAP authorizations are dynamic, this is not sufficient.
    For such situations it is required that, at the time of FAST query execution, the SAP authorizations that apply at that moment can be retrieved dynamically. The FAST framework offers an option to achieve this. It does require custom code, but this is then plugged into the standard FAST processing pipeline.
    SharePoint FAST combined with Duet Enterprise thus provides standard support and multiple options for implementing SAP security trimming. And in typical cases the standard support is sufficient.


    Applied in customer situation

    The above is not only theory; we actually applied it in practice. The context was opening up SAP Enterprise Learning functionality so that employees can operate it from their familiar SharePoint-based intranet. One of the use cases is that an employee searches the course catalog for a suitable training.

    This is a striking example of a search-driven application. You want a classified list of available courses, the ability to zoom in to relevant trainings through refinement, and, per applied classification and refinement, to see how many trainings are available. And of course you also always want the ability to freely search the complete texts of the courses.
    In the solution we make the SAP data available for FAST indexation via Duet Enterprise. Duet Enterprise here takes care of the connectivity, Single Sign-On, and the feed into SharePoint BCS. From there FAST takes over. Indexation of the exposed SAP data is done via the standard FAST index pipeline; searching and displaying the search results is done via the standard FAST query execution and display functionality.
    In this application context, specific user authorization per SAP course element does not apply. Every employee is allowed to find and review all training data. As a result we could suffice with the standard application of FAST and Duet Enterprise, without the need for additional customization.

    Conclusion

    Microsoft SharePoint Enterprise Search and FAST are both very powerful tools for making SAP business data (and other Line-of-Business administrations) accessible. The rich feature set of FAST ESP thereby makes it possible to offer your employees an intuitive, search-driven user experience on top of the SAP data.

    SharePoint Development roles urgently need to be filled at an MS Gold Partner – Contact me now for more information (Sorry, no recruiters, I am filling private positions)

    Senior SharePoint Developers needed urgently for MS Gold Partner in Sandton/Bryanston :

    3 – 5 years of development experience.

    2 years of experience in SharePoint.

    3 years experience in C#.

    A minimum of 3 years experience in Visual Studio .NET 2005 – 2008.

    A minimum of 3 years experience in ASP.NET , HTML web development.

    A minimum of 3 years experience with Javascript.

    A minimum of 3 years experience with Windows XP, Windows 2003 and Windows Vista.

    A minimum of 3 years experience in relational database design and implementation with SQL Server.

    Advantageous (nice-to-have):

    • Windows SharePoint Server.
    • Microsoft Office SharePoint Server.
    • BizTalk
    • Web Analytics
    • Microsoft CRM
    • K2

    Duet Enterprise and NetWeaver – Features and Benefits for businesses of integrating SAP with SharePoint


    For years organisations have been scratching their heads trying to figure out how to provide collaboration capabilities on data held within SAP systems.

    Many have developed custom solutions and achieved mixed results. We have also seen the rise of Duet 1.x from Microsoft and SAP that provided 11 specific solutions that surfaced data residing in SAP via Microsoft Office. These were good solutions but limited due to their lack of extensibility.

    Hence the excitement over Duet Enterprise and the benefits that have been realised are exactly what everyone has been looking for. InfoSys Technologies have just released a case study with Microsoft that discusses their Duet Enterprise project and it is an interesting read. Particularly the benefits realised:

    1. Minimal Development Effort
    2. Higher Adoption

    3. Enhanced Productivity

    How can these benefits be real?

    Well, to start with, Duet Enterprise provides all the relevant plumbing to blend both SAP and SharePoint “Platforms”. SAP and Microsoft position Duet Enterprise as the “Foundation” layer between the two platforms.

    DE Architecture

    Architecturally speaking, the Duet Enterprise add-ons must be installed on the SAP NetWeaver 7.02 servers and on the SharePoint 2010 farm. The SAP NetWeaver 7.02 servers work as the gateway to the SAP backend systems and can actually support any version of SAP system. Duet Enterprise is built on SharePoint 2010 and makes use of Business Connectivity Services, which enables full Create Read Update Delete (CRUD) operations on data residing in external systems. The options for user experience are the same as for any SharePoint 2010 solution: Mobile, Office or a Browser. There are no SAP or Duet Enterprise clients.

    Ok, but what does this do?

    This provides organisations with a jointly supported (Microsoft and SAP) technical approach and OOB tools to address the common challenges faced when integrating SAP systems, such as:

    Authentication: Using Claims-Based authentication to ensure SSO is available between SharePoint 2010 sites and SAP backend systems.

    Authorisation: Able to secure objects in SharePoint using SAP Roles, supporting concepts such as a Manager seeing more than an employee.

    Monitoring: Duet Enterprise SharePoint Health Rules are provided to proactively monitor your Duet Enterprise solutions. Also available is the Duet Enterprise Management Pack for System Center Operations Manager; a quick video here shows it in action.

    It is also worth taking a look at the Duet Enterprise Architecture for more information.

    These are just some of the behind the scenes aspects of Duet Enterprise that essentially provide a jump start in any SAP/SharePoint integration project. However, there is also another layer that exemplifies how data from SAP can be surfaced through SharePoint 2010 and Office 2010. (Office 2010 isn’t a pre-req but certainly a better experience!).

    Currently included are:

    1. Duet Enterprise Workflow
    2. Duet Enterprise Profile
    3. Duet Enterprise Collaboration
    4. Duet Enterprise Sites
    5. Duet Enterprise Reporting

    All of these provide a way to jump-start development of a solution that can be used OOB or easily extended to suit specific requirements. The result is that organisations are able to surface data typically locked away in backend SAP systems to a much wider audience in an inexpensive way. I may blog in the future on what each of these is and how to extend them.

    Once deployed, IT departments have a “Foundation” in place to support future extensibility. Once users start seeing what is possible I fully expect the flood gates to open with requests for composite applications.

    What else do you get?

    1. Long-term product roadmap commitment from SAP and Microsoft
    2. Platform for innovation: broad ecosystem of ISV and service partners offering Business Pack Solutions
    3. Partner solutions/apps certified by SAP

    Tip: Bypass WebProxy for BCS service application in Duet Enterprise landscape

    Setting up a fresh Duet Enterprise landscape, I was confronted with an issue trying to import BDC Models from the SAP Gateway system into SharePoint BCS:
    Application definition import failed. The following error occurred: Error loading url: “http://&#8230;.”. This normally happens when url does not point to a valid discovery document, or XSD schema.
    Using Fiddler I detected that the problem cause is a “(407) Proxy Authentication Required” issue: “The ISA server requires authorization to fulfill the request. Access to proxy filter is denied.” Although I did setup a rule in Windows CredentialsManager for automatic authentication against the web proxy, this is not picked up in the context of BCS service application as an autonomous running process. As it turns out, by default .NET web applications and services will attempt to use a proxy, even if it doesn’t need one.
    So how to resolve this situation? Multiple approaches are possible:

    1. Explicitly set the Proxy Credentials for the BCS application process. It is not possible to set the proxy credentials directly in the web.config of 14hive\webservices\bdc. Instead you must use a 2-step delegation approach: refer in the web.config to a custom Proxy module implementation, and build the custom Proxy to explicitly set the proxy credentials:
      using System;
      using System.Net;

      namespace ByPassProxyAuthentication
      {
          // Custom IWebProxy that supplies explicit credentials for the web proxy.
          public class ByPassProxy : IWebProxy
          {
              public ICredentials Credentials
              {
                  // Replace with the actual proxy account details.
                  get { return new NetworkCredential("username", "password", "domain"); }
                  set { }
              }

              // IWebProxy also requires these two members; the proxy address below is a placeholder.
              public Uri GetProxy(Uri destination) { return new Uri("http://proxyserver:8080"); }
              public bool IsBypassed(Uri host) { return false; }
          }
      }

      In the web.config of 14hive\webservices\bdc, the defaultProxy element then refers to this module (the assembly name is illustrative):

          <defaultProxy enabled="true" useDefaultCredentials="false">
            <module type="ByPassProxyAuthentication.ByPassProxy, ByPassProxyAuthentication" />
          </defaultProxy>
    2. Disable usage of the (default) proxy altogether for the BCS application process. This is a viable approach in case the consumed external systems are all within the internal company network infrastructure.
      <system.net>
        <defaultProxy
          enabled="false"
          useDefaultCredentials="false" />
      </system.net>
    3. Disable usage of (default)proxy for specific addresses for the BCS application process.
      <system.net>
          <defaultProxy>
              <bypasslist>
                  <add address="[a-z]+\.contoso\.com" />
                  <add address="192\.168\..*" />
                  <add address="Netbios name of server" />
              </bypasslist>
          </defaultProxy>
      </system.net>

      The first entry bypasses the proxy for all servers in the contoso.com domain; the second bypasses the proxy for all servers whose IP addresses begin with 192.168. The third bypass entry is for the server’s NetBIOS name.

    4. Disable usage of the proxy for specific addresses at system level. This is in fact the simplest approach: just disable proxy usage for certain URLs for all processes on the system. That is also the potential disadvantage; it may not be allowed to disable proxy usage for all processes. You disable the proxy via IE \ Internet Options \ Connections \ LAN Settings \ Advanced \ Proxy Server \ Exception <Do not use proxy server for addresses beginning with>.
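    Whichever approach you choose, it can help to verify which proxy (if any) the .NET default configuration actually resolves for the Gateway URL. Below is a minimal console sketch for that check; the Gateway address used here is a placeholder, not the real endpoint.

      using System;
      using System.Net;

      class ProxyCheck
      {
          static void Main()
          {
              // Placeholder SAP Gateway address; replace with the real endpoint.
              var target = new Uri("http://sapgateway.example.com/sap/opu/odata/");

              // WebRequest.DefaultWebProxy reflects the effective defaultProxy configuration.
              Uri resolved = WebRequest.DefaultWebProxy.GetProxy(target);

              // GetProxy returns the target itself when the address is bypassed.
              Console.WriteLine(resolved == target
                  ? "No proxy will be used for this address."
                  : "Requests will go through proxy: " + resolved);
          }
      }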

    Example of how to use the SAP NetWeaver Gateway in building a Cloud App

    Overview

    SAP provides a tool called SAP NetWeaver Gateway that enables the ability to expose SAP application data as an OData service. This OData service can then be used by a CBA to create custom line of business apps. SAP has several sample gateway services you can use for testing and app building. For our example, we will use the SAP Enterprise Procurement Model (EPM) service. Read the SAP documentation to learn how to access the EPM service and other sample services from SAP. Be aware that these sample services are read-only; however, NetWeaver Gateway does support read-write services.

    Our SAP CBA app will be based on a fictional company that sells computers and accessories. This company has several locations worldwide, including a distribution branch that we will be building a line of business app for, named Contoso Shipping Management. Specifically, our app will help the branch manager of Contoso Shipping Management with their daily tasks. The branch manager routinely views product information in the system and adds supplemental product information that is specific to their branch (such as the item location and whether items are out of stock).

    Define the data model

    Begin by creating a CBA app in Visual Studio: choose the Cloud Business App project template under the Office/SharePoint > Apps node.

    Attach to SAP Data Source

    When you have created the app, attach it to the SAP service.

    1. In the Server Explorer, under the Server project, choose Data Sources > Add Data Source.
    2. In the Attach Data Source Wizard, notice the option to select SAP as a data source; after selecting it, choose Next.
      clip_image001 Figure 1. Select SAP in the Attach Data Source Wizard
    3. On the Enter Connection Information page, enter the URL to the SAP EPM service along with the credentials that you received after signing up for access to their test feeds; choose Next. Although it is possible to select None for the authentication type, typically SAP feeds are configured to require authentication (CBA apps currently support connecting to SAP using basic authentication). For more information, see the Authentication section listed at the end of this post.
      clip_image003 Figure 2. Enter connection information in the Attach Data Source Wizard
    4. On the Choose your Entities page, select the BusinessPartner and Product entities and rename the data source to SAP_EPM_Service; choose Finish.
      clip_image005 Figure 3. Select the BusinessPartner and Product entities in the Attach Data Source Wizard

    As a result, you will now see the SAP_EPM_Service added as a data source to your Server project including the BusinessPartner and Product entities that you selected.

    clip_image007 Figure 4. Entity Designer showing the Product entity

    You can use the ctrl + up arrow\down arrow keys to change the order of the properties. It is useful to define the desired order on the entity, so that later when screens are created, the fields on the screen will automatically be added in the same order. For example, you may change the order of the properties so that ProductId, Name, ProductURL, and Description appear first.

    One feature of an SAP data source within a CBA is the recognition of certain SAP-specific annotations that can adorn entity properties within the service. Specifically, the annotations that will be recognized by a CBA are those that have the sap:semantics value set to “email”, “tel”, or “url”. The BusinessPartner entity that was selected in the Attach Data Source Wizard happens to have properties that demonstrate all three of these annotations. You can view the semantic annotations by viewing the $metadata from the SAP feed.

    https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV/$metadata

    clip_image008

    Viewing the BusinessPartner entity in the Entity Designer, observe that the EmailAddress, PhoneNumber, and WebAddress fields have their respective types set to the Email Address, Phone Number, and Web Address business types.

    clip_image010 Figure 5. Properties on the BusinessPartner entity have been set to the appropriate business type

    Extend the Product Entity Properties

    For our line of business app, we need to track some additional product information that is specific to our branch, Contoso Shipping Management. With a CBA, we can easily extend any entity properties by relating data from the internal database of the app with data from an external data source. Furthermore, CBAs support relating data between external data sources, such as SharePoint\Office 365 and SAP.

    Add a Relationship

    In our example, we would like to track two additional pieces of product information: the item location and whether a product is out of stock. First, we need to add an entity to the internal database of our app that will be used to store this additional information. Second, we will relate this entity to the Product entity (that exists in the SAP data source) using a one-to-one or zero-to-one relationship.

    1. To add an entity in the internal database, choose the Data Sources node in the Solution Explorer and select Add Table.
    2. Rename the table to ProductDetail and add the following properties: OutOfStock and BackroomLocation. Clear the Required check box for these properties.
      clip_image011 Figure 6. The ProductDetail entity
    3. Add a relationship between Product and ProductDetail. To do this, open Product in the entity designer and choose the Add: Relationship button.
      clip_image013 Figure 7. Adding a relationship
    4. In the Add New Relationship dialog box, add a relationship so that each Product can have one ProductDetail (and a ProductDetail must have a Product). Choose OK to close the dialog box.
      clip_image014 Figure 8. Configuring the relationship

    Create the client screens

    Now that the data model is defined, add some screens to the app. While working with the screens, remember that the sample SAP service that we are using is read-only. As a result, the only data that we can edit is the ProductDetail entity because it is stored in the internal database of the app. If instead we were using a read-write SAP service, we would be able to edit all of the information on these screens and automatically save the data back to SAP.

    Create the Common Screen Set

    1. Choose Screens node in the Solution Explorer and select Add Screen.
    2. In the Add New Screen dialog box, select the Common Screen Set template and set the Screen Data to SAP_EPM_Service.ProductCollection. Finally, choose OK to close the dialog box.
      clip_image016 Figure 9. Adding a new Common Screen Set

      As a result, you will now see a ProductCollection folder created in the Solution Explorer that contains three screens: AddEditProduct, BrowseProductCollection, and ViewProduct.

    3. Since we defined a one-to-one or zero-to-one relationship between Product and ProductDetail, we need to add code that automatically creates a new ProductDetail entity when the AddEditProduct screen is opened. Choose the AddEditProduct screen from the Solution Explorer, choose the Write Code drop-down in the Screen Designer toolbar and choose created.
      clip_image018 Figure 10. Writing “created” code on the AddEditProduct screen

      To create a ProductDetail instance for this Product instance, use the following code.

      myapp.AddEditProduct.created = function (screen) {
        if (!screen.Product.ProductDetail) {
          var productDetail = myapp.activeDataWorkspace.ApplicationData.ProductDetails.addNew();
          productDetail.Product = screen.Product;
        }
      };

      Now when the AddEditProduct screen is opened, a related ProductDetail is created, if one does not already exist.

    4. To make the fields that were added from the ProductDetail entity more prominent on the screen, move the Out of Stock and Backroom Location controls to the top of the right Rows Layout group on this screen. Do the same on the ViewProduct screen as well. Notice that you can also change other appearance properties on controls, such as the font. There are many other properties that you can set on the screen designer to customize the screens.
    5. Also on the ViewProduct screen, drag out the Supplier field under our ProductDetail controls to show data from the BusinessPartner entity. This will create a group called Supplier (with properties from the related BusinessPartner entity). For this example, remove all the fields from this group except Email Address, Phone Number, and Web Address. Notice that these controls appear respectively as an Email Viewer, Phone Viewer, and Web Address Viewer (due to the annotations feature described earlier).
      clip_image020 Figure 11. Control layout
    6. Finally, we want to display the Product images on the ViewProduct screen. First change the Product Pic Url control to be an Image control. To do this, choose the ViewProduct screen in the Solution Explorer, then find the Product Pic Url control on the screen and change it from a Text control to an Image control.
      clip_image022 Figure 12. Changing Product Pic Url to an Image control

      Because this SAP service stores the image URLs in a relative format, we need to write more code to set the full URL to the image. With the Product Pic Url control still selected, choose the Edit PostRender Code link in the Properties window.

      clip_image024 Figure 13. Properties of the Product Pic Url control

      Add the following code to the PostRender method.

      myapp.ViewProduct.ProductPicUrl_postRender = function (element, contentItem) {
        // add the URL of our SAP server to the relative ProductPicUrl
        var totalUri = "https://sapes1.sapdevcenter.com" + contentItem.value;
        $(element).find("img").attr("src", totalUri);
      };

    Run the app

    Now, run the app (press F5).

    If prompted, enter your SharePoint credentials. When the app starts, also choose Trust It (if prompted).

    Notice the following:

    • The app home screen allows you to browse Product data from the attached SAP data source.
    • When you choose a Product, the detailed Product information is displayed—this includes fields from both the attached SAP data source and the fields defined on the intrinsic ProductDetails entity.
      clip_image026 Figure 14. The ViewProduct screen
    • On the ViewProduct screen, you can choose the Edit button to open the AddEditProduct screen. Since the particular SAP service that the app is accessing is a read-only service, the fields defined on the SAP Product entity cannot be updated, but those defined on ProductDetail can be. If your app is accessing a read-only service, it is a good idea to remove the Add button on the ViewProduct screen and make the controls on the AddEditProduct screen “view” controls instead of “edit” controls.
      clip_image028 Figure 15. The AddEditProduct screen

    Additional notes

    Authentication

    SAP can be configured with a variety of authentication providers. For this release, we support HTTP Basic authentication. Basic authentication is enabled on most NetWeaver Gateway installations, and is easy to configure in both test and production. If you find that CBAs aren’t working with your SAP environment, let us know how we can support you better in the future.

    For our debut of SAP support, we wanted to enable a complete read and write scenario with SAP data. Therefore, in addition to basic authentication support, we’ve also implemented the session and CSRF token handling that SAP requires to be able to modify SAP data via the NetWeaver Gateway. This means that your CBAs will be able to write changes back to SAP if your SAP feeds support it. Don’t worry—when you attach to SAP data, we negotiate the SAP sessions and tokens automatically. There is nothing for you to configure.

    Non-addressable entities

    If your data fails to load and you see a diagnostics error similar to the following, it’s because you’re attempting to navigate directly to an SAP entity that is non-addressable.

    Error: The SalesOrderLineItemCollection is not addressable. Please use the Navigation Property via the SalesOrder Collection or Entity.

    For example, in the EPM service, the SalesOrderLineItemCollection entity set is marked as non-addressable which is evident in the service root document (for example, https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV).

    clip_image030

    This means that SalesOrderLineItemCollection is a child of SalesOrderCollection and that you can only access it by navigating to it through the parent.

    To solve this, be sure that you do not have a Browse Data Screen that is bound directly to a non-addressable entity (for example, SalesOrderLineItem). Instead, bind the Browse Data Screen to the parent (for example, SalesOrder) and create a View Details Screen that displays the SalesOrderLineItems data for the selected SalesOrder. This is done for you if you use the Common Screen Set template and select the parent as the primary entity for the Browse Data Screen.

    Why integrate SAP with SharePoint? Find out why!

    Mobile First…

    Probably everyone has heard of “Mobile first”. This concept is not only something that SAP is promoting (Going Mobile at SAP), but also partners, customers and analysts like Gartner or Forbes are clearly supporting this strategy.

    The concept and idea are very clear: if you design applications to run on a mobile device with its small screen, then everything, the whole user interface, has to be simple and intuitive.

    One of the most prominent examples of this strategy is SAP Fiori. Based on SAPUI5, beautiful and intuitive applications are created that are extremely well accepted by customers.


    … does not mean mobile only

    In a customer presentation that I attended some time ago I also heard this feedback. SAP Fiori is an extremely powerful step by SAP to address workers across devices, whether they work on a mobile, tablet or desktop. However, the customer continued saying:

    “Mobile is sexy, but only 5% of our workers are using mobile devices (for their jobs). The majority of my users are still using a desktop”.

    I found this a very interesting statement. Obviously I am also using a mobile device, but looking at a “day in my life”, I start by using my mobile device and then use lots of other technologies to interact with SAP data, and in a lot of cases these technologies are somehow powered by Microsoft.

    “Don’t forget the desktop”

    DayInALife.png

    Still, the data that I access when working on my desktop is very often the same data that I need to access from my mobile device, so I would need to get SAP data into the tools running on my desktop. And the desktop is still dominated by Microsoft: depending on which numbers you want to believe, Windows is installed on 70% to 90% of desktops, and Microsoft Office runs on over 1 billion devices worldwide and is used in most organizations:

    OfficeUsage.png

    With this in mind, it is clear that “mobile first” does not mean “mobile only”.

    Why don’t we take the concepts and ideas, and indeed the applications and scenarios, and make them available on the Microsoft desktop?

    Using SAP data from within Microsoft Office

    When you get a Purchase Order or Credit Memo and you are already sitting in front of Microsoft Outlook writing an email, wouldn’t it be great to get this workflow directly as a task in Outlook and be able to approve it from there?

    MicrosoftOutlook-Approval.png

    For the business partners that you just worked with on your tablet in your SAP Fiori application, wouldn’t it be beneficial to have the very same contacts in your Microsoft Outlook?

    MicrosoftOutlook-Contacts.png

    Similarly, for the tracked time that you just viewed in your SAP Fiori My Timesheet app on your mobile device, wouldn’t it be good to have these entries in your Microsoft Outlook calendar as well?

    GWPAM+Fiori.png

    94% of companies use Microsoft Office

    Similar to Microsoft Windows, Microsoft Office is still the dominant productivity suite at our customers. Not necessarily always the latest and greatest version, but definitely Microsoft Office. So in addition to having the power and flexibility of SAP Fiori, an integration with Microsoft Office would absolutely make sense. We could not only offer an additional user interface by bringing SAP data to known and used Microsoft UIs, but also improve user productivity by keeping end-users in the UI that they are currently working with.

    Going beyond these existing scenarios, it can also make a lot of sense to empower users to interact with data from SAP in a way they are already used to. When you think of mass data manipulation, Microsoft Excel probably comes to mind. So again, this Microsoft-based tool could be the perfect choice for end-users to work with data from SAP.

    Excel.png

    The challenges in integrating SAP with the Microsoft world

    All these integrations have one fundamental problem: you have to make sure that the integration, the interoperability between these well-known applications and data from SAP, is easy, quick and secure. A recent study by IDC stated that more and more hobby developers build applications. Developing applications is getting easier and easier with impressive and powerful tools such as Visual Studio. However, developing and designing applications for real-life use can be complicated. Gartner looked at the overall total cost of ownership of such applications. A very important factor in the total cost of ownership is the initial design of the application. How easily is the data consumed? How flexible is the application? How are security concerns like SSO, tracing, monitoring and scalability aspects handled?

    Keeping the hobby developers in mind, and also surveys that show that only very few developers have security and “enterprise ready” know-how, this is an issue.

    Nexus of forces

    The “nexus of forces” which Gartner started to highlight some time ago is a similar example of this. To quote:

    “The Nexus of Forces is the convergence and mutual reinforcement of social, mobility, cloud and information patterns that drive new business scenarios. Although these forces are innovative and disruptive on their own; together they are revolutionizing business and society, disrupting old business models and creating new leaders. The Nexus is the basis of the technology platform of the future.”

    from http://www.gartner.com/technology/research/nexus-of-forces/

    This technology platform needs a basis that not only addresses these forces, but also makes sure this is done in a secure, enterprise-compliant and enterprise-ready manner.

    SAP NetWeaver Gateway productivity accelerator for Microsoft!

    Luckily this is exactly what SAP NetWeaver Gateway productivity accelerator for Microsoft addresses with its interoperability framework between SAP and Microsoft. Microsoft developers can stick to their existing knowledge. They can build applications for Microsoft Outlook, for Excel, for any .NET-based application. They can be hobby developers from the business who “just” leverage templates and tools from Visual Studio to build beautiful applications; SAP NetWeaver Gateway productivity accelerator for Microsoft takes care of the interoperability and enterprise-ready aspects.

    It gives IT the confidence that applications based on SAP NetWeaver Gateway productivity accelerator for Microsoft adhere to certain design patterns and comply with company-wide security and supportability requirements, and it helps applications scale from the very first POC for a few business users to the final roll-out to 10,000 users.

    With this framework, the business is all of a sudden empowered to respond to critical changes in the workplace at its own speed. It doesn’t have to rely on IT to help, because IT has already provided the required framework to make these changes happen.

    By providing a bridge that brings the two worlds together and optimizes the investments in Microsoft and SAP, SAP NetWeaver Gateway productivity accelerator for Microsoft boosts workplace agility. This engagement initiative gives the business the flexibility it needs to leverage existing skills, while still complying with corporate and IT regulations.

    How to successfully make the 2 worlds of SAP and SharePoint meet

    In essence, a ‘Duet Enterprise custom scenario’ development project is not different from other application development projects.

    The customer (either the real end-user, or product management for packaged software) has an idea about what to achieve, and functional plus non-functional (quality) requirements to underpin the idea.

    Often this idea is still vague, and the requirement set is far from complete and in some aspects even contradictory.

    Starting application development anyway with a waterfall approach (specify, build, test) typically ends up with wrong or failed results: the application does not live up to the real customer needs and expectations, and the budget is largely overrun as a consequence of responding to requirement changes.

    The agile movement arose to improve on this bad practice. Central to it is the structural involvement of the user (representative) during the full course of the development project.

    Continuously, the user is in the lead regarding the specifics and behaviour of the application functionality to be delivered.

    In Scrum, this responsibility is formalized within the product owner role.

    Application mockup
    A proven approach to enable the product owner to validate, correct, and ultimately confirm the application functionality is to make it experienceable, ‘touchable’.

    Provide the product owner with a realistic impression of the application behaviour and feature(s) that the project will build: a clickable prototype / mockup.

    Based on multiple positive experiences (also within non-Duet Enterprise projects, e.g. BI dashboards), we strongly believe in the added value and power of application mockups. We regard the role of the UX / UI design expert as crucial for customer involvement, and thus for delivering the correct application functionality.

    Meeting of 2 worlds

    Yet, a Duet Enterprise project is also very different from ‘normal’ projects.
    The cause lies in the meeting of two very different environments: IT technology, platform concepts, and human nature.

    The traditional SAP environment is transaction-oriented, with a focus on structured business process execution.

    The Microsoft environment, and in particular SharePoint, is far looser: ad-hoc processes are the norm, and the individual user decides how to perform a specific (business) action. When these two worlds and ways of thinking come together in a Duet Enterprise project, the first outcomes are often dramatic.

    SAP and SharePoint architects, business analysts and developers do not understand each other’s concepts.
    Both ‘parties’ disagree on how and where (that is, on the SAP or the SharePoint side) custom development should be done.

    The personal preference on how to achieve the SAP + SharePoint integration is influenced by one’s own technology frames and familiar concepts.

    This carries the risk that the user is forgotten: the user’s interests must be central, and an optimal user experience must be delivered.

    Align through a Blueprint of the Duet Enterprise scenarios

    To break through this, our Duet Enterprise development process includes a blueprint step that focuses exclusively on the Duet Enterprise specific scenarios.

    This blueprint is derived and agreed upon by a mixed group of SAP and SharePoint representatives. The goal of this step, and its end result, is to come to a mutual understanding of the solution direction for each of the Duet Enterprise scenarios.

    Self-imposed constraints are to stick as close as possible to the standards of both the SAP business package and the SharePoint platform. Where feasible, we do allow custom development.

    Here we apply the Duet Enterprise integration patterns that we have identified and defined: the Consumer-oriented, Provider-oriented, and XML-Funneling patterns.

    Of course, the discussion and preferences about where to make custom adjustments still manifest within the Duet Enterprise blueprinting step.

    However, as it now has the full focus of all involved, mutual agreement is reached earlier and more effectively. This is the so-called ‘workshop effect’.

    Also, in this focused approach, comparable application situations are easier to recognize. And where so identified, the earlier agreed solution approach can be reused.

    The starting point for the blueprint derivation is the User Interface Design as validated and confirmed by the product owner.
    In the blueprint we derive and define, per identified Duet Enterprise scenario, the following pieces:

    Global description of the scenario: screenshot of the mockup screen and textual explanation. The purpose is to facilitate, at a high level, a common understanding of the functionality in the scenario.

    There is no intent to compete with or replace use cases.
    Derivation, explanation and justification of the solution direction; at minimum, an explanation of the proposed solution direction.

    But whenever applicable, also the rationale of why an alternative solution direction is dropped.

    SAP building blocks: standard RFCs, BAPIs; identification of needed custom SAP building blocks.

    
    Blueprint validation by Design Authority / Authorities

    A justified mantra nowadays within SAP and Microsoft IT departments is to avoid unnecessary or otherwise avoidable custom development, and instead stick close to the standard delivered functionality of both landscapes.

    It is also evident that fully excluding custom development is not feasible when you are developing company-specific custom scenarios.

    We facilitate consistency and continuity in the Duet Enterprise solution directions by conforming the blueprint to Duet Enterprise Rules, Conventions and Guidelines.

    The Design Authorities of both the SAP and Microsoft IT departments validate the blueprint against these agreements. Deviation can be allowed, but requires the formal approval of the respective Design Authority (SAP, Microsoft, and in some cases both).

    SAP + SharePoint realizations based on the blueprint

    With the blueprint accepted and delivered, from here on the SAP and SharePoint teams in the project can start with their own development activities for the Duet Enterprise scenarios.

    Since representatives of the teams were involved in deriving the blueprint, the concrete software design and development can be done relatively independently of the other party.

    Of course the data contract details of each SAP / Duet Enterprise interface are input for the SharePoint consumption.

    Regularly sharing the SAP design with the SharePoint developers achieves this.

    Building solutions on Duet Enterprise – What does this mean exactly?

    Any kind of enterprise application can be built on top of Duet Enterprise, e.g. to support particular business activities for Product Lifecycle Management (PLM), Supply Chain Management (SCM), Customer Relationship Management (CRM), etc.

    However the idea here is not to rebuild application interfaces that already exist. Instead through Duet Enterprise we provide a set of application building blocks from which new kinds of solutions can be composed. We expect that these new kinds of solutions will enable collaboration, communications, or content management in Office and SharePoint in extension to the core SAP business processes.

    The consumers of these solutions would be information workers who need access to SAP information and services as they perform their business activities, or who participate in business processes managed in SAP, but are not typically power users of the SAP interfaces.

    duet5

    The building blocks that we will provide through Duet include both functional building blocks (e.g. application components for SAP data models like Customer, Product, Employee, etc.), as well as infrastructural building blocks (e.g. for Reporting, Workflow, Collaboration, etc.).

    These infrastructural building blocks are what Christian described as ‘Ready To Use’ capabilities in his blog posting earlier.

    So an example of a solution built on Duet Enterprise might be an extension to inventory management processes in SAP to allow ad-hoc collaboration and reporting in SharePoint around alerts and notifications raised from a warehouse or a factory floor.

    However while Duet Enterprise provides a new set of building blocks, it does not add a new set of tools. Solution builders can leverage existing skillsets, and use standard tools like Visual Studio, SharePoint Designer, and ABAP Workbench. They would use these tools to compose these building blocks into new kinds of solutions that support a particular type of business activity.

    This solution would typically combine ad-hoc collaboration in SharePoint with structured process in SAP, and bring in contextual business data and reports to support decision making for that activity.

    Let us take a specific example. In today’s enterprises there are typically virtual teams to service key customer accounts. A particular account v-team would include employees who report into different organizational units, e.g. Sales, Marketing, Support, etc., and who work out of different geographic locations. While organizationally separate, these v-teams need a way to collaborate among themselves, and to manage their working activities.

    Out-of-the-box with Duet Enterprise we will ship a set of composites, one of which is the customer collaboration workspace for account teams. This will enable any person with access to account information in SAP, to browse the list of customer accounts within SharePoint, and for any account request access to a collaboration workspace.

    This is a SharePoint site that gets auto-provisioned by the Duet Foundation the first time that the workspace is requested by a member of the account v-team, using the packaged composite that ships with Duet Enterprise. The site comes pre-wired with connections to SAP information and services.

    For example the account v-team members can view and edit account related information that is managed by SAP, as well as view reports related to the account, and can take for granted that the Duet Foundation provides the integrated security and administrative support which is necessary to support such activities.

    Now consider the design time experience to support the flow described above. The composite for the customer collaboration workspace which we provide in Duet Enterprise is a collection of building blocks that have been assembled together in a particular way. This provides a starting point for further customization and solution building.

    For example, there is a set of web services that are exposed from within the Duet Enterprise SAP Add-on (e.g. Customer, Sales Contact, … etc.), there is a set of External Content Types provided for SharePoint (that map to the SAP web services), and there is a  SharePoint site definition which provides a template for the customer collaboration workspaces which are auto-provisioned for v-team members.

    In addition there are SharePoint Features for the ready to use capabilities (e.g. Related Reports) that can be stapled to the Site Definition. So a solution builder can customize the composite by extending the web services and the external content types, by adding new SharePoint web parts or Features if needed, and by customizing the site definition files for the workspace, or even creating entirely new site definitions.

    Finally the solution builder leverages hooks in the Duet Foundation to plug in these new or customized composites.

    Duet Enterprise – Creating and Deploying a Mobile Adapter Class for Business Data Action Web Parts

    Learn how to create a mobile adapter class to display mobile views for Business Data Action Web Parts.

    In Microsoft SharePoint 2010, mobile views are available for both the Business Data Builder Web Part and the Business Data Item Web Part. In the Starter Services site of Duet Enterprise for Microsoft SharePoint and SAP, a mobile view is available for only the Business Data Item Web Part. You must define a mobile adapter class to make mobile views available for other Business Data Web Parts. This topic describes how to write a mobile adapter class for Business Data Action Web Parts.

    The procedures in this section describe how to create and deploy a mobile adapter class for displaying Business Data Action Web Parts.

    The following are the basic steps to create and deploy a mobile adapter class:

    1. Create a mobile adapter class for Business Data Action Web Parts.
    2. Edit the compat.browser file.
    3. Register your adapter as a safe control.

    To create a mobile adapter class for Business Data Action Web Parts

    1. In Microsoft Visual Studio 2010, create a new class library project named DuetMobileCustomization. Add references to the System.Web assembly and Microsoft.SharePoint.dll assembly.
    2. Add a using statement for the Microsoft.SharePoint.WebPartPages namespace. Depending on the details of your adapter implementation, add using statements for other namespaces. Commonly, mobile adapters make calls to types in the System.Web.UI.MobileControls namespace, Microsoft.SharePoint namespace, and Microsoft.SharePoint.MobileControls namespace.
    3. Add a class named WebPartClassMobileAdapter, where WebPartClass is a placeholder for the name of the Web Part that you are adapting. For example, if you are adapting the BusinessDataActionsWebPart, name the adapter class BusinessDataActionsWebPartMobileAdapter. This class should inherit from the WebPartMobileAdapter class.
    4. Add a namespace named MyCompany.SharePoint.WebPartPages.MobileAdapters. (Replace MyCompany with your company’s name.)
    5. Copy the following code into the new BusinessDataActionsWebPartMobileAdapter class.
      using System;
      using System.Collections;
      using System.Collections.Generic;
      using System.Globalization;
      using System.Security.Permissions;
      using System.Web;
      using System.Web.Security;
      using System.Web.UI.MobileControls;
      
      using Microsoft.BusinessData.MetadataModel;
      using Microsoft.BusinessData.MetadataModel.Collections;
      using Microsoft.BusinessData.Runtime;
      
      using Microsoft.SharePoint.Portal.MobileControls;
      using Microsoft.SharePoint.MobileControls;
      using Microsoft.SharePoint.Utilities;
      using Microsoft.SharePoint.WebControls;
      using Microsoft.SharePoint.WebPartPages;
      using Microsoft.SharePoint.Portal.WebControls;
      using Microsoft.Office.Server.Diagnostics;
      
      namespace Microsoft.SharePoint.WebPartPages
      {
          public class BusinessDataActionsWebPartMobileAdapter : WebPartMobileAdapter
          {
    6. Because the WebPartMobileAdapter.Control property cannot be overridden, you might have to create a custom version of it by hiding and replacing it. You can do this by declaring a new Control property in your derived class by using the new keyword, as shown in the following example.
      protected new BusinessDataActionsWebPart Control
      {
          [Microsoft.SharePoint.Security.SharePointPermission(System.Security.Permissions.SecurityAction.Demand, ObjectModel = true)]
          get { return base.Control as BusinessDataActionsWebPart; }
      }

      For more information about how to create a custom version of WebPartMobileAdapter.Control and hiding and replacing it, and why you might have to do this, see the Control property.

    7. If the default implementation of the CreateControlsForSummaryView method of the WebPartMobileAdapter class is not appropriate for mobile access to the Web Part in your specific implementation, override it. An override should create any necessary child controls and add them to the Controls collection in the order in which they should appear on a mobile device. The display should contain at least a small icon and a title for the summary view on a mobile device. If those are the only display elements that you must have, you do not have to override the CreateControlsForSummaryView method. The WebPartMobileAdapter class contains two helper methods you can use to create your display: CreateWebPartIcon() and CreateWebPartLabel(). When you must display more information than what appears in the default summary view (for example, when your adapted Web Part has multiple child items of the same type), you can add a count of the total number of children to the summary view by placing a Label control after the icon and title. The following code shows how to do this.
      Note: This example assumes that the custom Web Part that you are adapting has a Count property of type String that returns the total number of child items.
      protected override void CreateControlsForSummaryView()
              {
                  this.CreateControlsForWebPartHeader();
                  this.CreateControlsForBusinessDataActions();
              }
      
              private void CreateControlsForWebPartHeader()
              {
                  Image iconImage = this.CreateWebPartIcon(WebPartIconLink.NoLink);
                  iconImage.BreakAfter = false;
                  this.Controls.Add(iconImage);
                  this.Controls.Add(this.CreateWebPartLabel());
              }
      
              private void CreateControlsForBusinessDataActions()
              {
                  try
                  {
                      IList<string> result = this.Control.SelectedActions;
                      try
                      {
                          if (this.Control.BdcEntity != null)
                          {
                              IList<IAction> displayActions = GetActionsToDisplay(this.Control.BdcEntity);
                              result = new List<string>();
      
                              foreach (IAction action in displayActions)
                              {
                                  Link l = new Link();
                                  l.Text = action.Name;
      // This example does not create the Parameter value. 
      // Add logic to set the action of the parameter before storing this value in the Link Navigation URL.
                                  l.NavigateUrl = action.Url;
                                  this.Controls.Add(l);
                              }
                          }
                      }
                      catch (MetadataException)
                      {
                          // Metadata error. Just default to returning the Web Part's selected Actions.
                          result = this.Control.SelectedActions;
                      }
                  }
                    catch (Exception)
                    {
                        // Rethrow, preserving the original stack trace.
                        throw;
                    }
              }
      private IList<IAction> GetActionsToDisplay(IEntity entity)
              {
                  IList<IAction> result = new List<IAction>();
                  INamedActionDictionary namedActionDictionary = entity.GetActions();
                  if (namedActionDictionary.Count != 0)
                  {
                      // First add all the currently selected actions.
                      foreach (string selectedActionName in this.Control.SelectedActions)
                      {
                          if (namedActionDictionary.ContainsKey(selectedActionName))
                          {
                              result.Add(namedActionDictionary[selectedActionName]);
                          }
                          // else
      
                      }
      
                      // Action may not be in the SelectedActions list,
                      // but the Web Part is configured to display new actions and
                      // this action is not one of the explicitly de-selected ones. Add it to UI.
                      if (this.Control.DisplayNewActions)
                      {
                          foreach (IAction action in namedActionDictionary.Values)
                          {
                              if (!result.Contains(action)
                                  && !this.Control.DeselectedActions.Contains(action.Name))
                              {
      
                                  result.Add(action);
                              }
                          }
                      }
                  }
      
                  return result;
              }
      
              public void cmd_Click(object sender, EventArgs e)
              {
                  string CommandText = ((Command)sender).Text;
      
              }
    8. If the default implementation of CreateControlsForDetailView is not appropriate for mobile access to the Web Part in your specific implementation, override it. The default implementation renders an icon and title followed by a message that states that there is no detailed view for the Web Part. If you have overridden the CreateControlsForSummaryView method and do not want to provide a detailed view, override CreateControlsForDetailView and have it do nothing, as shown in the following example.
      protected override void CreateControlsForDetailView()
              {
                  // No Detail View
              }
    9. To change the icon that appears next to the Web Part title, override one or more of the following properties:
      • SummaryViewTitleIconUrl  The icon that appears next to the title when the Web Part is collapsed.
      • DetailViewTitleIconUrl  The icon that appears next to the title when the Web Part is expanded.
      • TitleIconUrl  The icon that appears next to the title when the mobile device does not support expand or collapse scripting.

      The code in the following example shows how to override the TitleIconUrl property. In this override, if the Web Part displays a list and the list has an icon of its own in its ImageUrl property, that icon is displayed.

      protected override string TitleIconUrl
      {
          get
          { 
              SPContext context = SPContext.GetContext(HttpContext.Current);
      
              if (String.IsNullOrEmpty(context.List.ImageUrl))
              {
                  return base.TitleIconUrl;
              }
              return context.List.ImageUrl;
          }
      }
    10. Compile the assembly, give it a strong name, and then deploy it either to the global assembly cache or to the \BIN folder of the Web application on every front-end web server in the farm. To deploy it to the global assembly cache, ensure that GlobalAssemblyCache is selected in the Assembly Deployment Target of the Properties pane of your class library project in Visual Studio 2010. This topic assumes that you are deploying to the global assembly cache.

    To edit the compat.browser file

    1. In a text editor, open the compat.browser file that is located at \\Inetpub\wwwroot\wss\VirtualDirectories\port_number\App_Browsers\compat.browser, where port_number is the port of the web application. Scroll to the <browser> element that has the refID attribute value of default. This element will have a child element named <controlAdapters> that looks much like the code in the following example.
      <controlAdapters>
        <adapter controlType="Microsoft.SharePoint.WebPartPages.XsltListViewWebPart, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
                 adapterType="Microsoft.SharePoint.WebPartPages.XsltListViewWebPartMobileAdapter, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />
        <adapter controlType="Microsoft.SharePoint.WebPartPages.ListViewWebPart, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
                 adapterType="Microsoft.SharePoint.WebPartPages.ListViewWebPartMobileAdapter, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />
        <adapter controlType="Microsoft.SharePoint.Applications.GroupBoard.WebPartPages.WhereaboutsWebPart, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
                 adapterType="Microsoft.SharePoint.Applications.GroupBoard.WebPartPages.WhereaboutsWebPartMobileAdapter, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />
        <adapter controlType="Microsoft.SharePoint.WebPartPages.ImageWebPart, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
                 adapterType="Microsoft.SharePoint.WebPartPages.ImageWebPartMobileAdapter, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />
      </controlAdapters>
    2. Add an <adapter> element as a child of the <controlAdapters> element. This child element maps your adapter class to the custom Web Part that it adapts. Notice that both the controlType attribute and adapterType attribute are required. The value for both should be the fully qualified name of the class and the four-part name of the assembly. To obtain your adapter assembly’s public key token, in Visual Studio 2010 on the Tools menu, click Get Assembly Public Key. For another way to obtain the public key token, see How to: Create a Tool to Get the Public Key of an Assembly (http://msdn.microsoft.com/en-us/library/ee539398.aspx). For more information about this XML markup, see Browser Definition File Schema (browsers Element) (http://msdn.microsoft.com/en-us/ms228122.aspx). The following code shows one example of an <adapter> element.
      <adapter controlType="Microsoft.SharePoint.Portal.WebControls.BusinessDataActionsWebPart, 
      Microsoft.SharePoint.Portal, 
      Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" 
      adapterType="Microsoft.SharePoint.WebPartPages.BusinessDataActionsWebPartMobileAdapter, MobileCustomization, 
      Version=1.0.0.0, Culture=neutral, PublicKeyToken=<assemblyPublic Key>" />
      Note: To deploy your adapter class to a server farm, you must change the compat.browser file as described earlier on every front-end web server. Do not overwrite the existing compat.browser file with a compat.browser file of your own because this might cancel adapter mappings that are made by other Microsoft SharePoint 2010 solution providers. Consider deploying the adapter as part of a SharePoint 2010 Feature. In the FeatureActivated event handler, create a timer job that adds the required <adapter> element to the compat.browser file on every front-end web server. For detailed information about programmatically editing the compat.browser file on all servers by using a timer job, see How to: Run Code on All Web Servers (http://msdn.microsoft.com/en-us/library/ff464297.aspx).

    To register your adapter as a safe control

    1. In your Visual Studio 2010 project, add an XML file named webconfig.CompanyName.xml, where CompanyName is the name of your company or another name that is not likely to be used by any other SharePoint Foundation 2010 solution providers.
      Tip: We recommend that you register your adapter by deploying it inside a SharePoint 2010 solution. The steps in this section are required only if your development computer is a single front-end web server. A SharePoint 2010 solution enables you to register controls as safe on all front-end web servers when your solution is deployed. For more information about using solution deployment to register controls as safe, see Solutions Overview (http://msdn.microsoft.com/en-us/library/aa543214.aspx), Manually Creating Solutions in SharePoint Foundation (http://msdn.microsoft.com/en-us/library/aa543741.aspx), and Solution Schema (http://msdn.microsoft.com/en-us/library/ms442108.aspx).
    2. Add an <action> element that follows the model in the example below to the file. The TypeName attribute of the <SafeControl> element can be the name of your adapter class, such as UserTasksWebPartMobileAdapter. If you have multiple adapter classes in the same namespace, you can use an asterisk (*) as the value of TypeName.
      <action>
         <add path="configuration/SharePoint/SafeControls">
          <SafeControl
      Assembly="MobileCustomization, Version=1.0.0.0, Culture=neutral, PublicKeyToken=<myPublicKeyToken>"
            Namespace="Microsoft.SharePoint.WebPartPages"
            TypeName="*"
            Safe="True"
            AllowRemoteDesigner="True"
          />
        </add>
      </action>
      Caution: Using an asterisk (*) as the value of TypeName makes every class in the namespace a safe control. If you have some classes in the assembly that should not be designated as safe, move them to a different assembly or avoid using the asterisk (*) value.

      For more information about the <SafeControl> element and web.config files, see How to: Create a Supplemental .config File (http://msdn.microsoft.com/en-us/library/ms439965.aspx) and Working with Web.config Files (http://msdn.microsoft.com/en-us/library/ms460914.aspx).

    3. Save the file. You must now copy it to the %ProgramFiles%\Common Files\Microsoft Shared\web server extensions\14\CONFIG folder on your development computer. The simplest way to do this on your development computer is to add the following lines to a post-build event command line or to a batch file script.
      xcopy /y webconfig.MyCompany.xml "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG"
      stsadm -o copyappbincontent
      Note: This code assumes that you have followed the recommendations in How to: Add Tool Locations to the PATH Environment Variable (http://msdn.microsoft.com/en-us/library/ee537574.aspx).

      The copyappbincontent Stsadm.exe command performs the action defined by the <action> element in your Web configuration .xml file. In this case, it inserts the new <SafeControl> element of your adapter into the web.config file at the root of the web application. It first removes any existing <SafeControl> elements for the adapter. (This lets you rerun the Stsadm command with every build without creating duplicate <SafeControl> elements.) For more information about Stsadm, see Stsadm command-line tool (http://msdn.microsoft.com/en-us/library/cc288981(office.12).aspx).

     


    Access SAP Business Data From Silverlight 4 Clients Using WCF RIA Services And LINQ to SAP

    The introduction of Microsoft’s WCF RIA Services for Silverlight 4 greatly simplified the development process of N-tier business applications using Silverlight and ASP.NET. By using this new technology, we can also easily access and integrate SAP business data in Silverlight clients.

    This article shows how to provide a SAP domain service as a web service that will be consumed by a Silverlight client. The sample application will allow the user to query customer data. The service uses LINQ to SAP from Theobald Software to connect to a SAP R/3 system.

    Project Setup

    The first step in setting up a new Silverlight 4 project with WCF RIA Services is to create a solution using the Visual Studio template Silverlight Navigation Application:

    Screenshot-01.png

    Visual Studio 2010 then asks you to create an additional web application, which hosts the Silverlight application. It’s important to select the checkbox Enable WCF RIA Services (see screenshot below):

    SAP2Silverlight/Screenshot-02.png

    After clicking the Ok button, Visual Studio generates a solution with two projects, one Silverlight 4 project and one ASP.NET project. In the next section, we will create the SAP data access layer using the LINQ to SAP designer.

    LINQ to SAP

    The LINQ to SAP provider and its Visual Studio 2010 designer offer a very handy way to design SAP interfaces visually. The designer will generate the code for the SAP data access layer automatically, similar to LINQ to SQL. The LINQ provider is part of the .NET library ERPConnect.net from Theobald Software. The company offers a demo version for download on its homepage.

    The next step is to create the needed LINQ to SAP file by opening the Add New Item dialog:

    Screenshot-03.png

    LINQ to SAP is internally called LINQ to ERP.

    Clicking the Add button will create a new ERP file and open the LINQ designer. Now, drag the Function object from the toolbox and drop it onto the designer surface. If you have not entered the SAP connection data so far, you are now asked to do so:

    Screenshot-04.png

    Enter the connection data for your SAP R/3 system and then click the Ok button. Next, search for and select the SAP function module named SD_RFC_CUSTOMER_GET. The function module provides a list of customer data.

    The RFC Function modules dialog opens and lets you define the necessary parameters:

    SAP2Silverlight/Screenshot-05.png

    In the above function dialog, change the method name to GetCustomers and mark the Pass checkbox for the NAME1 parameter in the Exports tab. Also set the variable name to namePattern. On the Tables tab, mark the Return checkbox for the table parameter CUSTOMER_T and set the table and structure name to CustomerTable and CustomerRow:

    SAP2Silverlight/Screenshot-06.png

    After clicking the Ok button and saving the ERP file, the LINQ designer will generate a SAPContext class which contains a method called GetCustomers with an input parameter named namePattern. This method executes a search for SAP customer data allowing the user to enter a wildcard pattern. The method returns a table of customer data:

    SAP2Silverlight/Screenshot-07.png
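    To give an impression of what the generated code enables, here is a minimal usage sketch. It assumes the designer produced a SAPContext class with the GetCustomers method and the CustomerTable/CustomerRow types configured above; the connection string values and the KUNNR field are illustrative only:

    // Minimal sketch, assuming the generated SAPContext exposes the
    // GetCustomers(string namePattern) method configured in the designer.
    // Connection string values are placeholders for your own system.
    SAPContext context = new SAPContext(
        "ASHOST=sapserver SYSNR=00 CLIENT=800 LANG=EN USER=demo PASSWD=secret");

    // Wildcard search: every customer whose name starts with "A".
    CustomerTable customers = context.GetCustomers("A*");

    foreach (CustomerRow row in customers)
    {
        // NAME1 comes from the designer settings above; KUNNR (customer number)
        // is an assumed additional field of the returned structure.
        Console.WriteLine("{0} - {1}", row.KUNNR, row.NAME1);
    }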

    At the LINQ designer level (click on an empty part of the designer surface), the property Create Object Outside Of Context Class must be set to True:

    Screenshot-08.png

    Now we finally add a Customer class, which we will use in our SAP domain service later on. This class and its values will be transmitted to the Silverlight client by WCF RIA Services. It’s important to set the Key attribute on the identifier fields for WCF RIA Services; otherwise the project will not compile:

    Screenshot-09.png
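    A minimal version of such a class could look like the following sketch; the property names other than the key are illustrative, and the essential part is the Key attribute from System.ComponentModel.DataAnnotations on the identifier:

    using System.ComponentModel.DataAnnotations;

    // Data transfer class sent to the Silverlight client by WCF RIA Services.
    // The [Key] attribute marks the unique identifier; without it the RIA
    // Services code generation fails and the project will not compile.
    public class Customer
    {
        [Key]
        public string CustomerNumber { get; set; }

        public string Name { get; set; }
        public string City { get; set; }
        public string Street { get; set; }
        public string PostalCode { get; set; }
    }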

    That’s it! We now have our SAP data access layer ready to use and can start adding the domain service in the next section.

    SAP Domain Service

    The next step is to add the SAP domain service to our web project. A domain service is a specialized WCF service and is one of the core constructs of WCF RIA Services. The service exposes operations that can be called from the generated client code. On the client side, we use the domain context to access the domain service on the server side.

    Add a new Domain Service Class and name it SAPService:

    Screenshot-10.png

    In the upcoming dialog, create an empty domain service class by just clicking the Ok button:

    SAP2Silverlight/Screenshot-11.png

    Next, we add the service operation GetCustomers to the SAP service with a name pattern parameter. The operation then returns a list of Customer objects. The Query attribute limits the result set to 200 entries.

    The operation uses the visually designed SAP data access logic to retrieve the SAP customer data. First of all, an instance of the SAPContext class will be created using a connection string (see sample in code). For more details regarding the SAP connection string, see the ERPConnect.net manual.

    The LINQ to SAP context class contains the GetCustomers method which we will call using the given namePattern parameter. Next, the operation creates an instance of the Customer class for each customer record returned by SAP.

    The license code for the ERPConnect.net library is set in the constructor of our domain service class.

    Screenshot-12.png
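    Since the screenshot can be hard to read, here is a rough sketch of what such a domain service might look like. The connection string, the license call, and the SAP field names other than NAME1 are assumptions rather than the article’s exact code, so check the ERPConnect.net manual for the precise API:

    using System.Collections.Generic;
    using System.Linq;
    using System.ServiceModel.DomainServices.Hosting;
    using System.ServiceModel.DomainServices.Server;

    [EnableClientAccess]
    public class SAPService : DomainService
    {
        public SAPService()
        {
            // License code for the ERPConnect.net library (placeholder value).
            ERPConnect.LIC.SetLic("your-license-code");
        }

        [Query(ResultLimit = 200)]
        public IQueryable<Customer> GetCustomers(string namePattern)
        {
            // Connection string values are illustrative only.
            SAPContext context = new SAPContext(
                "ASHOST=sapserver SYSNR=00 CLIENT=800 LANG=EN USER=demo PASSWD=secret");

            // Call the LINQ to SAP method designed earlier and map each SAP
            // record to the Customer class defined above.
            CustomerTable table = context.GetCustomers(namePattern);

            var customers = new List<Customer>();
            foreach (CustomerRow row in table)
            {
                customers.Add(new Customer
                {
                    CustomerNumber = row.KUNNR,
                    Name = row.NAME1,
                    City = row.ORT01,
                    Street = row.STRAS,
                    PostalCode = row.PSTLZ
                });
            }

            return customers.AsQueryable();
        }
    }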

    That’s all we need on the server side.

    In the next section, we will implement the Silverlight client.

    Silverlight Client

    The implementation of the client side is straightforward. The home view contains a DataGrid control to display the list of customer data, as well as a search area with TextBox and Button controls that allow users to enter a name search pattern.

    The click event handler of the load button, called OnLoadButtonClick, will execute the SAP service. The boilerplate code to access the web service was generated by WCF RIA Services in the subfolder Generated_Code in the Silverlight project.

    First of all, an instance of the SAPContext will be created. Then, we load the query GetCustomersQuery and execute the service operation on the server side using WCF RIA Services. If the domain service returns an error, the callback anonymous method will mark the error as handled and display the error message.

    If the execution of the service operation succeeded, the result set gets displayed in the DataGrid control.

    Screenshot-13.png
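    A sketch of such a handler is shown below; the control names (PatternTextBox, CustomersGrid) are assumptions, while SAPContext and GetCustomersQuery follow the naming that WCF RIA Services generates for the SAPService class:

    // In the code-behind of the Home view (sketch).
    private void OnLoadButtonClick(object sender, RoutedEventArgs e)
    {
        // Domain context generated by WCF RIA Services for SAPService.
        SAPContext context = new SAPContext();

        // Build the query using the name pattern the user entered.
        EntityQuery<Customer> query = context.GetCustomersQuery(PatternTextBox.Text);

        // Execute asynchronously; the callback runs when the server responds.
        context.Load(query, loadOperation =>
        {
            if (loadOperation.HasError)
            {
                // Mark the error as handled and show the message to the user.
                loadOperation.MarkErrorAsHandled();
                MessageBox.Show(loadOperation.Error.Message);
                return;
            }

            // Bind the returned customers to the DataGrid.
            CustomersGrid.ItemsSource = loadOperation.Entities;
        }, null);
    }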

    The next screenshot shows the final result:

    Screenshot-14.png

    That’s it.

    Summary

    This article has shown how easily SAP customer data can be integrated into Silverlight clients using tools like WCF RIA Services and LINQ to SAP. It is quite simple to extend the SAP service to integrate all kinds of operations.

    How to : Configure a Customer or Product workspace for offline availability using Duet Enterprise and SAP

    In the Duet Enterprise Web site, an entity workspace contains one or more lists with data from an SAP backend. For example, the Products workspace has a list of products, and the Customers workspace has lists of customers, contacts, inquiries, and quotations.

    Working with an entity workspace offline gives you more flexibility to use the workspace data. For example, if you configure the Customer workspace for offline availability, you can consolidate your business contacts from SAP with your other Outlook contacts in Outlook 2010. You can then use Outlook 2010 more effectively for corresponding with business contacts and customers.

    To make a workspace available offline, first create a downloadable package (called a BCS solution) that contains several important elements, such as lists of workspace data, task panes, and actions.

    Lists: In the Customers workspace, two lists are available for offline use: Customers and Contacts.

    Task pane: A workspace task pane is a window that provides information about another workspace element, or options for performing a task. For example, a task pane can show the details of a contact or a product.

    Actions: An action is a task (such as Open Workspace) that you can perform in relation to a workspace.

    In Outlook 2010, task panes and actions enable an interactive experience of connecting and working with the information in a workspace. By default, Duet offers a few pre-defined task panes and actions. However, if the default ones do not meet your needs, you can add customized task panes and actions. Customized task panes and actions add functionality to the offline experience in Outlook 2010. For example, you can create a task pane that lists the quotations and inquiries related to a customer, or an action to open an email template or reports for a particular contact, and so on.

    Note The Connect to Outlook option in a SharePoint 2010 site only enables you to take the list data in a workspace offline, and not task panes and actions.
    Step 1: Enable libraries and upload building block files to the site collection

    Building block files are source files for the task panes, actions, and other core files that you need to package together to enable the workspace to function offline in Outlook 2010. First, you have to upload these source files to the site collection. Later in the process, you create a package of building block files to download to Outlook 2010.

    To configure a workspace for offline use, three types of building block files are important.

    A BCS Solution Task Pane file, which is an Extensible Markup Language (XML) file. You can use the task pane file that Duet provides, or use a custom file.
    A Business Data Action file, which is also an XML file. You can use the default action file or use a custom file.
    The dynamic-link library (DLL) file that implements the task pane and action.

    Note If you are unsure of the location or the name of the file, contact your server administrator.
    Enable libraries on the site collection

    First, you have to enable the libraries on the site collection that stores these building block files. The following steps describe how to enable libraries:

    From the Site Actions menu, click Site Settings.
    Under Site Collection Administration, click Site collection features.

    Note If the Site collection features option does not appear, click the Go to top level settings option under Site Collection Administration. The Site collection features option will now appear.

    On the Features page, next to BCS Solution Galleries, click Activate.

    You are now ready to upload the source files to the site collection starting with the DLL file.
    Upload the building block files to the site collection

    Click Site Collection Administration on the breadcrumb navigation at the top of the page.
    On the Site Settings page, under BCS Solution Galleries, click Application Assemblies.
    On the All Documents page, click Add document. Then in the Upload Document window, click Browse to navigate to the DLL file, and then click OK.
    On the Site Actions menu, click Site Settings.
    On the Site Settings page, under BCS Solution Galleries, click Task panes.
    On the All Documents page, click Add document. Then in the Upload Document window, click Browse to navigate to the task pane file (for example, ContactDetails.xml), and then click OK.
    On the Site Actions menu, click Site Settings.
    On the Site Settings page, under BCS Solution Galleries, click Business data actions.
    On the All Documents page, click Add document. Then in the Upload Document window, click Browse to navigate to the action file (for example, CollabOnAction.xml), and then click OK.

    Step 2: Build the BCS solution

    A BCS solution is a package of downloadable files that enable specific functionality. In this case, the BCS solution contains a collection of files that together enable elements of a Duet workspace to function offline. It includes workspace lists, BCS Solution Task Panes, and BCS Data Actions. When you create a BCS Solution, the resulting objects are called BCS Solution Artifacts.

    To build a BCS solution, you have to follow these steps:

    Configure a setting that will make the workspace available offline in Outlook 2010.
    Add the BCS Solution Task Pane files for the workspace.
    Add the Business Data Action files for the workspace.
    Generate the BCS Solution Artifacts.

    Note If you are unsure of what files to use or where they are located, contact your server administrator.
    Configure a workspace for offline availability

    Open the Customers collaboration space.
    From the Site Actions menu, click Site Settings.
    On the Site Settings page, under Site Actions, click Manage site features.
    On the Features page, locate the BCS Solution Design Feature and click Activate.
    Click Site Settings on the breadcrumb navigation at the top of the page.
    On the Site Settings page, under Duet Enterprise Administration, click Outlook Application Designer.
    On the Outlook Application Designer page, under External Lists, click Contacts.
    On the Outlook settings page, under List Settings, click Outlook Client Settings.
    In the Edit Outlook settings for this external list dialog box, select the Offline this external list to Outlook and Auto generate Outlook forms check boxes and click OK.
    (Optional) Repeat the previous three steps to configure another workspace.

    You are ready to upload the task pane files for the workspace.
    Upload the Business Data Task Pane files for the workspace

    Note If you are unsure of what files to use or where they are located, contact your server administrator.

    On the Outlook settings page, under Task Panes, click Add from Business Data Task Panes Gallery.
    In the Add task pane page, do the following:

    In Select a task pane, select a task pane from the available task panes.
    In Display properties, enter a display name and a tool tip for the task pane. A tool tip is the text that appears when you hover over the task pane in Outlook 2010.
    In Default task pane, select Make this the default task pane if you want the selected task pane to be the default.
    Click OK.
    (Optional) Repeat the previous steps to upload additional task panes.

    Note If you do not see any task panes listed, it might mean that your site administrator has not uploaded the task pane files to the site collection. Contact your site administrator.

    You are now ready to upload the action files for the workspace.
    Upload the Business Data Action files for the workspace

    On the Outlook settings page, under Business Data actions, click Add from Business Data Actions Gallery.
    On the Add business data actions page, do the following:

    In Select a business data action, select an action from the available actions.

    Note If you do not see any actions listed, it might mean that your site administrator has not uploaded the action files to the site collection. Contact your site administrator.

    In Display properties, enter a display name (for example, Open Customer Workspace) and a tool tip for the action. The tool tip text appears when you hover over the action in Outlook 2010.
    Click OK.
    In Map business data action parameters, select a field from the dropdown menu of entity fields.
    Click OK.
    (Optional) Repeat the previous steps to add more action files.

    Generate the BCS Solution Artifacts

    The next step is to generate the BCS Solution Artifacts. A BCS Solution Artifact is a software object that collects elements of the solution into a logical unit that the system can process.

    On the Outlook settings page, click Back to the Outlook Application Designer page.
    Click Generate BCS Solution Artifacts.
    Click OK when you get a message that the operation is successfully completed.

    Step 3: Generate the BCS Solution and open in Outlook 2010

    Once you have generated the BCS Solution Artifact, you or other users with the necessary permissions can generate the BCS Solution and download the solution in Outlook 2010.

    On the Site Actions menu, click Generate BCS Solution.
    On the Generate BCS Solution page, in Certificate, select a certificate, and then click OK.

    Note If the OK button is not available, it means that the farm administrator has not uploaded the necessary certificates to the Trusted Publishers and Trusted Root Certification Authorities stores in the farm. Contact your server administrator for further assistance.

    Click OK in the Web page to confirm when the operation completes successfully.

    Duet Enterprise opens the All Documents page, where you can download the solution in Outlook 2010. On the All Documents page, you can also manage permissions for the solution; this means you can add users who can download the solution in Outlook 2010, or remove users.
    Download the solution in Outlook 2010

    On the Site Actions menu, select Download BCS Solution. Click Allow.
    On the Microsoft Office Customization Installer dialog box, click Close.

    The Contacts list now opens automatically in Outlook 2010. In Outlook 2010, you can view and update contact details, email contacts, or open a customer workspace corresponding to a contact.

    Follow these steps to work with contacts:

    In Outlook 2010, double-click a contact in the contacts list.
    On the contact details window, click the Duet Enterprise tab, and click View SAP actions. The View SAP actions option displays the task pane and action that we added in the earlier steps.

    View SAP actions

    Do one of the following:

    Click Show Details Task Pane to view the contact details task pane. The task pane displays the contact’s full name and title. By default, the task pane is docked to the contact details window, and this position might prevent you from viewing the full details. You can undock it by clicking the dropdown next to Customers, and then clicking Move.

    Customers task pane in Outlook 2010

    Click Open Customer Workspace to open the customer workspace on the Duet Enterprise Web site for the selected contact.

    Each time you open Outlook 2010, a synchronization window opens automatically to show that Outlook 2010 is synchronizing with the workspace on the Duet Enterprise Web site.

    Dynamics CRM, SharePoint, and Office – A Great Combo (Channel 9 Video)

    Office, SharePoint and CRM

    With the first release of the OBA Sample Application Kit for SAP, http://archive.msdn.microsoft.com/obasapsak, you learned the concepts around OBAs. These powerful composite applications bring Line-of-Business (LOB) data (e.g. SAP, PeopleSoft, or Dynamics) to the end-user’s fingertips within the context of the Office applications they use and know.

    Version 1.0 of the Sample Application Kit for SAP was built on Microsoft Office SharePoint Server 2007 and Microsoft Office 2007. Version 2.0 is built on SharePoint 2010 and Office 2010 using Visual Studio 2010.

    The sample application is a travel package booking application designed to allow users to work within their familiar Microsoft Office environment to access and update some LOB data in SAP.

    The sample application is composed of an Excel 2010 add-in and, on SharePoint 2010, a Silverlight user experience for interacting with LOB data via Business Connectivity Services (BCS). Each part of the sample application is developed, at least partially, with Visual Studio 2010.

    The SharePoint portion, of course, uses SharePoint-specific tools. The OBA Sample Application Kit for SAP v. 2.0 accesses data both from SAP via Web services and from SQL Server.

    These OBA resources were recommended reading for v. 1.0 and still provide an excellent conceptual foundation for v. 2.0:

    1. Six Office Business Applications (MS Press) or Programming Microsoft Office Business Applications (MS Press).
    2. Overview of an OBA: http://msdn2.microsoft.com/en-us/library/bb614538.aspx 

    3. Overview of OBA Solution Patterns: http://msdn2.microsoft.com/en-us/library/bb614541.aspx 

    So, with the above in hand, what is this OBA Sample Application Kit for SAP v. 2.0?

    In many cases, developers don’t know how to programmatically integrate Office applications with LOB systems.

    What this kit provides is guidance on how you can integrate with Web services that have been generated from within SAP and then consume those services within a .NET and managed code environment.

    The OBA Sample Application Kit for SAP v. 2.0 (one in a series of kits) includes a deep-dive technical document, an installation document, a demo walkthrough document of the end-user experience, and source code.

    The sample application kit is composed of an Excel 2010 add-in and a SharePoint 2010 site.

    The Excel 2010 add-in facilitates the process of purchasing packages, maintaining packages and associated events, and booking flights.

    The application is also used to dynamically generate PowerPoint 2010 presentations using the Open XML SDK.

    These PowerPoint 2010 presentations are shown to potential customers and display marketing information designed to entice customers to purchase packages.
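    As an aside, generating such decks with the Open XML SDK usually means opening a template presentation and filling in placeholder text. The sketch below shows one possible approach; the template path and the {PackageName}/{Price} tokens are hypothetical and not part of the kit:

    using DocumentFormat.OpenXml.Packaging;
    using A = DocumentFormat.OpenXml.Drawing;

    public static void FillMarketingDeck(string templatePath, string packageName, string price)
    {
        // Open the .pptx template for editing.
        using (PresentationDocument doc = PresentationDocument.Open(templatePath, true))
        {
            foreach (SlidePart slide in doc.PresentationPart.SlideParts)
            {
                // Replace placeholder tokens in every text run of every slide.
                foreach (A.Text text in slide.Slide.Descendants<A.Text>())
                {
                    text.Text = text.Text
                        .Replace("{PackageName}", packageName)
                        .Replace("{Price}", price);
                }
            }

            doc.PresentationPart.Presentation.Save();
        }
    }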

    The SharePoint 2010 site is a web site that enables the browsing of packages, events, flights and customer data.

    The goal of the kit is to provide developers with information on how they can learn to programmatically integrate SAP with Office and SharePoint using Visual Studio 2010.

    So click on the Downloads tab where you’ll find links to the documents and source code.

    Please note that the download items are unsupported and are intended only for instructional use.

    Additional Office and SharePoint Resources:

    1. Office Developer Training Course on MSDN: http://msdn.microsoft.com/en-us/Office2010DeveloperTrainingCourse

    2. SharePoint Developer Training Course on MSDN: http://msdn.microsoft.com/en-us/SP2010DevTrainingCourse

    3. SharePoint Sideshow for Developers on Channel 9: http://channel9.msdn.com/Shows/SharePointSideshow

    4. Video instructions for how to get and set up a turn-key VHD for Office and SharePoint development and demo: http://bit.ly/hSd8nJ

    5. Office Developer Center on MSDN: http://msdn.microsoft.com/office 

    I hope you enjoy the v. 2.0 kit, and if you’re looking for something on PeopleSoft integration, the OBA Sample Application Kit for PeopleSoft v. 2.0 is here, http://bit.ly/obapsftsak20 .