
How To : Implement Business Data Connectivity in SharePoint 2013

Business Data Connectivity

Business Connectivity Services is a centralized infrastructure in SharePoint 2013 and Office 2013 that supports integrated data solutions. With Business Connectivity Services, you can use SharePoint 2013 and Office 2013 clients as interfaces into data that doesn’t live in SharePoint 2013 itself. For example, the external data may live in a database and be accessed through the out-of-the-box Business Connectivity Services connector for that database.


Business Connectivity Services can also connect to data that is available through a web service, data that is published as an OData source, and many other types of external data. It does this through out-of-the-box or custom connectors.

External Content Types in BCS

External content types are the core of BCS. They enable you to manage and reuse the metadata and behaviors of a business entity, such as Customer or Order, from a central location. They enable users to interact with that external data and process it in a more meaningful way.

For more information about using external content types in BCS, see External content types in SharePoint 2013.

How to Connect With SQL External Data Source

Open SharePoint Designer 2013 and click the Open Site icon:

Enter the URL of the site you want to open:

Enter your site credentials here:

Now we need to create the new external content type. Here we have options to change the name of the content type and to create the connection to the external data source:

Click the hyperlink text “Click here to discover external data sources and define operations”; this window will open:

Click the “Add Connection” button to create a new connection. Here we have different data source types to choose from: .NET Type, SQL Server, or WCF Service.

Here we selected SQL Server; now we need to provide the server credentials:

Now, we can see all the tables and views from the database.

In this screen, we have the options for creating different types of operations against the database:

Click the Next button:

Parameter configuration:

Options for filter parameter configuration:

Next we need to add a new external list. Click “External List”:

Select the site here and click the OK button:

Enter the list name here and click the OK button:

After that, refresh the SharePoint site. The external list now appears; click the list:

Here we get the error message “Access denied by Business Data Connectivity.”

Solution for this Error

In SharePoint Central Administration, click Manage service applications:

Click Business Data Connectivity Service:

Set the permissions for this list:

Click OK after setting the permissions:
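
If you prefer to script this step, the same permissions can be granted from the SharePoint 2013 Management Shell. This is only a minimal sketch; the site URL, external content type name, and account are placeholders for your own values:

# Build a claims principal for the account that needs access (placeholder account).
$principal = New-SPClaimsPrincipal -Identity "DOMAIN\username" -IdentityType WindowsSamAccountName
# Look up the external content type (entity) in the BDC Metadata Store (placeholder names).
$entity = Get-SPBusinessDataCatalogMetadataObject -BdcObjectType Entity -ServiceContext "http://yoursite" -Name "YourExternalContentType"
# Grant Execute rights so the account can read data through the external list.
Grant-SPBusinessDataCatalogMetadataObject -Identity $entity -Principal $principal -Right Execute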

After that, refresh the site and hope this will work… but again, there is a problem, this time with an error like: Login failed for user “NT AUTHORITY\ANONYMOUS LOGON”.

Solution for this Error

We need to edit the connection properties and set the Authentication mode to ‘BDC Identity’.

Then follow the steps below.

Open PowerShell and type the following lines:

# Find the Business Data Connectivity service application.
$bdc = Get-SPServiceApplication | Where-Object {$_ -match "Business Data Connectivity Service"}
# Allow BDC to revert to the application pool identity when connecting to the database.
$bdc.RevertToSelfAllowed = $true
$bdc.Update()

Now it’s working fine.

There is also a chance of one more error, like:

Database Connector has throttled the response.
The response from database contains more than '2000' rows. 
The maximum number of rows that can be read through Database Connector is '2000'. 
The limit can be changed via the 'Set-SPBusinessDataCatalogThrottleConfig' cmdlet

It’s because the number of records in the table exceeds the Database Connector’s default throttle limit of 2,000 rows.

Solution for this Error

Follow these steps:

Open PowerShell and type the following lines and execute:

# Get the BDC service application proxy.
$bcs = Get-SPServiceApplicationProxy | Where-Object {$_.GetType().FullName -eq ('Microsoft.SharePoint.BusinessData.SharedService.' + 'BdcServiceApplicationProxy')}
# Read the current database items throttle...
$BCSThrottle = Get-SPBusinessDataCatalogThrottleConfig -Scope database -ThrottleType items -ServiceApplicationProxy $bcs
# ...and raise the maximum and default row limits.
Set-SPBusinessDataCatalogThrottleConfig -Identity $BCSThrottle -Maximum 1000000 -Default 20000
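
To confirm the new limits took effect, you can read the throttle configuration back with the same cmdlet used above:

# Re-query the database items throttle; it should now report the new Maximum and Default.
Get-SPBusinessDataCatalogThrottleConfig -Scope database -ThrottleType items -ServiceApplicationProxy $bcs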

SharePoint Online: Software Boundaries, Limits and Planning Guide

This article describes some important limitations that you might need to know for different SharePoint Online plans in Office 365.
For example, it provides information about number of supported users, storage quotas, and file-size limits. This article covers a range of plans:
SharePoint Online in Office 365 Small Business and in Office 365 Enterprise, plus standalone plans.
The limits that are listed are for paid subscriptions. You might see different limits for trial plans and SharePoint Online preview sites.

Note    In Office 365 plans, software boundaries and limits for SharePoint Online are managed separately from mailbox storage limits. Mailbox storage limits are set up and managed by using Exchange Online. For more information about how Exchange manages mailbox limits, see Mailbox types and storage limits for Recipients.


SharePoint Online Feature availability

Need help determining which SharePoint solution best fits your organization’s needs?

The various Office 365 plans include different SharePoint Online offerings. These include:

  • SharePoint Online for Office 365 Small Business
  • SharePoint Online for Office 365 Midsize Business
  • SharePoint Online for Office 365 Enterprise, Education, and Government

You can choose the plan that best fits your organization’s needs. Each person who accesses the SharePoint Online service must be assigned to a subscription plan. SharePoint Online can be included in a Microsoft Office 365 plan, or it can be purchased as a standalone plan, such as SharePoint Enterprise Plan 1 or SharePoint Enterprise Plan 2.

Limits in SharePoint Online in Office 365 plans


Limits for SharePoint Online for Office 365 Small Business

SharePoint Online Small Business and SharePoint Online Small Business Premium have common boundaries and limits. The following describes those limits.

  • Storage per user (contributes to total storage base of tenant): 500 megabytes (MB) per subscribed user.
  • Site collection quota limit: Up to 1 TB per site collection (25 GB for a trial). 5,000 items in site libraries, including files and folders. The minimum storage allocation per site collection is 100 MB.
  • Site collections (#) per tenant: 1 site collection per tenant.
  • Subsites: Up to 2,000 subsites per site collection.
  • Total available tenant storage: 10 GB + 500 MB per user. For example, if you have 10 users, the base storage allocation is 15 GB (10 GB + 500 MB * 10 users). You can purchase additional storage up to a maximum of 1 TB.
  • Personal site storage: 1 TB per user, as soon as provisioned. This amount is counted separately, and does not add to or subtract from the overall storage allocation for a tenant. Personal site storage applies to a user’s OneDrive for Business library and personal newsfeed. For more information, see Additional information about OneDrive for Business limits.
  • Public Website storage default: 5 GB. A SharePoint admin can allocate up to 1 TB (the limit for a site collection).
  • File upload limit: 2 GB per file.
  • File attachment size limit: 250 MB.
  • Sync limits: 20,000 items in the OneDrive for Business library, including files and folders; 5,000 items in site libraries, including files and folders.
  • Number of users: 1 – 25 users.
  • Number of external user invitees: There is no limit to the number of external users you can invite to your SharePoint Online site collections. For more information, see Manage external sharing for your SharePoint Online environment.

When reviewing the information in the previous list, remember that the base storage limits for Office 365 for Small Business (10 GB + 500 MB per subscribed user) will affect some of these values. For example, although SharePoint Online for Small Business imposes a limit of 1 TB per site collection, your particular tenant might not have enough storage available to contain a site collection of 1 TB.
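
One practical way to keep an eye on these numbers is the SharePoint Online Management Shell. The following is only a sketch, assuming the module is installed and you have tenant admin rights; the admin URL is a placeholder:

# Connect to the tenant admin site (placeholder URL).
Connect-SPOService -Url https://contoso-admin.sharepoint.com
# Show each site collection's current usage and quota (values are reported in MB).
Get-SPOSite -Detailed | Select-Object Url, StorageUsageCurrent, StorageQuota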

 

 Important    It’s a good idea to monitor the Recycle Bin and empty it regularly. Content in the Recycle Bin is counted against the storage quota for a tenant. For example, if the Recycle Bin on a site contains 5 GB of content, that 5 GB is subtracted from the available storage.

 

Limits for SharePoint Online for Office 365 Midsize Business

The following shows the software boundaries and limits for the SharePoint Online Midsize Business plan.

  • Storage per user (contributes to total storage base of tenant): 500 megabytes (MB) per subscribed user.
  • Storage base per tenant: 10 GB + 500 MB per subscribed user. For example, if you have 250 users, the base storage allocation is 135 GB (10 GB + 500 MB * 250 users). You can purchase additional storage up to a maximum of 20 TB, at a cost per GB per month. To buy storage, see Change storage space for your subscription. Important: You can’t buy additional storage for a trial subscription.
  • Site collection quota limit: Up to 1 TB per site collection (25 GB for a trial). 5,000 items in site libraries, including files and folders. SharePoint admins can set storage limits for site collections and sites. The minimum storage allocation per site collection is 100 MB.
  • Site collections (#) per tenant: 20 site collections (other than personal sites).
  • Subsites: Up to 2,000 subsites per site collection.
  • Personal site storage: 1 TB per user, as soon as provisioned. Personal site storage applies to a user’s OneDrive for Business library and personal newsfeed. This amount is counted separately, and does not add to or subtract from the overall storage allocation for a tenant. For more information about OneDrive for Business, see Additional information about OneDrive for Business limits later in this article.
  • Public Website storage default: 5 GB. A SharePoint admin can allocate up to 1 TB (the limit for a site collection).
  • File upload limit: 2 GB per file.
  • File attachment size limit: 250 MB.
  • Sync limits: 20,000 items in the OneDrive for Business library, including files and folders; 5,000 items in site libraries, including files and folders.
  • Number of users: 1 – 250 users.
  • Number of external user invitees: There is no limit to the number of external users you can invite to your SharePoint Online site collections. For more information, see Manage external sharing for your SharePoint Online environment.

When reviewing the information in the previous list, remember that the base storage limits for Office 365 for Midsize Business (10 GB + 500 MB per subscribed user) will affect some of these values. For example, although SharePoint Online for Midsize Business imposes a limit of 1 TB per site collection and a limit of 20 site collections, your particular tenant might not have enough storage available to contain 20 site collections of 1 TB each.

 Important    It’s a good idea to monitor the Recycle Bin and empty it regularly. Content in the Recycle Bin is counted against the storage quota for a tenant. For example, if the Recycle Bin on a site contains 25 GB of content, that 25 GB is subtracted from the available storage.

 

 

Limits for SharePoint Online for Office 365 Enterprise, Education, and Government

One or more Office 365 subscription plans can be included as part of your subscription. This is true for the following plan offerings:

  • Microsoft Office 365 Enterprise subscriptions (E1 – E4)
  • Microsoft Office 365 Government subscriptions (G1 – G4)
  • Microsoft Office 365 Education subscriptions (A2 – A4)
  • Microsoft Office 365 Kiosk subscriptions (K1-K2)
  • SharePoint Online stand-alone subscription plans (Plan 1 and Plan 2).

 

These plans have common boundaries and limits. The following describes those limits for the Office 365 Enterprise plans (including E1 – E4, A2-A4, G1-G4, and SharePoint Online Plan 1 and Plan 2) and for the Office 365 Kiosk plans (Enterprise and Government K1 – K2).

  • Storage per user (contributes to total storage base of tenant):
    • Enterprise plans: 500 megabytes (MB) per subscribed user.
    • Kiosk plans: Zero (0). Licensed Kiosk Workers do not add to the tenant storage base.
  • Additional storage (per GB per month; no minimum purchase):
    • Both plan types: To buy storage, see Change storage space for your subscription. Important: You can’t buy additional storage for a trial subscription.
  • Storage base per tenant:
    • Enterprise plans: 10 GB + 500 MB per subscribed user + additional storage purchased. For example, if you have 10,000 users, the base storage allocation is approximately 5 TB (10 GB + 500 MB * 10,000 users). You can purchase an unlimited amount of additional storage. Important: If you have a Government Community Cloud plan, you can purchase additional storage up to 25 TB.
    • Kiosk plans: 10 GB + additional storage purchased. You can purchase an unlimited amount of additional storage. Important: If you have a Government Community Cloud plan, you can purchase additional storage up to 25 TB.
  • Site collection storage limit:
    • Enterprise plans: Up to 1 TB per site collection (25 GB for a trial). SharePoint admins can set storage limits for site collections and sites. The minimum storage allocation per site collection is 100 MB. 5,000 items in site libraries, including files and folders. Important: If you have a Government Community Cloud plan, the limit is 100 GB.
    • Kiosk plans: Up to 1 TB per site collection (25 GB for a trial). SharePoint admins can set storage limits for site collections and sites. The minimum storage allocation per site collection is 100 MB. Important: If you have a Government Community Cloud plan, the limit is 100 GB. Kiosk workers (plans K1-K2) cannot administer SharePoint site collections; you will need a license for at least one Enterprise plan user to manage Kiosk site collections.
  • Site collections (#) per tenant:
    • Enterprise plans: 500,000 site collections (other than personal sites).
    • Kiosk plans: 500,000 site collections.
  • Subsites: Up to 2,000 subsites per site collection (both plan types).
  • Personal site storage:
    • Enterprise plans: 1 TB per user (100 GB for government plans), as soon as provisioned. Personal site storage applies to a user’s OneDrive for Business library and personal newsfeed. This amount is counted separately, and does not add to or subtract from the overall storage allocation for a tenant. For more information about OneDrive for Business, see Additional information about OneDrive for Business limits later in this article.
    • Kiosk plans: Not available.
  • Public Website storage default:
    • Enterprise plans: 5 GB. A SharePoint admin can allocate up to 1 TB (the limit for a site collection).
    • Kiosk plans: 5 GB. A SharePoint admin can allocate up to 1 TB (the limit for a site collection). Kiosk workers (plans K1-K2) cannot administer SharePoint site collections; you will need a license for at least one Enterprise plan user to manage Kiosk site collections.
  • File upload limit: 2 GB per file (both plan types).
  • File attachment size limit: 250 MB (both plan types).
  • Sync limits (both plan types): 20,000 items in the OneDrive for Business library, including files and folders; 5,000 items in site libraries, including files and folders.
  • Maximum number of users per tenant (both plan types): 1 – 500,000+. Note: If you have more than 500,000 users, please contact a Microsoft representative to discuss detailed requirements.
  • Number of external user invitees (both plan types): There is no limit to the number of external users you can invite to your SharePoint Online site collections. For more information, see Manage external sharing for your SharePoint Online environment.

When reviewing the information in the previous list, remember that the base storage limits for Office 365 Enterprise plans (10 GB + 500 MB per subscribed user) will affect some of these values. For example, although SharePoint Online for Enterprise plans imposes a limit of 1 TB per site collection and a limit of 500,000 site collections, your particular tenant might not have enough storage available to contain 500,000 site collections of 1 TB each.

 Important    It’s a good idea to monitor the Recycle Bin and empty it regularly. Content in the Recycle Bin is counted against the storage quota for a tenant. For example, if the Recycle Bin on a site contains 25 GB of content, that 25 GB is subtracted from the available storage.

 

 

Limits for site elements in SharePoint Online

There are also limits for site elements of a SharePoint Online site. Here are some examples:

  • List and Library limits    Different types of columns have different limitations. For example, you can have up to 276 columns in a list for columns that contain a single line of text.
  • Page limits    You can add up to 25 Web Parts to a single wiki or web page.
  • Security limits    Different security features have different limits. For example, a single user can belong to no more than 5,000 security groups.

 

The specific limits for these site elements are too numerous to list here, but you can learn more about them in the TechNet article Software Boundaries and Limits for SharePoint 2013. In that linked article, only the sections on List and Library Limits, Page Limits, and Security Limits apply to SharePoint Online.

 

Additional information about OneDrive for Business limits

Each user in SharePoint Online for Office 365 gets an individual storage allocation of 1 TB for personal site content (100 GB for government plans). Personal sites include the user’s OneDrive for Business library, a Recycle Bin, and personal newsfeed information.

All SharePoint Online in Office 365 plans include the same storage allocation for individual personal sites. This storage allocation is separate from the tenant allocation.

For more information about how users can manage their individual OneDrive for Business allocation, see OneDrive for Business library limits.

 

 

Additional Resources

 

For information about this, go here:

  • Office 365 connectivity limits: To learn more about Internet bandwidth, port and protocol considerations for Office 365 plans, see Office 365 Ports and Protocols.
  • SharePoint feature availability: To learn more about SharePoint feature availability and the SharePoint Online service in Office 365, see SharePoint Online Service Descriptions.
  • SharePoint Online search limits: To learn more about the search limits for SharePoint Online, see Search limits for SharePoint Online.
  • Mobile devices: To learn more about opening a SharePoint Online site from a mobile device, see Use a mobile device to work with SharePoint Online sites.
  • File types: To learn about file types that you can’t add to a list, see Types of files that cannot be added to a list or library.
  • Online URLs: To learn about SharePoint Online addresses, see SharePoint Online URLs and IP Addresses.
  • Site languages: To learn how to set the language for your sites, see Change your language and region settings.
  • Planning and deploying SharePoint Online
  • Change storage space: Important: You can’t buy additional storage for a trial subscription.

HTML5 SharePoint Pic Web Part Released and Available !!

This is a sandboxed web part control that displays a matrix of image thumbnails.

Use it to build a Metro-style UI or a picture gallery to show products, news, or a social team page that integrates with pictures, and more. All this, from any SharePoint picture library.

Supports: SharePoint 2010 & 2013 on-premises web part, SharePoint Online web part

FEATURES OF THE WEB PART (ver. 1.0)

     

PREVIEW EXAMPLE OF THE CONTROL





 

SharePoint 2013 and CRM 2011 integration. A customer portal approach

A Look At : Federated Authentication

More and more organisations are looking to collaborate with partners and customers in their ecosystem to help them achieve mutual goals. SharePoint is a great tool for enabling this collaboration but many organisations are reluctant to create and maintain identities for users from other organisations just to allow access to their own SharePoint farm. It’s hardly surprising; identity management is complex and expensive.

You have to pay for servers to host your identity provider (Microsoft Active Directory if you are using Windows); you have to keep it secure; you have to back it up and ensure that it is always available, and you have to pay for someone to maintain and administer it. Identity management becomes even more complicated when your organisation wants to give external users access to SharePoint; you have to ensure that they can only access SharePoint and can’t gain access to other systems; you have to buy additional client access licenses (CALs) for each external user because by adding them to your Active Directory you are making them an internal user.

 


Microsoft, Google and others all offer identity providers (also known as IdPs or claims providers) that are free to use, and by federating with a third party IdP you shift the ownership and management of identities on to them. You may even find that the partner or customer you are looking to collaborate with may offer their own IdP (most likely Active Directory Federation Services if they themselves run Windows). Of course, you have to trust whichever IdP you choose; they will be responsible for authenticating the user instead of you so you must be confident that they will do a good job. You must also check what pieces of information about a user (also known as claims; for example, name, email address etc) IdPs offer to ensure they can tell you enough about a user for your purposes as they don’t all offer the same.

Having introduced support for federated authentication in SharePoint 2010, Microsoft paved the way for us to federate with third party IdPs within SharePoint itself. Unfortunately, configuring SharePoint to do this is fiddly and there is no user interface for doing so (a task made more onerous if you want to federate with multiple IdPs or tweak the configuration at a later date). Fortunately Microsoft has also introduced Azure Access Control Services (ACS) which makes the process of federating with one or more IdPs simple and easy to maintain. ACS is a cloud-based service that enables you to manage the IdPs used by your applications. The following diagram illustrates, at a high-level, the components of ACS.

An ACS namespace is a container for mappings between IdPs and one or more relying parties (the applications that want to use ACS), in our case SharePoint. Associated with each mapping is a rule group which defines how the relying party handles the individual claims associated with an identity. Using rule groups you can choose to hide or expose certain claims to specific relying parties within the namespace.

So by creating an ACS namespace you are in effect creating your own unique IdP that encapsulates the configuration for federating with one or more additional IdPs. A key point to remember is that your ACS namespace can be used by other applications (relying parties) that want to share the same identities, not just SharePoint. 

Once your ACS namespace has been created you need to configure SharePoint to trust it, which most of the time will be a one off task and from that point on you can manage and maintain the IdPs you support from within ACS. The following diagram illustrates, at a high-level, the typical architecture for integrating SharePoint and ACS.

 

In the scenario above the SharePoint web application is using two different claims providers (they are referred to as claims providers in SharePoint rather than IdPs). One is for internal users and trusts an internal AD domain and another is for external users and trusts an ACS namespace.
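
Configuring that trust is done with PowerShell rather than through Central Administration. The following is a simplified sketch; the certificate path, realm, sign-in URL, and names are placeholders for the values from your own ACS namespace:

# Token-signing certificate exported from the ACS namespace (placeholder path).
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\AcsSigning.cer")
# Map the incoming e-mail address claim; we will also use it as the identifier claim.
$map = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming
# Register the ACS namespace as a trusted claims provider in SharePoint.
New-SPTrustedIdentityTokenIssuer -Name "Azure Provider" -Description "ACS namespace" -Realm "urn:sharepoint:contoso" -ImportTrustCertificate $cert -ClaimsMappings $map -SignInUrl "https://contoso.accesscontrol.windows.net/v2/wsfederation" -IdentifierClaim $map.InputClaimType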

When a user tries to access a site within the web application they will get the default SharePoint Sign In page asking them which provider they want to use.

This page can be customised and branded as required. If the user selects Windows Authentication they will get the standard authentication dialog. If they select Azure Provider (or whatever you happen to have called your claims provider) they will be redirected to your ACS Sign In page.

Again this page can be customised and branded as required. By clicking on one of the IdPs the user will be redirected to the appropriate Sign In page. Once they have been successfully authenticated by the IdP they will be redirected back to SharePoint.

 

Conclusion

By integrating SharePoint with ACS you can simplify the process of giving external users access to SharePoint. It could also save you money in licence fees and administration costs[i].

An important point to bear in mind when planning federated authentication for SharePoint is that in order for Search to be able to index content within SharePoint, you must enable Windows authentication on at least one zone within your web application. Also, if you use a reverse proxy to perform authentication, such as Microsoft Threat Management Gateway, before allowing traffic to hit your SharePoint servers, you will need to disable the authentication checks.

 

[i] The licensing model for external users differs between SharePoint 2010 and SharePoint 2013. With SharePoint 2010, if you expose your farm to external users, either anonymously or not, you have to purchase a separate licence for each server. The licence covers you for any number of external users, and you do not need to buy a CAL for each user. With SharePoint 2013, Microsoft did away with the server license for external users, and you still don’t need to buy CALs for the external users.

In Depth Look : Private Cloud Infrastructure as a Service Capabilities


 

The primary purpose of a Private Cloud Infrastructure as a Service capability is to provide well managed infrastructure services to the Platform and Software Layers. To achieve this, the Infrastructure Layer, highlighted in the Private Cloud Reference Model diagram below, includes five capabilities.


Figure 1: Private Cloud Reference Model

This document describes these Infrastructure Layer capabilities and the impact of Private Cloud Infrastructure as a Service (IaaS) patterns on their planning and design. These patterns are defined in the Private Cloud Principles, Concepts, and Patterns document and are summarized here:

  • Resource Pooling: Divides resources into partitions for management purposes.
  • Physical Fault Domain: The group of physical resources dependent on a single point of failure such as an Uninterruptible Power Supply (UPS).
  • Upgrade Domain: A group of resources upgraded as a single unit.
  • Reserve Capacity: Unallocated resources, which take over service in the event of a failed Physical Fault Domain.
  • Scale Unit: A collection of resources treated as a single unit of additional capacity.
  • Capacity Plan: A model that enables a private cloud to deliver the perception of infinite capacity.
  • Health Model: Defines how a service or system may remain healthy.
  • Service Class: Defines services delivered by Infrastructure as a Service.
  • Cost Model: The financial breakdown of a private cloud and its services.

The Health Model, Service Class, and Scale Unit patterns directly affect Infrastructure and are detailed in the relevant sections later. Conversely, private cloud infrastructure design directly affects Physical Fault Domains, Upgrade Domains, and the Cost Model. These relationships are shown in Figure 2 below.


Figure 2: Infrastructure Relationship with Patterns

Background

The private cloud principles “perception of continuous availability” and “resiliency over redundancy mindset” are designed to make a private cloud architect think differently.

Traditional solutions rely heavily on redundancy to achieve high availability and avoid failure. But redundancy at the facility (power) and infrastructure (network, server, and storage) layers is very costly. Modern cloud applications are designed with a different, holistic approach to achieving availability. This means shifting focus from building redundancy into the facility and physical infrastructure to engineering the entire solution to handle failures — eliminating them, or at least minimizing their impact.

This approach to availability relies on resilience as well as redundancy. Resilience means rapid, and ideally automatic, recovery from a failure. Redundancy is typically achieved at the application level. (A non-cloud example is Active Directory®, where redundancy is achieved by providing more domain controllers than is needed to handle the load.)

Customer interest in cost reduction will help drive adoption of this approach over the medium term. Removing power redundancy from racks or co-location rooms has a big impact on operational expenses, but this typically occurs only when the hosted application doesn’t have to be highly available, or when high availability is achieved through redundancy at the application layer – for example, Active Directory replication, or application layer mirroring such as SQL Server™ mirroring. Combining reductions in physical redundancy with virtualization results in lower capital and operational expenditure compared to a highly redundant infrastructure.

Applications that depend on a highly available infrastructure will not achieve their Service-Level Agreement (SLA) when placed on the type of infrastructure defined earlier. Customers are therefore likely to develop two environments when designing their private cloud: a standard environment with reduced facility and infrastructure redundancy, and a high-availability environment with traditional levels of redundancy.

Standard Environment vs. High-Availability Environment:

  • No power redundancy to the rack (for example, one in-rack UPS) vs. redundant power to each server
  • No network redundancy to the servers (redundant core network) vs. redundant network connections to each server
  • Local storage, possibly redundant storage and storage network, vs. redundant storage presented to each server
  • Ideally no migration, or possibly quick migration, vs. Live Migration

These two environments allow an architect to differentiate service classifications from a high-availability perspective. The standard environment is appropriate for stateless workloads; stateful workloads will require the high-availability environment. Stateful and stateless machines are managed differently. Statefulness will likely appear as a characteristic of the service classifications.

Stateless workloads (web servers, for example) are typically redundant at the server level via a load-balanced farm. These servers could easily be hosted in the Standard Environment. If all stateless workloads had an automated build, the Standard Environment could do away with any form of VM migration – and simply deploy another VM after destroying the existing one, thereby saving the cost of shared storage.
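
That destroy-and-redeploy approach is straightforward to automate. A hedged Hyper-V sketch, where the VM name and paths are examples and a real pipeline would re-image from the automated build:

# Tear down the existing stateless instance (its disks hold nothing worth keeping).
Stop-VM -Name "web01" -Force
Remove-VM -Name "web01" -Force
# Copy a fresh system disk from the golden image (placeholder paths)...
Copy-Item "D:\Images\web-template.vhdx" "D:\VMs\web01.vhdx"
# ...then stamp out the replacement on local storage and bring it online.
New-VM -Name "web01" -MemoryStartupBytes 2GB -VHDPath "D:\VMs\web01.vhdx"
Start-VM -Name "web01"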

Stateful workloads, on the other hand, require a specific management approach and impose higher costs on the consumer. Unless designed for high availability at the application level, they will require some form of redundancy in the infrastructure. Further, the High-Availability Environment requires Live Migration to enable maintenance of the underlying fabric and load balancing of the VMs.

Security

The number one concern of customers considering moving services to the cloud is security. Recent concerns expressed in industry forums are well founded, and they present reasons to think through the end-to-end scenarios and attack surfaces presented when deploying multiple services from various departments in an organization on a private cloud.

In a cloud-based platform, regardless of whether it is a private or public cloud, customers will be working in an essentially virtualized environment. The platform or software will run on top of a shared physical infrastructure managed internally or by the service provider. The security architecture used by the applications will need to move up from the infrastructure to the platform and application layers. In a private cloud, this provides security in addition to that of the perimeter network.

Public cloud involves handing over control to a third party and sharing services with unrelated business entities or even competitors, and it requires a high degree of trust in the provider’s security model and practices. In many ways the security concerns of a private cloud are similar to those of a self-hosted or outsourced datacenter; however, the move to the virtualized, self-service, service-oriented paradigm inherent in private cloud computing introduces some additional security concerns.

First is the isolation of tenants from each other and the hosting infrastructure at both the compute and network layers. Virtualization is a part of any private cloud strategy and the security of this model is totally dependent on the ability to isolate one tenant from another and prevent the careless or malicious tenant from impacting the stability of the core infrastructure upon which all tenants rely.

Another concern is Authentication, Authorization and Auditing of access to the cloud services. Self-service implies that tenant administrators can initiate management processes and workflows where previously this was accomplished through IT. Any misconfiguration or excessive permissions granted to these users can impact the stability or security of the cloud solution.

Many private cloud security concerns are also shared by the traditional datacenter environment, which is not surprising, since the private cloud is just an evolution of the traditional datacenter model. These include:

  • Impact to confidentiality, integrity or availability through exploitation of software vulnerabilities.
  • Unauthorized access due to weak or misconfigured access controls.
  • Impact to confidentiality, integrity or availability by malicious code.
  • Impact to confidentiality, integrity or availability of data.
  • Compliance with internal or industry-specific regulations and standards.

Secure Virtualization Platform

The biggest risk in running in a multi-tenant virtualized environment is that a tenant running services on the same physical infrastructure as you can break out of its isolating partition and impact the confidentiality, integrity or availability of your workload and data. Therefore the security of the virtualization platform is key to the isolation and non-interference between the individual virtual machines running on the infrastructure.

Highly Automated Management, Monitoring and Reporting

Many management tasks involve multiple steps that must be completed in the proper sequence by multiple administrators across multiple systems. Any shortcuts, omissions or errors can leave assets vulnerable to unauthorized access or affect the reliability of components within a solution. Orchestrating discrete management and monitoring tasks into workflows that require proper authorization and approval greatly diminishes the chance of mistakes that affect the security of the solution.

Authentication, Authorization and Auditing

Most organizations have a common capability that provides an overarching framework for authentication and access control. A private cloud adds new parts to secure: the hosting infrastructure itself and the virtual machine workloads that run on it. This framework must be designed, and possibly extended, to provide a single point for managing identities and credentials, authentication services, and a common security model for access to resources across the private cloud.

Multi-layer Security

Moving to a cloud-based platform requires a change in mind-set of developers and IT security professionals. Some of the risks of the public cloud are mitigated by using a private cloud architecture; however, the perimeter security protecting a private cloud should be seen as an addition to public cloud security practices, not an alternative. You cannot apply the traditional defense-in-depth security models directly to cloud computing, but you should still apply the principle of multiple layers of security. By taking a fresh look at security when you move to a cloud-based model, you should aim for a more secure system rather than simply carrying your current security levels forward.

Security Governance

Enterprise IT systems are now typically well regulated and controlled. The security risks are well documented, and therefore proper processes are put in place to develop new applications and systems, or to provision them from third-party vendors. It is very unlikely that a department manager would be able to purchase and install software without approval from the IT department.

With public cloud systems and Web browser clients however, it is possible that individual department managers could bypass the IT department and provision public cloud-based software. Indeed, they might use free cloud storage systems as a convenient means to synchronize documents without even considering that they are using public cloud services. Public cloud systems might be appealing to a manager as they could very quickly provision a new system and remove what they might see as unnecessary bureaucracy. They may even be unaware of the security and compliance policies that are in place to protect the organization. In a cloud-based landscape, we must protect corporate systems and data from these unauthorized, untested systems.

Facilities

Facilities represent the physical components – buildings, racks, power, cooling, and physical interconnects – that house or support a private cloud. It is beyond the scope of this document to provide detailed guidance on facilities, but the private cloud principles affect facility design.

The definition of a Scale Unit impacts power, cooling, space, racking, and cabling requirements. The team that defines a Scale Unit should include personnel that design and manage these aspects of the facility in addition to the procurement, Capacity Planning, and Service Delivery teams. The following lists some ramifications of Scale Unit size choices from a facilities perspective.

Small Scale Unit benefits:

  • Lower amount of physical labor needed to add a Scale Unit

Small Scale Unit trade-offs:

  • Complicates the Resource Pool, Fault Domain, and Reserve Capacity equation
  • Inefficient
  • Stranded power (un-utilized power)
  • Un-utilized space

Large Scale Unit benefits:

  • Allocation of full facilities units (for example, UPS, Rack, and Co-location Room) is easy to cost and engineer
  • Reduces under-utilization of power, cooling, and space

Large Scale Unit trade-offs:

  • Higher amount of labor to commission

Knowing how much power, cooling, and space each Scale Unit will consume enables the facilities team to perform effective Capacity Planning and the engineering team to effectively plan resources.

Compute, Network, and Storage Fabric

The term Fabric defines a collection of interconnected compute, network, and storage resources.

The concept of homogenous physical infrastructure, introduced in the Private Cloud Principles, Concepts, and Patterns guide, stipulates that all servers in a Resource Pool should be identical. Homogenizing the compute, storage, and network components in servers allows for predictable scale and performance. In other words, every server in a Resource Pool should have the same processor characteristics such as family (Intel/AMD), number of cores/CPUs, and generation (Xeon 2.6 Gigahertz (GHz)). The homogenized compute concept also stipulates that each server have the same amount of Random Access Memory (RAM) and the same number of connections to Resource Pool storage and networks. With these specifications met, any virtualized service could relocate from one failing or failed physical server to another physical server and continue to function identically.
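
Checking that the servers in a Resource Pool really are homogeneous is easy to script. A minimal sketch using WMI; the host names are examples:

# Hosts expected to form one homogeneous Resource Pool (example names).
$servers = "HOST01", "HOST02", "HOST03"
foreach ($server in $servers) {
    $cpu = Get-WmiObject -Class Win32_Processor -ComputerName $server | Select-Object -First 1
    $sys = Get-WmiObject -Class Win32_ComputerSystem -ComputerName $server
    # Any difference in CPU model, core count, or RAM across hosts breaks homogeneity.
    [PSCustomObject]@{
        Host  = $server
        CPU   = $cpu.Name
        Cores = $cpu.NumberOfCores
        RAMGB = [math]::Round($sys.TotalPhysicalMemory / 1GB)
    }
}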

Physical Server

The physical server hosts the hypervisor and provides access to the network and shared storage. In the Standard Environment, the facilities do not provide power redundancy, so the servers do not require dual power supplies.

Every server will be a member of a single compute Resource Pool and a single Physical Fault Domain. Assuming all servers are homogeneous (as recommended), they will all be members of a single Upgrade Domain.

Capacity Planning must be done for each server specification, as its size (CPU and RAM specification) will determine how many virtual machines it is able to host. This is covered in greater detail in the Private Cloud Planning Guide for Service Delivery.

Server specification selection impacts the Scale Unit, Cost Model, and service class. Scale Units have a finite amount of power and cooling, so server efficiency has an impact on a private cloud. It may be that all power in a Scale Unit is consumed before all physical space. The cost of servers impacts the Cost Model irrespective of whether this cost is passed onto the consumer. Selecting only small one-unit servers will limit the architect’s ability to define a range of service classifications. The server needs to accommodate the largest service classification after the parent partition and hypervisor consume their resources.

Microsoft research shows servers with processors one or two models behind the latest versions offer a better price, performance, and power consumption ratio than the newer processors.

The Private Cloud Reference Architecture dictates that the “concept of homogenization of physical infrastructure” be adopted for each Resource Pool. Server specifications (CPU, RAM) may vary between Resource Pools, but this complicates Fabric Management (defined in the Private Cloud Planning Guide for Systems Management), which spans Resource Pools and Capacity Planning, and may necessitate different service classes for each pool.

Delivering IaaS requires that the service is pre-defined and delivered consistently. To achieve consistent performance, the VMs must have equal resources available to them from each server, in other words, the same CPU cycles and RAM. If servers within a Resource Pool do not provide homogeneous performance and RAM, consistent performance cannot be guaranteed.

Absolute homogenization may be hard to maintain over the long term as server models may be discontinued by the vendor; therefore relationships between Resource Pools, Scale Units, and server model longevity must be considered carefully.

The following lays out some of the benefits and trade-offs of homogeneous and heterogeneous Resource Pools.

Homogeneous Physical Infrastructure benefits:

  • Predictable performance within a Resource Pool
  • Guaranteed Live Migration across the fabric

Homogeneous Physical Infrastructure trade-offs:

  • Reuse of existing equipment may not be possible

Heterogeneous Physical Infrastructure benefits:

  • Possible reuse of existing equipment
  • Allows for a broader range of server classes

Heterogeneous Physical Infrastructure trade-offs:

  • VMs cannot be moved between Resource Pools
  • More upfront work to make sure Live Migration will work appropriately

In addition, servers should support the following requirements to achieve an automated infrastructure and resiliency:

Automated Infrastructure

  • Wake On Local Area Network
  • Remote BIOS Upgrades/Configurations
  • Boot from Flash
  • Pre-Boot Execution Environment (PXE) for remote imaging
  • Virtualization Support
    • Data Execution Prevention
    • 64 bit CPUs
  • Standard Environment: 2 Network adapters that support TCP offload (TOE)
    • Management x 1
    • Consumer x 1
  • High-Availability Environment: 4 or 6 redundant network adapters that support TOE
    • Management x 2: Could be teamed for redundancy
    • Live Migration x 2: Could be teamed for redundancy
    • Consumer x 2: Could be teamed for resiliency
  • Standard Environment: Storage connections that meet the required service classification
    • For Internet Small Computer System Interface (iSCSI), 1 x hardware iSCSI initiator: Could use vendor-specific software to achieve resiliency
    • For Fiber Channel, 1 x Fiber Channel host bus adapter (HBA): Could use vendor-specific software to achieve resiliency
  • High-Availability Environment: Redundant storage connections that meet the required service classification
    • For iSCSI, 2 x Hardware iSCSI initiators: Could use vendor-specific software to achieve resiliency
    • For Fiber Channel, 2x Fiber Channel HBAs: Could use vendor-specific software to achieve resiliency

To dynamically initiate remediation events in case of failure or impending failure of server components, each server is required to display warnings, errors, and state information for the following (a small collection sketch follows the list):

  • CPU
    • State (Busy/Ready)
    • Utilization
    • Heat
    • Fans
  • RAM
    • Utilization
    • Error-Correcting Code (ECC) Errors
  • Storage
    • Read/Write Failures
    • Predictive Failures
  • Network Interface Cards (NICs)
    • Port State
    • Send/Receive Errors
  • Motherboard
    • Server Post Errors
  • Power Supply
    • State
    • Active / Passive
    • Power Output Variations
  • Fans
    • Speed
    • State
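
How these signals are surfaced is largely vendor-specific (typically via the board management controller), but some are reachable from standard OS interfaces. A hedged sketch that reads two of them through WMI, assuming the storage driver exposes SMART data:

# Predictive disk failure (SMART) status, where the storage driver exposes it.
Get-WmiObject -Namespace root\wmi -Class MSStorageDriver_FailurePredictStatus | Select-Object InstanceName, PredictFailure
# Current total CPU utilization from the OS performance counters.
Get-WmiObject -Class Win32_PerfFormattedData_PerfOS_Processor -Filter "Name='_Total'" | Select-Object PercentProcessorTime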

Storage

To achieve the perception of infinite capacity, proactive Capacity Management must be performed, and storage capacity added ahead of demand. The amount of storage added as a single unit (a Storage Scale Unit) will depend on the rate of storage consumption, hardware vendor lead time, and the level of risk the business wishes to assume (that is, weighing remaining unallocated capacity against the possibility of exhausting all capacity). This is detailed in Private Cloud Planning Guide for Service Delivery.

Storage will be placed in Storage Resource Pools, from which it is automatically allocated to consumers. Though Resource Pools are not a new concept for Storage Area Networks (SANs), allowing the infrastructure to allocate storage on-demand based on policy may be a new approach for many organizations. Further, the SAN must present an application programming interface (API) to Fabric Management to allow automation of allocation and provisioning.

The storage provided within a private cloud must be consistent in performance and availability. This means the Input/output (I/O) Operations per Second (IOPS) cannot vary significantly. If there is a need to make different levels of storage performance available to users of a private cloud, it can be accomplished through multiple service classifications. A private cloud is intended, however, to provide a limited set of standardized services; therefore, variances should be carefully considered.

The cost of providing the storage within a private cloud should be clearly defined. This permits metering, and possibly allocation of costs to consumers. If different classes of storage are provided for different levels of performance, their costs should be differentiated. For example, if a SAN is being used in an environment, it is possible to have storage tiers where faster Solid State Drives (SSDs) are used for more critical workloads. Less-critical workloads can be placed on Tier 2 Serial Attached SCSI (SAS) drives, and even less-critical workloads on Tier 3 SATA drives.

The Private Cloud Reference Architecture assumes the storage arrays and the storage network are redundant, with no single point of failure beyond the array itself. In this regard, the storage array can be considered a Fault Domain.

The design should adopt some form of de-duplication technology to reduce storage consumption.

As the storage array is a single point of failure, it should display health information to the systems monitoring service to make sure that any outages and their impact are quickly identified. Providing snapshots and mirroring between arrays for continuity is beyond the scope of this guidance.

Physical Storage Switches

If an architect follows the recommendation to allow any VM to execute on any server in a Resource Pool, Virtual Hard Disks (VHDs) should reside on a SAN. While it is possible to host VHDs locally, the guidance assumes that they are hosted on a SAN.

A key decision in private cloud design is whether to use iSCSI or Fiber Channel for storage. If iSCSI is utilized to house virtual workload storage, it is suggested that each virtualization host include iSCSI HBAs instead of standard NICs for performance reasons.

The purpose of a storage switch is to provide resilient and flexible connectivity between shared storage and physical servers. The storage switch must meet peak storage I/O requirements for the virtual services. In addition, the interconnect speeds between switches should be evaluated to determine the maximum throughput for switch-to-switch communications. This may limit the maximum number of hosts that can be placed on each switch.

While switch throughput is important, attention should also be paid to the number of available switch ports needed to support the physical virtualization hosts. Refer to the switch hardware vendor to make sure it meets these requirements.

Physical storage switch requirements include:

  • Dedicated switch port on each switch for each host and storage processor connection. This is needed for redundancy and I/O optimization.
  • iSCSI traffic separated from all other IP traffic, preferably on its own switched infrastructure or logically through a virtual local area network (VLAN) on a shared IP switch. This segregates data access from traditional network communications for host-to-host and workload operations and provides data security.
  • Redundant power supplies and cooling fans increase the number of faults the storage switch can withstand.
  • Programmatic interface to support firmware upgrades and configuration.

Physical Storage Subsystem

Stateless workloads can be hosted on Direct-Attached Storage (DAS) instead of SAN, driving down the cost of service. The downside is that Fabric Management has to handle transitioning active user connections between VMs homed on different hosts, as VM migration is impossible. This may mean tighter integration with the network than is specified in this document (in order to know when all connections to a VM have been abandoned or terminated before stopping the VM, for example).

SAN storage, while more expensive, provides advantages:

  • The VM can be re-homed to other servers.
  • Live Migration can be employed.
  • Backup (of the VM) can occur out of band (for example, taking snapshots).
  • Capacity can be increased almost limitlessly.

The logical storage configuration (or storage classification) should be designed to meet requirements in the following areas:

  • Capacity: To provide the required storage space for the virtual service data and backups.
  • Performance Delivery: To support the required number of IOPS and throughput.
  • Fault Tolerance: To provide the desired level of protection against hardware failures. If a SAN is used, this may include redundant HBA and switches.
  • Manageability: To provide a high degree of platform self-management. This requires a programmatic interface to provide automated configuration and firmware upgrades.

Additionally, a private cloud must meet the following requirements to make sure that it is highly available and well-managed:

  • Multiple paths to the disk array for redundancy. Should a disk fail, hot or warm spare disks can provide resiliency in the provisioned storage. Consult the storage vendor for specific recommendations.
  • A storage system with automatic data recovery, to allow an automatic background process to rebuild data onto a spare or replacement disk drive when another disk drive in the array fails.
  • Redundant power supplies and cooling fans, to increase the number of faults the Storage Array can withstand.

Network

The Private Cloud Reference Architecture assumes that the network presented to servers is not redundant for the Standard Environment and is redundant for the High-Availability Environment.

The network is tightly coupled with physical servers. Each Compute Resource Pool includes the network switches necessary for the servers to operate; each Scale Unit includes a pre-defined and fixed number of servers and switches.

The switches must be monitored to make sure no workloads saturate the network. A private cloud is designed as a general-purpose infrastructure. Workloads that challenge the network with high utilization may not be good candidates for a private cloud unless separate Resource Pools are created specifically to handle these workloads.

Switches are members of network upgrade domains, but the definition and membership of upgrade domains will likely vary depending on the nature of the upgrade. If switches are not redundant (for example, in the Standard Environment), the whole Resource Pool will need to be taken offline for switch maintenance, which requires switch reboots.

Network hardware (switches and load balancers) must expose an API to Fabric Management that enables automated management of networks, such as creation of VLANs, Virtual IP addresses (VIPs), and addition or removal of hosts from a VIP.

Physical Network

Some key decisions that should be made about the physical networks relate to the use of Live Migration, the requirements of port security, and the need for link aggregation. The following shows the benefits and trade-offs of using Live Migration:

Use Live Migration benefits:

  • Transparent movement of stateful applications
  • Transparent infrastructure upgrades

Use Live Migration trade-offs:

  • Additional network switch ports will be required
  • More network adapters are required per virtualization host
  • Greater Reserve Capacity may be required because of cluster size limitations of 16 nodes

Do Not Use Live Migration benefits:

  • Fewer switch ports are required
  • Fewer network adapters are required per virtualization host
  • Ideal for stateless applications

Do Not Use Live Migration trade-offs:

  • No transparent movement of stateful applications
  • For stateful applications, infrastructure upgrades will need to be coordinated with VM owners

To support the dynamic characteristics of a private cloud, a network switch should support a remote programmatic interface – for firmware upgrades, and prioritization of traffic for quality of service. These switches should be dedicated for a private cloud to maintain predictable performance and to minimize risks associated with human interaction. As defined earlier, the servers need to be connected to at least two networks, management and consumer, with live migration (if required). The connections should always be the same; for example, network adapter 1 to management, network adapter 2 to consumer, and network adapter 3 to Live Migration.

If iSCSI is chosen for the storage interconnects, iSCSI traffic should reside in an isolated VLAN in order to maintain security and performance levels. This iSCSI traffic should not share a network adapter with other traffic, for example the management or consumer network traffic.

The interconnect speeds between switches should be evaluated to determine the maximum bandwidth for communications. This could affect the maximum number of hosts which can be placed on each switch.

When designing network connectivity for a well-managed infrastructure, the virtualization hosts should have the following specific networking requirements (a VLAN-tagging sketch follows the list):

  • Support for 802.1Q VLAN Tagging: To provide network segmentation for the virtualization hosts, supporting management infrastructure and workloads. This is the preferred method to help secure and isolate data traffic for a private cloud.
  • Remote Out-of-band Management Capability: To monitor and manage servers remotely over the network regardless of whether the server is turned on or off.
  • Support for PXE Version 2 or Later: To facilitate automated server provisioning.
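
As an illustration of the first requirement, once the physical switch ports are trunked, 802.1Q tags can be applied per VM at the virtual switch. A minimal Hyper-V sketch; the VM name and VLAN ID are examples:

# Tag all traffic from this VM's network adapter with VLAN 10.
Set-VMNetworkAdapterVlan -VMName "web01" -Access -VlanId 10
# Confirm the VLAN mode that was applied.
Get-VMNetworkAdapterVlan -VMName "web01"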

To dynamically initiate remediation events in response to the failure or impending failure of network switch components, each switch is required to display warnings, errors, and state information for the following:

  • CPU
    • Utilization
    • Temperature
  • Flash Memory
    • Utilization
  • Interface Details
    • Port State
    • Port Errors
    • Bandwidth Utilization
  • Power Supply
    • State
    • Active / Passive
    • Power Output Variations
  • Fans
    • Speed
    • State

Storage Switch/Subsystem Health Model

To dynamically initiate remediation events in response to either the failure or impending failure of storage switches and storage subsystem components, each component is required to display warnings, errors, and state information for the following:

Storage Switch

  • CPU
    • Utilization
    • Temperature
  • Flash Memory
    • Utilization
  • Interface Details
    • Port State
    • Port Errors
    • Bandwidth Utilization
  • Power Supply
    • State
    • Active / Passive
    • Power Output Variations
  • Fans
    • Speed
    • State

Storage Subsystem

  • CPU
    • Utilization
    • Temperature
  • Flash Memory
    • Utilization
  • Service Processor
    • State
    • Errors
    • IOPS
  • Disks
    • Read / Write Failures
    • Predictive Failures
  • Power Supply
    • State
    • Active / Passive
    • Power Output Variations
  • Fans
    • Speed
    • State

Hypervisor

The hypervisor exposes the VM services to consumers. It needs to be configured identically on all hosts in a Resource Pool, and ideally all hosts in the private cloud. Fabric Management will orchestrate the addition of virtual switches, machines, and disks.

An architect needs to decide whether the private cloud should use CPU Resource Reservations to ensure predictable performance of VMs. The benefits and trade-offs are summarized below:

Use CPU Resource Reservations

  Benefits:
  • Consistent VM performance for consumers

  Trade-offs:
  • A fixed number of VMs per host might lead to low utilization of resources
  • The toolset may not be able to set resource reservations

Do Not Use CPU Resource Reservations

  Benefits:
  • A variable number of VMs per host means resource utilization can be maximized

  Trade-offs:
  • Consumers do not experience consistent VM performance
  • One VM can adversely affect the processing performance of others

The decision is driven by whether efficiency or consistency is more important for the private cloud.

The architect could elect to provide different classes of services – one which uses resource reservations to deliver predictability, and another which shares the resources. Separate Resource Pools could be deployed accordingly, along with differential pricing to incent the consumers to exhibit desired behavior.
Resource reservations will not prevent a host from saturating the network and crippling the performance of other hosts. As stated in the Network section earlier, this needs monitoring.

Parent Partition

The parent partition provides the hypervisor with access to physical resources such as network and storage. It also hosts the hypervisor management interfaces. The parent partition needs to be configured identically on all servers in a Resource Pool.

If an architect elects to create a service classification which depends on consuming LUNs directly (not via the parent partition), the parent partition must be configured to present the pass-through for this storage. Further, this storage must be available to all parent partitions in that Resource Pool to enable VM portability between hosts.
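
As a rough illustration, presenting a physical disk to a VM as a pass-through disk with the Hyper-V PowerShell module might look like the following sketch; the VM name and disk number are placeholders, and the exact steps depend on your toolset:

# The disk must be offline in the parent partition before it can be passed through.
Set-Disk -Number 2 -IsOffline $true
# Attach the physical disk directly to the VM, bypassing a virtual hard disk file.
Add-VMHardDiskDrive -VMName 'ConsumerVM' -ControllerType SCSI -DiskNumber 2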

The parent partition displays health information for the server, the parent partition operating system, and the hypervisor. The health monitoring system, in turn, consumes this information to enable Capacity Management and Fabric Management.

Management Layers

Task Execution

Task execution covers the low-level management operations that can be performed on a platform; these are generally surfaced through the command line or an Application Programming Interface (API). The capability to execute tasks must not only exist, but its usage semantics should be consistent across members of a fault domain so that automation can use a common format. When differences in semantics exist, the automation layer must compensate for them through custom code in the orchestration layer, or even by using different execution hosts or engines within a fault domain.

Automation

The automation layer is made up of the foundational automation technology plus a series of single purpose commands and scripts that perform operations such as starting or stopping a virtual machine, restarting a server, or applying a software update. These atomic units of automation are combined and executed by higher-level management systems. The modularity of this layered approach dramatically simplifies development, debugging, and maintenance.
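
For example, one such atomic unit of automation might be a single-purpose script like the following sketch (assuming the Hyper-V PowerShell module; the parameter names are illustrative):

param(
    [Parameter(Mandatory=$true)][string]$VMName,
    [Parameter(Mandatory=$true)][string]$HostName
)

# Restart a single virtual machine on a given host - one atomic operation
# that higher-level orchestration can combine with others.
Stop-VM -Name $VMName -ComputerName $HostName -Force
Start-VM -Name $VMName -ComputerName $HostName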

Orchestration

In much the same way that an enterprise resource planning (ERP) system manages a business process such as order fulfillment and handles exceptions such as inventory shortages, the orchestration layer provides an engine for IT-process automation and workflow. The orchestration layer is the critical interface between the IT organization and its infrastructure and transforms intent into workflow and automation.

Ideally, the orchestration layer provides a graphical user interface in which complex workflows that consist of events and activities across multiple management-system components can be combined to form an end-to-end IT business process, such as automated patch management or automatic power management. The orchestration layer must provide the ability to design, test, implement, and monitor these IT workflows.

Service Management

Service management provides the means for automating and adapting IT service management best practices, such as those found in the IT Infrastructure Library (ITIL), to provide built-in processes for incident resolution, problem resolution, and change control.

Self Service

Self-service capability is a characteristic of private cloud computing and must be present in any implementation. The intent is that users can approach a self-service interface and be presented with the options available for provisioning in the organization. The capability may be basic, offering only the provisioning of a virtual machine with a pre-defined configuration; or it may be more advanced, allowing configuration options on top of the base configuration and leading up to a full platform capability or service.

Self-service capability is a critical business driver that enables members of an organization to respond to business needs more quickly with IT capabilities, in a manner that aligns and conforms with internal business IT requirements and governance.

This means the interface between IT and the business is abstracted to a simple, well-defined, and approved set of service options that are presented as a menu in a portal or available from the command line. The business selects services from the catalog, starts the provisioning process, and is notified upon completion; the business is then charged only for what it actually uses.

This is analogous to the capability available on Public Cloud platforms.

The entities that consume self-service capabilities in an organization are individual business units, project teams, or any other department in the organization that has a need to provision IT resources. These entities are referred to as tenants. In a private cloud, tenants are granted the ability to provision compute and storage resources as they need them to run their workloads. Connectivity to these resources is managed behind the scenes by the fabric management layers of the private cloud.

Tenant administrators are granted access to a self-service portal where they can initiate workflows to provision virtualized services in the appropriate configuration and capacity. For example, compute resources may be available in small, medium, or large instance capacities, along with storage of the appropriate size and performance characteristics. Resources are provisioned without any intervention from infrastructure personnel in IT, and the overall progress is tracked and reported by the fabric management layer through the portal.

A chargeback model defines how tenants are charged for using the cloud resources. Typically this is the number and size of the resources provisioned, multiplied by the amount of time they are provisioned for. This information is available to tenant administrators through the self-service portal, which can also provide cost reporting.
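
As a purely hypothetical sketch of such a model (the rates and sizes below are invented for illustration):

# Chargeback = unit rate for the instance size x number of instances x hours provisioned.
$ratePerHour = @{ Small = 0.05; Medium = 0.10; Large = 0.20 }   # example rates only
$instanceSize     = 'Medium'
$instanceCount    = 4
$hoursProvisioned = 720   # roughly one month

$charge = $ratePerHour[$instanceSize] * $instanceCount * $hoursProvisioned
"Tenant charge for the period: {0:C}" -f $charge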

Tenants are granted the ability to manage, monitor and report on the resources that they have provisioned.

DRY Architecture, Layered Architecture, Domain Driven Design and a Framework to build great Single-Page Web Applications – Boilerplate Part 1

DRY – Don’t Repeat Yourself! – is one of the core principles a good developer follows when building software. We try to apply it everywhere, from simple methods up to classes and modules. What about developing a new web-based application? We, software developers, have similar needs when developing enterprise web applications.

Enterprise web applications need login pages, user/role management infrastructure, user/application setting management, localization, and so on. A high-quality, large-scale application also implements best practices such as Layered Architecture, Domain Driven Design (DDD), and Dependency Injection (DI). And we use tools for Object-Relational Mapping (ORM), database migrations, logging… etc. When it comes to the User Interface (UI), it’s not much different.

Starting a new enterprise web application is hard work. Since all applications need some common tasks, we keep repeating ourselves. Many companies develop their own application frameworks or libraries for such common tasks so they do not re-develop the same things; others copy parts of existing applications to prepare a starting point for the new one. The first approach is pretty good if your company is big enough and has time to develop such a framework.

As a software architect, I also developed such a framework in my company. But one point still bothers me: many companies repeat the same tasks. What if we could share more and repeat less? What if the DRY principle were implemented universally instead of per project or per company? It sounds utopian, but I think there may be a starting point for that!

What is ASP.NET Boilerplate?

http://www.aspnetboilerplate.com/

ASP.NET Boilerplate [1] is a starting point for new modern web applications using best practices and the most popular tools. It aims to be a solid model, a general-purpose application framework, and a project template. What does it do?

  • Server side
    • Based on latest ASP.NET MVC and Web API.
    • Implements Domain Driven Design (Entities, Repositories, Domain Services, Application Services, DTOs, Unit of Work… and so on)
    • Implements Layered Architecture (Domain, Application, Presentation and Infrastructure Layers).
    • Provides an infrastructure to develop reusable and composable modules for large projects.
    • Uses the most popular frameworks/libraries that you’re (probably) already using.
    • Provides an infrastructure and makes it easy to use Dependency Injection (uses Castle Windsor as the DI container).
    • Provides a strict model and base classes to use Object-Relational Mapping easily (uses NHibernate, can work with many DBMSs).
    • Implements database migrations (uses FluentMigrator).
    • Includes a simple and flexible localization system.
    • Includes an EventBus for server-side global domain events.
    • Manages exception handling and validation.
    • Creates dynamic Web API layer for application services.
    • Provides base and helper classes to implement some common tasks.
    • Uses convention over configuration principle.
  • Client side
    • Provides two project templates: one for Single-Page Applications using Durandaljs, the other for a Multi-Page Application. Both templates use Twitter Bootstrap.
    • Most used libraries are included by default: Knockout.js, Require.js, jQuery and some useful plug-ins.
    • Creates dynamic javascript proxies to call application services (using dynamic Web API layer) easily.
    • Includes unique APIs for some common tasks: showing alerts & notifications, blocking the UI, making AJAX requests.

Besides this common infrastructure, a “Core Module” is being developed. It will provide a role- and permission-based authorization system (implementing the ASP.NET Identity Framework), a settings system, and so on.

What ASP.NET Boilerplate is not?

ASP.NET Boilerplate provides an application development model based on best practices. It has base classes, interfaces, and tools that make it easy to build maintainable, large-scale applications. But…

  • It’s not a RAD (Rapid Application Development) tool of the kind that provides infrastructure for building applications without coding. Instead, it provides an infrastructure for coding with best practices.
  • It’s not a code generation tool. While it has several features that build dynamic code at run time, it does not generate code.
  • It’s not an all-in-one framework. Instead, it uses well-known tools/libraries for specific tasks (like NHibernate for O/RM, Log4Net for logging, Castle Windsor as the DI container).

Getting started

In this article, I’ll show how to develop a Single-Page and Responsive Web Application using ASP.NET Boilerplate (I’ll call it ABP from now on). The sample application is named “Simple Task System” and consists of two pages: one listing tasks, the other for adding new tasks. A task can be related to a person and can be marked as completed. The application is localized in two languages. A screenshot of the Task List page is shown below:

A screenshot of 'Simple Task System'

Creating empty web application from template

ABP provides two templates to start a new project (you can also create your project manually and get the ABP packages from NuGet, but the template way is much easier). Go to www.aspnetboilerplate.com/Templates to create your application from one of the two templates (one for SPA (Single-Page Application) projects, one for MPA (classic, Multi-Page Application) projects):

Creating template from ABP web site

I named my project SimpleTaskSystem and created a SPA project. The site downloaded the project as a zip file. When I open the zip file, I see a ready-made solution that contains assemblies (projects) for each layer of Domain Driven Design:

Project files

The created project targets .NET Framework 4.5.1, and I advise opening it with Visual Studio 2013. The only prerequisite for running the project is to create a database. The SPA template assumes that you’re using SQL Server 2008 or later, but you can easily change it to another DBMS.

See the connection string in the web.config file of the web project:

<add name="MainDb" connectionString="Server=localhost; Database=SimpleTaskSystemDb; Trusted_Connection=True;" />

You can change the connection string here. I don’t change the database name, so I’m creating an empty database, named SimpleTaskSystemDb, in SQL Server:

Empty database

That’s it, your project is ready to run! Open it in VS2013 and press F5:

First run

The template consists of two pages: one Home page and one About page. It’s localized in English and Turkish. And it’s a Single-Page Application! Try to navigate between pages: you’ll see that only the content changes, the navigation menu is fixed, and all scripts and styles are loaded only once. It’s also responsive; try changing the size of the browser.

Now, I’ll show how to turn this template into the Simple Task System application, layer by layer, in the coming Part 2.

Introduction to Cloud Automation


Provision Azure Environment Resources


This is where we can see proof of evolution.

As you saw in the bulleted list of chronological blog posts (above), my first venture into automating the public cloud leveraged Orchestrator + the Integration Pack for Windows Azure. My second release leveraged PowerShell and PowerShell Workflow + the Windows Azure Cmdlets.

Let’s get down to the goods. And actually, for the first time in a long time, my published example came out a couple days before the blog post / teaser!


Script Center Contribution and Download

The download is the example: New-AzureEnvironmentResources.ps1

Here is a brief description:

This runbook creates a number of Azure Environment Resources (in sequence): Azure Affinity Group, Azure Cloud Service, Azure Storage Account, Azure Storage Container, Azure VM Image, and Azure VM. It also requires the Upload of a VHD to a specified storage container mid-process.

A detailed Description, full set of Requirements, and the actual Runbook Contents are available within the Script Center Contribution (not to mention the actual download).

The Provision Azure Environment Resources example runbook can be downloaded from the Script Center contribution.



A bit more about the Requirements…

Runbook Parameters

  • Azure Connection Name

    REQUIRED. Name of the Azure connection setting that was created in the Automation service.
        This connection setting contains the subscription id and the name of the certificate setting that
        holds the management certificate. It will be passed to the required and nested Connect-Azure runbook.

  • Project Name

    REQUIRED. Name of the Project for the deployment of Azure Environment Resources. This name is leveraged
        throughout the runbook to derive the names of the Azure Environment Resources created.

  • VM Name

    REQUIRED. Name of the Virtual Machine to be created as part of the Project.

  • VM Instance Size

    REQUIRED. Specifies the size of the instance. Supported values, with their (cores, memory), are:
        "ExtraSmall" (shared core, 768 MB),
        "Small"      (1 core, 1.75 GB),
        "Medium"     (2 cores, 3.5 GB),
        "Large"      (4 cores, 7 GB),
        "ExtraLarge" (8 cores, 14 GB),
        "A5"         (2 cores, 14 GB),
        "A6"         (4 cores, 28 GB),
        "A7"         (8 cores, 56 GB)

  • Storage Account Name

    OPTIONAL. This parameter should only be set if the runbook is being re-executed after an existing
    and unique Storage Account Name has already been created, or if a new and unique Storage Account Name
    is desired. If left blank, a new and unique Storage Account Name is created for the Project. The
    format of the derived Storage Account Names (see the sketch after this list) is:
        $ProjectName (lowercase) + [random lowercase letters and numbers] up to a total length of 23
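
Here is a minimal PowerShell sketch of that derivation (an illustration only; the actual runbook logic may differ in detail):

$ProjectName = 'SimpleProject'                  # example value
$base        = $ProjectName.ToLower()
$padLength   = 23 - $base.Length                # pad to a total length of 23
$chars       = [char[]]'abcdefghijklmnopqrstuvwxyz0123456789'
$suffix      = -join (1..$padLength | ForEach-Object { $chars | Get-Random })
$StorageAccountName = $base + $suffix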


Other Requirements

  • An existing connection to an Azure subscription

  • The Upload of a VHD to a specified storage container mid-process. At this point in the process, the runbook will intentionally suspend and notify the user; after the upload, the user simply resumes the runbook and the rest of the creation process continues.

  • Six (6) Automation Assets (to be configured in the Assets tab). These are suggested, but not necessarily required. Replacing the "Get-AutomationVariable" calls within this runbook with static or parameter variables is an alternative method. For this example though, the following dependencies exist:
        VARIABLES SET WITH AUTOMATION ASSETS:
             $AGLocation = Get-AutomationVariable -Name 'AGLocation'
             $GenericStorageContainerName = Get-AutomationVariable -Name 'GenericStorageContainer'
             $SourceDiskFileExt = Get-AutomationVariable -Name 'SourceDiskFileExt'
             $VMImageOS = Get-AutomationVariable -Name 'VMImageOS'
             $AdminUsername = Get-AutomationVariable -Name 'AdminUsername'
             $Password = Get-AutomationVariable -Name 'Password'

Note     The entire runbook is heavily checkpointed and can be run multiple times without resource recreation.
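
For reference, an Automation variable asset like the ones above can also be created from PowerShell rather than the portal. Here is a sketch using the service-management era Azure Automation cmdlets; the account name is a placeholder:

New-AzureAutomationVariable -AutomationAccountName 'MyAutomationAccount' `
    -Name 'AGLocation' -Value 'East US' -Encrypted $false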


Upload of a VHD

Waaaaait a minute! That seems like a pretty big step, how am I going to accomplish that?

I am so glad you asked.

To make this easier (for all of us), I created a separate PowerShell Workflow Script to take care of this step. In fact, it is the same one I used during the creation and testing of New-AzureEnvironmentResources.ps1.

Here it is (the contents of a file I called Upload-LocalVHDtoAzure.ps1):

param
(
    [Parameter(Mandatory=$true)]
    [string]$AzureSubscriptionName,
    [Parameter(Mandatory=$true)]
    [string]$ProjectName,
    [Parameter(Mandatory=$true)]
    [string]$StorageAccountName
)

workflow Upload-LocalVHDtoAzure { 

    param 
    ( 
        [string]$StorageContainerName, 
        [string]$VHDName, 
        [string]$SourceVHDPath, 
        [string]$DestinationBlobURI, 
        [bool]$OverWrite 
    ) 
    
    $AzureSubscriptionForWorkflow = Get-AzureSubscription 

    $AzureBlob = Get-AzureStorageBlob -Container $StorageContainerName -Blob $VHDName -ErrorAction SilentlyContinue 
    
    if(!$AzureBlob -or $OverWrite) { 

        $AzureBlob = Add-AzureVhd -LocalFilePath $SourceVHDPath -Destination $DestinationBlobURI -OverWrite:$OverWrite 
    } 

    Return $AzureBlob 

}

$GenericStorageContainerName = "vhds"

$SourceDiskName = "toWindowsAzure"
$SourceDiskFileExt = "vhd"
$SourceDiskPath = "D:\Drop\Azure\toAzure"
$SourceVHDName = "{0}.{1}" -f $SourceDiskName,$SourceDiskFileExt
$SourceVHDPath = "{0}\{1}" -f $SourceDiskPath,$SourceVHDName

$DestinationVHDName = "{0}.{1}" -f $ProjectName,$SourceDiskFileExt
$DestinationVHDPath = "https://{0}.blob.core.windows.net/{1}" -f $StorageAccountName,$GenericStorageContainerName
$DestinationBlobURI = "{0}/{1}" -f $DestinationVHDPath,$DestinationVHDName
$OverWrite = $false

Select-AzureSubscription -SubscriptionName $AzureSubscriptionName
Set-AzureSubscription -SubscriptionName $AzureSubscriptionName -CurrentStorageAccount $StorageAccountName

$AzureBlobUploadJob = Upload-LocalVHDtoAzure -StorageContainerName $GenericStorageContainerName -VHDName $DestinationVHDName `
    -SourceVHDPath $SourceVHDPath -DestinationBlobURI $DestinationBlobURI -OverWrite $OverWrite -AsJob
Receive-Job -Job $AzureBlobUploadJob -AutoRemoveJob -Wait -WriteEvents -WriteJobInResults

Note     This is just one method of uploading a VHD to Azure for a specified Storage Account. I have parameterized the entire script so it could be run from the command line as a PS1 file. Obviously you can do with this as you please.
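
For example, a command-line invocation might look like this (all parameter values are placeholders):

.\Upload-LocalVHDtoAzure.ps1 -AzureSubscriptionName 'MySubscription' `
    -ProjectName 'SimpleProject' -StorageAccountName 'simpleprojectabc123'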

 


Testing and Proof of Execution

I figured you might want to see the results of my testing during my development of the Provision Azure Environment Resources example…so here are some screen captures from the Azure Automation interface:

Dashboard


Runbooks


Assets


Azure All Items View

You know, to prove that I created something with these scripts…


Building Distributed Node.js Apps with Azure Service Bus Queue

Azure Service Bus Queues provide queue-based, brokered messaging communication between apps, which lets developers build distributed apps on the cloud and for hybrid cloud environments. An Azure Service Bus Queue provides First In, First Out (FIFO) messaging infrastructure. Service Bus Queues can be leveraged for communicating between apps, whether the apps are hosted on the cloud or on on-premises servers.


Service Bus Queues are primarily used for distributing application logic across multiple apps. For example, say we are building an order processing app with a frontend web app for receiving orders from customers, and we want to move the order processing logic into a backend service where it can run asynchronously. In this scenario, we can use an Azure Service Bus Queue: the frontend app creates an order processing message in the Service Bus Queue, and the backend order processing app receives the message from the Queue and processes the request efficiently. This approach also enables better scalability, as the frontend app and backend app can be scaled up separately.

For this sample scenario, we can deploy the frontend app onto an Azure Web Role and the backend app onto an Azure Worker Role, and scale up the Web Role and Worker Role apps separately. We can also use Service Bus Queues for hybrid cloud scenarios, communicating between apps hosted on the cloud and on on-premises servers.

Using Azure Service Bus Queues in Node.js Apps

In order to work with Azure Service Bus, we need to create a Service Bus namespace from the Azure portal.


After choosing the Service Bus namespace, we can take its connection information from the Connection Information tab in the bottom section.


Creating the Service Bus Client

First, we need to install the npm module azure to work with Azure services from a Node.js app.

npm install azure

The code block below creates a Service Bus client object using the Node.js module azure.

var azure = require('azure');
var config=require('./config');

var serviceBusClient = azure.createServiceBusService(config.sbConnection);

We create the Service Bus client object by using the createServiceBusService method of azure. In the above code block, we pass the Service Bus connection info from a config file. The azure module can also read the environment variables AZURE_SERVICEBUS_NAMESPACE and AZURE_SERVICEBUS_ACCESS_KEY for the information required to connect with Azure Service Bus, in which case we can call the createServiceBusService method without specifying the connection information.
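
For example, before launching the app you could set those variables in a PowerShell session (values are placeholders):

$env:AZURE_SERVICEBUS_NAMESPACE  = 'yournamespace'
$env:AZURE_SERVICEBUS_ACCESS_KEY = 'your-access-key'
node app.js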

Creating a Service Bus Queue

The createQueueIfNotExists method of the Service Bus client object returns the queue if it already exists, or creates a new queue if it does not.

var azure = require('azure');
var config=require('./config');
var queue = 'ordersqueue';

var serviceBusClient = azure.createServiceBusService(config.sbConnection);

function createQueue() {
 serviceBusClient.createQueueIfNotExists(queue,
 function(error){
    if(error){
        console.log(error);
    }
    else
    {
        console.log('Queue ' + queue+ ' exists');
    }
});
}

Sending Messages to the Service Bus Queue

The sendMessage function below sends a given message to the Service Bus Queue.

function sendMessage(message) {
    serviceBusClient.sendQueueMessage(queue,message,
 function(error) {
        if (error) {
            console.log(error);
        }
        else
        {
            console.log('Message sent to queue');
        }
    });
}

The following code creates the queue and sends a message to it by calling the createQueue and sendMessage methods we created in the previous steps.

createQueue();
var orderMessage={
 "OrderId":101,
 "OrderDate": new Date().toDateString()
};
sendMessage(JSON.stringify(orderMessage));

We create a JSON object with properties OrderId and OrderDate and send this to the Service Bus Queue. We can send these messages to Queue for communicating with other apps where the receiver apps can read the messages from Queue and perform the application logic based on the messages we have provided.

Receiving Messages from the Service Bus Queue

Typically, we receive the Service Bus Queue messages from a backend app. The code block below receives a message from the Service Bus Queue and extracts information from the JSON data.

var azure = require('azure');
var config=require('./config');
var queue = 'ordersqueue';

var serviceBusClient = azure.createServiceBusService(config.sbConnection);
function receiveMessages() {
    serviceBusClient.receiveQueueMessage(queue,
      function (error, message) {
        if (error) {
            console.log(error);
        } else {
            var order = JSON.parse(message.body);
            console.log('Processing Order# ' + order.OrderId
                + ' placed on ' + order.OrderDate);
        }
    });
}

By default, messages are deleted from the Service Bus Queue after being read. This behaviour can be changed by specifying the optional parameter isPeekLock as true, as shown in the code block below.

function receiveMessages() {
    serviceBusClient.receiveQueueMessage(queue, { isPeekLock: true },
      function (error, message) {
        if (error) {
            console.log(error);
        } else {
            var order = JSON.parse(message.body);
            console.log('Processing Order# ' + order.OrderId
                + ' placed on ' + order.OrderDate);
            // Delete the original brokered message once it has been processed.
            serviceBusClient.deleteMessage(message, function (deleteError) {
                if (!deleteError) {
                    console.log('Message deleted from Queue');
                }
            });
        }
    });
}

Here the message is not automatically deleted from the Queue, so we can explicitly delete it after reading it and successfully executing the application logic.

Architecture and Practical Application – BizTalk Adapter for mySAP Business Suite

Architecture for BizTalk Adapter for mySAP Business Suite


The Microsoft BizTalk Adapter for mySAP Business Suite implements a Windows Communication Foundation (WCF) custom binding, which contains a single custom transport binding element that enables communication with an SAP system.


The SAP adapter is wrapped by the Microsoft Windows Communication Foundation (WCF) Line of Business (LOB) Adapter SDK runtime and is exposed to applications through the WCF channel architecture. The SAP adapter communicates with the SAP system through either the 64-bit or 32-bit version of the SAP Unicode RFC SDK (librfc32u.dll).

The following figure shows the end-to-end architecture for solutions that are developed by using the SAP adapter.

SAP End-to-End Architecture

Consuming the Adapter

The SAP adapter exposes the SAP system as a WCF service to client applications. Client applications exchange SOAP messages with the SAP adapter through WCF channels to perform operations and to access data on the SAP system. The preceding figure shows the ways in which the SAP adapter can be consumed.

• Through a WCF channel application that performs operations on the SAP system by using the WCF channel model to exchange SOAP messages directly with the SAP adapter. For more information about developing solutions for the SAP adapter by using WCF channel model programming, see Developing Applications by Using the WCF Channel Model.

• Through a WCF service model application that calls methods on a WCF client to perform operations on the SAP system. A WCF client models the operations exposed by the SAP adapter as .NET methods. You can use the Microsoft Windows Communication Foundation (WCF) Line of Business (LOB) Adapter SDK or the svcutil.exe tool to create a WCF client class from metadata exposed by the SAP adapter. For more information about WCF service model programming and the SAP adapter, see Developing Applications by Using the WCF Service Model.

• Through a BizTalk port that is configured to use the BizTalk WCF-Custom adapter with the SAP Binding configured as the binding for the WCF-Custom transport type in a BizTalk Server application. The BizTalk WCF-Custom adapter enables communication between a BizTalk Server application and a WCF service.

The BizTalk WCF-Custom adapter supports custom WCF bindings through its WCF-Custom transport type, which enables you to configure any WCF binding exposed to the configuration system as the binding used by the BizTalk WCF-Custom adapter. For more information about how to use the SAP adapter in BizTalk Server solutions, see Developing BizTalk Applications. BizTalk transactions are supported by the BizTalk Layered Channel binding element which can be loaded by setting a binding property on the SAP Binding.

• Through an IIS-hosted Web service. In this scenario, the SAP adapter is exposed through a WCF Service proxy, which is hosted in IIS by using one of the standard WCF HTTP bindings.

• Through the .NET Framework Data Provider for mySAP Business Suite. The Data Provider for SAP runs on top of the SAP adapter and provides an ADO.NET interface to an SAP system.

The SAP adapter and the SAP RFC library are always hosted in-process with the application or service that consumes the adapter.

The SAP Adapter and WCF

WCF presents a programming model based on the exchange of SOAP messages over channels between clients and services. These messages are sent between endpoints exposed by a communicating client and service.

An endpoint consists of an endpoint address which specifies the location at which messages are received, a binding which specifies the communication protocols used to exchange messages, and a contract which specifies the operations and data types exposed by the endpoint.

A binding consists of one or more binding elements that stack on top of each other to define how messages are exchanged with the endpoint.

 

At a minimum, a binding must specify the transport and encoding used to exchange messages with the endpoint. Message exchange between endpoints occurs over a channel stack that is composed of one or more channels. Each channel is a concrete implementation of one of the binding elements in the binding configured for the endpoint.

For more information about WCF and the WCF programming model, see “Windows Communication Foundation” at http://go.microsoft.com/fwlink/?LinkId=89726.

The Microsoft BizTalk Adapter for mySAP Business Suite exposes a WCF custom binding, the SAP Binding (Microsoft.Adapters.SAP.SAPBinding). By default, this binding contains a single custom transport binding element, the SAP Adapter Binding Element (Microsoft.Adapters.SAP.SAPAdapter), which enables operations on an SAP system. When using the SAP adapter with BizTalk Server, you can set the EnableBizTalkCompatibilityMode binding property to load a custom binding element, the BizTalk Layered Channel Binding Element, on top of the SAP Adapter Binding Element. The BizTalk Layered Channel Binding Element is implemented internally by the SAP adapter and is not exposed outside the SAP Binding.
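
As a rough sketch, constructing the binding and enabling this mode from PowerShell might look like the following; only the class and property names are taken from the text above, and the assembly's full name is an assumption to adjust for your installation:

# Load the adapter assembly (the full name below is an assumption).
Add-Type -AssemblyName 'Microsoft.Adapters.SAP, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'
$binding = New-Object Microsoft.Adapters.SAP.SAPBinding
# Loads the BizTalk Layered Channel Binding Element on top of the SAP Adapter Binding Element.
$binding.EnableBizTalkCompatibilityMode = $true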

Microsoft.Adapters.SAP.SAPBinding (the SAP Binding) and Microsoft.Adapters.SAP.SAPAdapter (the SAP Adapter Binding Element) are public classes and are also exposed to the configuration system. Because the SAP Adapter Binding Element is exposed publicly, you can build your own custom WCF bindings capable of extending the functionality of the SAP adapter. For example, you could implement a custom binding to support Enterprise Single Sign-On (SSO) in a WCF channel or a WCF service model programming solution, to aggregate database operations into a single multifunction operation, or to perform schema transformation between operations implemented by a custom application and operations on the SAP system.

The SAP adapter is built on top of the Microsoft Windows Communication Foundation (WCF) Line of Business (LOB) Adapter SDK and runs on top of the WCF LOB Adapter SDK runtime. The WCF LOB Adapter SDK provides a software framework and tooling infrastructure that the SAP adapter leverages to provide a rich set of features to users and adapter clients.

The Connection to the SAP System

The SAP adapter connects with the SAP system through the SAP Unicode RFC SDK Library (librfc32u.dll). The SAP adapter supports both the 32-bit and 64-bit versions of the SAP RFC SDK. The SAP RFC SDK enables external programs to call ABAP functions on an SAP system.

You establish a connection to an SAP system by providing a connection URI to the SAP adapter. The SAP adapter supports the following kinds of connections to an SAP system (illustrative URI examples follow the list):
• An application host–based connection (A), in which the SAP adapter connects directly to an SAP application server.

• A load balancing connection (B), in which the SAP adapter connects to an SAP messaging server.

• A destination-based connection (D), in which the connection to the SAP system is specified by a destination in the saprfc.ini configuration file. A, B, and R type connections are supported.

• A listener connection (R), in which the adapter receives RFCs, tRFC and IDOCs through an RFC Destination on the SAP system that is specified by a listener host, a listener gateway service, and a listener program ID, either directly in the connection URI or by an R-based destination in the saprfc.ini configuration file.
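
The illustrative URI shapes below are approximations from memory; all hosts, numbers, and names are placeholders, so consult the adapter documentation for the exact grammar:

sap://CLIENT=800;LANG=EN@a/YourAppServerHost/00                    # A: application host
sap://CLIENT=800;LANG=EN@b/YourMessageServerHost/GroupName/R3Name  # B: load balancing
sap://CLIENT=800;LANG=EN@d/DestinationName                         # D: saprfc.ini destination
sap://CLIENT=800;LANG=EN@r/ListenerDestination                     # R: listener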


So – How Do I Use a Custom Web Part?


This section provides information about using a custom Web Part with Microsoft Office SharePoint Server. To use a custom Web Part, you must do the following:

1. Create a custom Web Part
2. Deploy the custom Web Part to a SharePoint portal
3. Configure the SharePoint portal to use the custom Web Part

Before You Begin

Before you create a custom Web Part:
• Publish the SAP artifacts as a WCF service. For more information, see Step 1: Publish the SAP Artifacts as a WCF Service in Tutorial 1: Presenting Data from an SAP System on a SharePoint Site.

• Create an application definition file for the SAP artifacts using the Business Data Catalog in Microsoft Office SharePoint Server. For more information, see Step 2: Create an Application Definition File for the SAP Artifacts in Tutorial 1: Presenting Data from an SAP System on a SharePoint Site.

Step 1: Create a custom Web Part

To create a custom Web Part using Visual Studio, do the following:

1. Start Visual Studio, and then create a project.
2. In the New Project dialog box, from the Project types pane, select Visual C#. From the Templates pane, select Class Library.
3. Specify a name and location for the solution. For this topic, specify CustomWebPart in the Name and Solution Name boxes. Specify a location, and then click OK.
4. Add a reference to the System.Web component to the project. Right-click the project name in Solution Explorer, and then click Add Reference. In the Add Reference dialog box, select System.Web in the .NET tab, and then click OK. The System.Web component contains the required System.Web.UI.WebControls.WebParts namespace.
5. Add the required code, based on your issue, to the project. For the code sample that is relevant to a certain issue, see “Issues Involving Custom Web Parts” in Considerations While Using the SAP Adapter with Microsoft Office SharePoint Server.
6. Build the project. On a successful build, a .dll file, CustomWebPart.dll, will be generated in the /bin/Debug folder.
7. Only for 64-bit computers: Sign the CustomWebPart.dll file with a strong name before performing the following steps. Otherwise, you will not be able to import, and hence use, CustomWebPart.dll in the SharePoint portal in “Step 3: Configure the SharePoint portal to use the custom Web Part.” For information about how to sign an assembly with a strong name, see http://go.microsoft.com/fwlink/?LinkId=197171.

Step 2: Deploy the custom Web Part to a SharePoint portal

You must do the following to make the CustomWebPart.dll file (custom Web Part) that is created in “Step 1: Create a custom Web Part” of this topic usable on the SharePoint portal:

1. Copy the CustomWebPart.dll file to the bin folder of the SharePoint portal. Microsoft Office SharePoint Server creates portals under the :\Inetpub\wwwroot\wss\VirtualDirectories folder. A folder is created for each portal and can be identified by its port number. You must copy the CustomWebPart.dll file created in “Step 1: Create a custom Web Part” of this topic to that portal’s bin folder. For example, if the port number of your SharePoint portal is 13614, you must copy the CustomWebPart.dll file to the :\Inetpub\wwwroot\wss\VirtualDirectories\13614\bin folder.

Tip: Another way to find the folder location of your SharePoint portal is by using the Internet Information Services (IIS) Manager window (Start > Run > inetmgr). Locate your SharePoint portal in the Internet Information Services (IIS) Manager window ([computer_name] > Web Sites > [Portal-Name]), right-click, and then click Properties on the shortcut menu. In the properties dialog box of the SharePoint portal, click the Home Directory tab, and then look at the Local path box.

2. Add the safe control entry in the web.config file. Because the CustomWebPart.dll file will be used on different computers and by multiple users, you must declare the file as “safe.” To do so, open the web.config file located in the SharePoint portal folder under :\Inetpub\wwwroot\wss\VirtualDirectories. Under the <SafeControls> section of the web.config file, add a safe control entry of the following form; the exact attribute values depend on your assembly, and on a 64-bit computer the entry must carry the strong-name details of your signed assembly (the public key token is left as a placeholder here):

On a 32-bit computer:

<SafeControl Assembly="CustomWebPart" Namespace="CustomWebPart" TypeName="*" Safe="True" />

On a 64-bit computer:

<SafeControl Assembly="CustomWebPart, Version=1.0.0.0, Culture=neutral, PublicKeyToken=[your token]" Namespace="CustomWebPart" TypeName="*" Safe="True" />

Save the web.config file, and then close it.
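
A PowerShell sketch of this deployment, assuming a portal on port 13614 and the C: system drive (both placeholders):

# Copy the Web Part assembly into the portal's bin folder.
Copy-Item -Path '.\bin\Debug\CustomWebPart.dll' `
          -Destination 'C:\Inetpub\wwwroot\wss\VirtualDirectories\13614\bin'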

Step 3: Configure the SharePoint portal to use the custom Web Part

You need to add the custom Web Part to the Microsoft Office SharePoint Server Web Part Gallery so that you can use it on your SharePoint portal. To do so:

1. Start SharePoint 3.0 Central Administration. Click Start, point to All Programs, point to Microsoft Office Server, and then click SharePoint 3.0 Central Administration.
2. In the left navigation pane, click the name of the Shared Service Provider (SSP) to which you want to add the custom Web Part.
3. On the Shared Services Administration page, in the upper-right corner, click Site Actions, and then click Create.
4. On the Site Settings page, click Web Parts under the Galleries column.
5. On the Web Part Gallery page, to add the custom Web Part to the gallery, click New. At this point the custom Web Part is not yet available on the Web Part Gallery page.
6. On the New Web Parts page, locate CustomWebPart (the name of the custom Web Part) in the list, select the check box on the left, and then click Populate Gallery at the top of the page. This adds the CustomWebPart entry to the Web Part Gallery page.

Now you can use the custom Web Part (CustomWebPart) to create Web Parts in your SharePoint portal. The custom Web Part (CustomWebPart) will appear under the Miscellaneous section on the Add Web Parts page.

     

BizTalk Adapter for mySAP Business Suite and the WCF LOB Adapter SDK

The Microsoft BizTalk Adapter for mySAP Business Suite implements a set of core components that leverage functionality provided by the Microsoft Windows Communication Foundation (WCF) Line of Business (LOB) Adapter SDK and provide connectivity to the SAP system through the SAP Unicode RFC SDK Library (librfc32u.dll).

The WCF LOB Adapter SDK serves as the software layer through which the SAP adapter interfaces with Windows Communication Foundation (WCF), and the RFC SDK serves as the layer through which the SAP adapter interfaces with the SAP system.

The following figure shows the relationships between the internal components of the SAP adapter and between these components and the RFC SDK.

The relationship of internal adapter components


SAP Weekend : Part 2 – Using the Microsoft BizTalk Server for B2B Integration with SharePoint

This is Part 2 of my past weekend’s activities with SharePoint and SAP integration methods.

In this post I am looking at how to use the BizTalk Adapter with SharePoint.

Topics

• Abstract
• Goal
• Business Scenario
• Environment
• Document Flow
• Integration Steps
• .NET Support
• Summary

     

Abstract

In the past few years, the whole perspective of doing business has shifted towards implementing Enterprise Resource Planning systems for key areas like marketing, sales, and manufacturing operations. Today most large organizations that deal with all major world markets rely heavily on such systems.

An organization’s operational information comes from its worldwide network of marketing teams as well as from its manufacturing and distribution operations. In order to provide customers with accurate information, each of these systems needs to be integrated as part of the larger enterprise.

This ultimately results in a more efficient enterprise overall, providing more reliable information and better customer service. This paper addresses the integration of BizTalk Server and an Enterprise Resource Planning system, the need for their integration, and their role in the current e-business scenario.

     

Goal

There are several key business drivers, such as customers and partners, that need to communicate on different fronts for a successful business relationship. To achieve this communication, various systems need to be integrated, which leads to evaluating and developing a B2B integration capability and e-business strategy. This improves the quality of business information at the organization’s disposal, improving delivery times and costs and offering customers a higher level of overall service.

To provide B2B capabilities, there is a need to give access to the business application data, providing partners with the ability to execute global business transactions. Facing internal integration and business-to-business (B2B) challenges on a global scale, an organization needs to look for the right solution.

To integrate worldwide marketing, manufacturing, and distribution facilities based on a core ERP system with a variety of information systems, an organization needs a strategic deployment of integration technology products and integration service capabilities.

     

Business Scenario

Now take the example of ABC Manufacturing Company, whose success rests on the strength of its Europe-wide trading relationships. The company recognizes the need to strengthen these relationships by processing orders faster and more efficiently than ever before.

The company needed a new platform that could integrate orders from several countries, accepting payments in multiple currencies and translating measurements according to each country’s standards. The bottom line for ABC’s e-strategy was to accelerate order processing. To achieve this, the basic necessity was to eliminate multiple collections of the same data and the use of invalid data.

By using less paper, ABC would cut processing costs and speed up the information flow. Keeping this long-term goal in mind, ABC Manufacturing Company can now think of integrating its four key countries into a new business-to-business (B2B) platform.

     

Here is another example, XYZ Marketing Company. Users visit this company’s website to explore a variety of products; the company serves thousands of customers all over the world. The company always understood that it could offer greater benefits to customers if it could more efficiently integrate its customers’ back-end systems. With such integration, customers could enjoy the advantages of highly efficient e-commerce sites, where a visitor on the Web could place an order that would flow smoothly from the website into the customer’s order entry system.

Some of those back-end order entry systems are built on the latest, most sophisticated enterprise resource planning (ERP) systems on the market, while others are built on legacy systems that have never been upgraded. Different customers require information formatted in different ways, but XYZ has no elegant way to transform the information coming out of the website to meet customer needs. With the traditional approach, for each new e-commerce customer on the site, XYZ’s staff must spend significant amounts of time creating a transformation application to facilitate the exchange of information. A better approach is a robust messaging solution that provides the flexibility and agility to meet a range of customer needs quickly and effectively. XYZ, too, can think of integrating its customers’ back-end systems with the help of a business-to-business (B2B) platform.

     

Environment

Many large-scale organizations maintain a centralized SAP environment as their core enterprise resource planning (ERP) system. The SAP system is used for the management and processing of all global business processes and practices. B2B integration mainly relies on asynchronous messaging, Electronic Data Interchange (EDI), and XML document transformation mechanisms to facilitate the transformation and exchange of information between the ERP system and other applications, including legacy systems.

For business document routing, transformation, and tracking, the existing SAP-XML/EDI technology road map needs an XML service engine. This allows the development of a complex set of mappings to and from SAP to meet internal and external XML/EDI technology and business strategy. Microsoft BizTalk Server is the best choice to handle the data interchange and mapping requirements: BizTalk Server has the most comprehensive development and management support among business-to-business platforms. Microsoft BizTalk Server and BizTalk XML Framework version 2.0, with Simple Object Access Protocol (SOAP) version 1.1, provide precisely the kind of messaging solution needed to facilitate the integration in a cost-effective manner.

     

    Document Flow

    Friends, now let’s look at the actual flow of document from Source System to Customer Target System using BizTalk Server. When a document is created, it is sent to a TCP/IP-based Application Linking and Enabling (ALE) port—a BizTalk-based receive function that is used for XML conversion. Then the document passes the XML to a processing script (VBScript) that is running as a BizTalk Application Integration Component (AIC). The following figure shows how BizTalk Server acts as a hub between applications that reside in two different organizations:

    The data is serialized to the customer/vendor XML format using the Extensible Stylesheet Language Transformations (XSLT) generated from the BizTalk Mapper using a BizTalk channel. The XML document is sent using synchronous Hypertext Transfer Protocol Secure (HTTPS) or another requested transport protocol such as the Simple Mail Transfer Protocol (SMTP), as specified by the customer.

    The following figure shows steps for XML document transformation:

    The serialized XML result is passed back to the processing script that is running as a BizTalk AIC. An XML “receipt” document is then created and submitted to another BizTalk channel, which serializes the XML status document into an SAP IDOC status message. Finally, a Remote Function Call (RFC) is triggered to the SAP instance/client using a compiled C++/VB program to update the SAP IDOC status record, and a complete loop of document reconciliation is achieved. If the status is not successful, an e-mail message is created and sent to one of the Support Teams that own the customer/vendor business XML/EDI transactions so that the conflict can be resolved. All of this happens instantaneously in a completely event-driven infrastructure between SAP and BizTalk.

    Integration Steps

    Let’s use a very popular order entry and tracking scenario to discuss the integration. The following sections describe the high-level steps required to transmit order information from an Order Processing pipeline component into the SAP/R3 application, and to receive order status update information from the SAP/R3 application.

    The integration of AFS purchase order reception with SAP is achieved using the BizTalk Adapter for SAP (BTS-SAP). The BizTalk Adapter uses the IDOC handler to provide the transactional support for bridging tRFC (Transactional Remote Function Calls) to the MSMQ Distributed Transaction Coordinator (DTC). The IDOC handler is a COM object that processes IDOC documents sent from SAP through the Com4ABAP service and ensures their successful arrival at the appropriate MSMQ destination. The handler supports the methods defined by the SAP tRFC protocol. When integrating purchase order reception with the SAP/R3 application, BizTalk Server (BTS) provides the transformation and messaging functionality, and the BizTalk Adapter for SAP provides the transport and routing functionality.

    The following two sequential steps show how the whole integration takes place:

    • Purchase Order Reception Integration
    • Order Status Update Integration

    Purchase Order Reception Integration

    1. Suppose a new pipeline component is added to the Order Processing pipeline. This component creates an XML document that is equivalent to the OrderForm object that is passed through the pipeline. This XML purchase order is in Commerce Server Order XML v1.0 format and, once created, is sent to a special Microsoft Message Queue (MSMQ) queue created specifically for this purpose.

       Writing the order from the pipeline to MSMQ:

      The first step in sending order data to the SAP/R3 application involves building a new pipeline component to run within the Order Processing pipeline. This component must perform the following two tasks:

      A] Make an XML-formatted copy of the OrderForm object that is passing through the order processing pipeline. The GenerateXMLForDictionaryUsingSchema method of the DictionaryXMLTransforms object is used to create the copy.

      Private Function IPipelineComponent_Execute(ByVal objOrderForm As Object, _
          ByVal objContext As Object, ByVal lFlags As Long) As Long
      
      On Error GoTo ERROR_Execute
      
      Dim oXMLTransforms As Object
      Dim oXMLSchema As Object
      Dim oOrderFormXML As Object
      
      ' Return 1 for Success.
      IPipelineComponent_Execute = 1
      
      ' Create a DictionaryXMLTransforms object.
      Set oXMLTransforms = CreateObject("Commerce.DictionaryXMLTransforms")
      
      ' Create a PO schema object.
      Set oXMLSchema = oXMLTransforms.GetXMLFromFile(sSchemaLocation)
      
      ' Create an XML version of the order form.
      Set oOrderFormXML = oXMLTransforms.GenerateXMLForDictionaryUsingSchema _
          (objOrderForm, oXMLSchema)
      
      WritePO2MSMQ sQueueName, oOrderFormXML.xml, PO_TO_ERP_QUEUE_LABEL, _
          sBTSServerName, AFS_PO_MAXTIMETOREACHQUEUE
      
      Exit Function
      
      ERROR_Execute:
      App.LogEvent "QueuePO.CQueuePO -> Execute Error: " & _
      vbCrLf & Err.Description, vbLogEventTypeError
      
      ' Set warning level.
      IPipelineComponent_Execute = 2
      Resume Next
      
      End Function

      B] Send the newly created XML order document to the MSMQ queue defined for this purpose.

      Option Explicit
      
      ' MSMQ constants.
      
      ' Access modes.
      Const MQ_RECEIVE_ACCESS = 1
      Const MQ_SEND_ACCESS = 2
      Const MQ_PEEK_ACCESS = 32
      
      ' Sharing modes.
      Const MQ_DENY_NONE = 0
      Const MQ_DENY_RECEIVE_SHARE = 1
      
      ' Transaction options.
      Const MQ_NO_TRANSACTION = 0
      Const MQ_MTS_TRANSACTION = 1
      Const MQ_XA_TRANSACTION = 2
      Const MQ_SINGLE_MESSAGE = 3
      
      ' Error codes.
      Const MQ_ERROR_QUEUE_NOT_EXIST = -1072824317
      
      ' Message acknowledgement flags.
      Const MQMSG_ACKNOWLEDGMENT_FULL_REACH_QUEUE = 5
      Const MQMSG_ACKNOWLEDGMENT_FULL_RECEIVE = 14
      
      ' Default MaxTimeToReachQueue value, in seconds.
      Const DEFAULT_MAX_TIME_TO_REACH_QUEUE = 20
      
      Function WritePO2MSMQ(sQueueName As String, sMsgBody As String, _
          sMsgLabel As String, sServerName As String, _
          Optional MaxTimeToReachQueue As Variant) As Long
      
      Dim lMaxTime As Long
      
      If IsMissing(MaxTimeToReachQueue) Then
      lMaxTime = DEFAULT_MAX_TIME_TO_REACH_QUEUE
      Else
      lMaxTime = MaxTimeToReachQueue
      End If
      
      Dim objQueueInfo As MSMQ.MSMQQueueInfo
      Dim objQueue As MSMQ.MSMQQueue, objAdminQueue As MSMQ.MSMQQueue
      Dim objQueueMsg As MSMQ.MSMQMessage
      
      On Error GoTo MSMQ_Error
      
      Set objQueueInfo = New MSMQ.MSMQQueueInfo
      objQueueInfo.FormatName = "DIRECT=OS:" & sServerName & "\PRIVATE$\" & sQueueName
      
      Set objQueue = objQueueInfo.Open(MQ_SEND_ACCESS, MQ_DENY_NONE)
      
      Set objQueueMsg = New MSMQ.MSMQMessage
      
      objQueueMsg.Label = sMsgLabel ' Set the message label property
      objQueueMsg.Body = sMsgBody ' Set the message body property
      objQueueMsg.Ack = MQMSG_ACKNOWLEDGMENT_FULL_REACH_QUEUE
      objQueueMsg.MaxTimeToReachQueue = lMaxTime
      
      objQueueMsg.send objQueue, MQ_SINGLE_MESSAGE
      
      objQueue.Close
      
      On Error Resume Next
      Set objQueueMsg = Nothing
      Set objQueue = Nothing
      Set objQueueInfo = Nothing
      
      Exit Function
      
      MSMQ_Error:
      App.LogEvent "Error in WritePO2MSMQ: " & Error
      Resume Next
      
      End Function
      
    2. A BTS MSMQ receive function picks up the document from the MSMQ queue and sends it to a BTS channel that has been configured for this purpose. Receiving the XML order from MSMQ: The second step in sending order data to the SAP/R3 application involves BTS receiving the order data from the MSMQ queue into which it was placed at the end of the first step. You must configure a BTS MSMQ receive function to monitor the MSMQ queue to which the XML order was sent in the previous step. This receive function forwards the XML message to the configured BTS channel for transformation.
    3. The third step in sending order data to the SAP/R3 application involves BTS transforming the order data from Commerce Server Order XML v1.0 format into ORDERS01 IDOC format. A BTS channel must be configured to perform this transformation. After the transformation is complete, the BTS channel sends the resulting ORDERS01 IDOC message to the corresponding BTS messaging port. The BTS messaging port is configured to send the transformed message to an MSMQ queue called the 840 Queue. Once the message is placed in this queue, the BizTalk Adapter for SAP is responsible for further processing. 
    4. BizTalk Adapter for SAP sends the ORDERS01 document to the DCOM Connector (get more information on the DCOM Connector from www.sap.com/bapi), which writes the order to the SAP/R3 application. The DCOM Connector is an SAP software product that provides a mechanism to send data to, and receive data from, an SAP system. When an IDOC message is placed in the 840 Queue, the DCOM Connector retrieves the message and sends it to SAP for processing. Although this processing is in the domain of the BizTalk Adapter for SAP, the steps involved are reviewed here as background information:
      • Determine the version of the IDOC schema in use and generate a BizTalk Server document specification.
      • Create a routing key from the contents of the Control Record of the IDOC schema.
      • Request a SAP Destination from the Manager Data Store given the constructed routing key.
      • Submit the IDOC message to the SAP System using the DCOM Connector 4.6D Submit functionality.

    Order Status Update Integration

    Order status update integration can be achieved by providing a mechanism for sending information about updates made within the SAP/R3 application back to the Commerce Server order system.

    The following sequence of steps describes such a mechanism:

    1. BizTalk Adapter for SAP processing:
      After a user has updated a purchase order using the SAP client, and the IDOC has been submitted to the appropriate tRFC port, the BizTalk Adapter for SAP uses the DCOM connector to send the resulting information to the 840 Queue, packaged as an ORDERS01 IDOC message. The 840 Queue is an MSMQ queue into which the BizTalk Adapter for SAP places IDOC messages so that they can be retrieved and processed by interested parties. This process is within the domain of the BizTalk Adapter for SAP, and is used by this solution to achieve the order update integration.
    2. Receiving the ORDERS01 IDOC message from MSMQ:
      The second step in updating order status from the SAP/R3 application involves BTS receiving ORDERS01 IDOC message from the MSMQ queue (840 Queue) into which it was placed at the end of the first step. You must configure a BTS MSMQ receive function to monitor the 840 Queue into which the XML order status message was placed. This receive function must be configured to forward the XML message to the configured BTS channel for transformation.
    3. Transforming the order update from IDOC format:
      Using a BTS MSMQ receive function, the document is retrieved and passed to a BTS transformation channel. The BTS channel transforms the ORDERS01 IDOC message into Commerce Server Order XML v1.0 format and then forwards it to the corresponding BTS messaging port. You must configure a BTS channel to perform this transformation. The following BizTalk Server (BTS) map, used in prototyping this solution, transforms an SAP ORDERS01 IDOC message into an XML document in Commerce Server Order XML v1.0 format. It allows a change to an order in the SAP/R3 application to be reflected in the Commerce Server orders database.

      The map used in the prototype only maps the order ID, demonstrating how the order in the SAP/R3 application can be synchronized with the order in the Commerce Server orders database. The mapping of other fields is specific to a particular implementation and was not done for the prototype.

    <xsl:stylesheet xmlns:xsl='http://www.w3.org/1999/XSL/Transform' 
    xmlns:msxsl='urn:schemas-microsoft-com:xslt' xmlns:var='urn:var' 
    xmlns:user='urn:user' exclude-result-prefixes='msxsl var user' 
    version='1.0'>
    <xsl:output method='xml' omit-xml-declaration='yes' />
    <xsl:template match='/'>
    <xsl:apply-templates select='ORDERS01'/>
    </xsl:template>
    <xsl:template match='ORDERS01'>
    <orderform>
    
    <!-- Connection from source node "BELNR" to destination node "OrderID" -->
    
    <xsl:if test='E2EDK02/@BELNR'>
    <xsl:attribute name='OrderID'>
    <xsl:value-of select='E2EDK02/@BELNR'/>
    </xsl:attribute>
    </xsl:if>
    </orderform>
    </xsl:template>
    </xsl:stylesheet>

    The BTS message port posts the transformed order update document to the configured ASP page for further processing. The configured ASP page retrieves the message posted to it and uses the Commerce Server OrderGroupManager and OrderGroup objects to update the order status information in the Commerce Server orders database.

    4. Updating the Commerce Server order system:
      The fourth step in updating order status from the SAP/R3 application involves updating the Commerce Server order system to reflect the change in status. This is accomplished by adding the page _OrderStatusUpdate.asp to the AFS Solution Site and configuring the BTS messaging port to post the transformed XML document to that page. The update is performed using the Commerce Server OrderGroupManager and OrderGroup objects.

    The routine ProcessOrderStatus is the primary routine in the page. It uses the DOM and XPath to extract enough information to find the appropriate order using the OrderGroupManager object. Once the correct order is located, it is loaded into an OrderGroup object so that any of the entries in the OrderGroup object can be updated as needed.

    The following code implements page _OrderStatusUpdate.asp:

    <%@ Language="VBScript" %>
    
    <%
    const TEMPORARY_FOLDER = 2
    
    call Main()
    
    Sub Main()
    call ProcessOrderStatus( ParseRequestForm() )
    End Sub
    
    Sub ProcessOrderStatus(sDocument)
    
    Dim oOrderGroupMgr 
    Dim oOrderGroup 
    Dim rs
    Dim sPONum
    Dim oAttr 
    Dim vResult
    Dim vTracking 
    Dim oXML
    Dim dictConfig
    Dim oElement
    
    Set oOrderGroupMgr = Server.CreateObject("CS_Req.OrderGroupManager")
    Set oOrderGroup = Server.CreateObject("CS_Req.OrderGroup")
    
    Set oXML = Server.CreateObject("MSXML.DOMDocument")
    oXML.async = False
    
    If oXML.loadXML (sDocument) Then
    
    ' Get the orderform element.
    Set oElement = oXML.selectSingleNode("/orderform")
    
    ' Get the poNum.
    sPONum = oElement.getAttribute("OrderID")
    
    Set dictConfig = Application("MSCSAppConfig").GetOptionsDictionary("")
    
    ' Use ordergroupmgr to find the order by OrderID.
    oOrderGroupMgr.Initialize (dictConfig.s_CatalogConnectionString)
    Set rs = oOrderGroupMgr.Find(Array("order_requisition_number='" & sPONum & "'"), _
        Array(""), Array(""))
    
    If rs.EOF And rs.BOF Then
    'Create a new one. - Not implemented in this version.
    Else
    ' Edit the current one.
    oOrderGroup.Initialize dictConfig.s_CatalogConnectionString, rs("User_ID")
    
    ' Load the found order.
    oOrderGroup.LoadOrder rs("ordergroup_id")
    
    ' For the purposes of prototype, we only update the status
    oOrderGroup.Value.order_status_code = 2 ' 2 = Saved order
    
    ' Save it
    vResult = oOrderGroup.SaveAsOrder(vTracking)
    
    End If
    Else
    WriteError "Unable to load received XML into DOM."
    End If
    
    End Sub
    
    Function ParseRequestForm()
    
    Dim PostedDocument
    Dim ContentType
    Dim CharSet
    Dim EntityBody
    Dim Stream
    Dim StartPos
    Dim EndPos
    
    ContentType = Request.ServerVariables( "CONTENT_TYPE" )
    
    ' Determine request entity body character set (default to us-ascii).
    CharSet = "us-ascii"
    StartPos = InStr( 1, ContentType, "CharSet=""", 1)
    If (StartPos > 0 ) then
    StartPos = StartPos + Len("CharSet=""")
    EndPos = InStr( StartPos, ContentType, """",1 )
    CharSet = Mid (ContentType, StartPos, EndPos - StartPos )
    End If
    
    ' Check for multipart MIME message.
    PostedDocument = ""
    
    if ( ContentType = "" or Request.TotalBytes = 0) then
    
    ' Content-Type is required as well as an entity body.
    Response.Status = "406 Not Acceptable"
    Response.Write "Content-type or Entity body is missing" & VbCrlf
    Response.Write "Message headers follow below:" & VbCrlf
    Response.Write Request.ServerVariables("ALL_RAW") & VbCrlf
    Response.End
    Else
    If ( InStr( 1, ContentType, "multipart/" ) > 0 ) Then
    ' The remainder of this function was truncated in the original post;
    ' it parses the entity body (read with Request.BinaryRead) into
    ' PostedDocument before the function returns it.
    End If
    End If
    
    ParseRequestForm = PostedDocument
    
    End Function
    %>

    .NET Support

    This multi-tier application environment can be implemented successfully with the help of a Web portal that utilizes the Microsoft .NET Enterprise Server model. The Microsoft BizTalk Server Toolkit for Microsoft .NET provides the ability to leverage the power of XML Web services and Visual Studio .NET to build dynamic, transaction-based, fault-tolerant systems with full access to existing applications.

    Summary

    Microsoft BizTalk Server can help organizations quickly establish and manage Internet relationships with other organizations. It makes it possible for them to automate document interchange with any other organization, regardless of the conversion requirements and data formats used. This provides a cost-effective approach for integrating business processes across large enterprise resource planning systems, with an integration process designed to facilitate collaborative e-commerce. The process includes a document interchange engine, a business process execution engine, and a set of business document and server management tools. In addition, business document editor and mapper tools are provided for managing trading partner relationships, administering server clusters, and tracking transactions.


    All my Web Parts and Apps are now making use of Knockout.JS !! Template also available at very low price!!

    After completing the development of my latest Web Part, the “List Search” Web Part, I decided to update all my Web Parts and Apps to use Knockout.JS, starting with the “List Search” Web Part.

    This topic came up when I looked at some of my older products that include generic list and library web parts, which display a few common fields like ID, Title, Description, File Url, etc. Prior to this, we solved similar issues with OOB list and library web parts with custom XSLT, by creating a Visual Studio web part for branding purposes only, or by using the Imtech content query web part (which is an XSLT solution by design).

    In the end, clients hated XSLT solutions, and I hated creating a new web part for every new list or library. That’s where Knockout popped up: why don’t we use Knockout for templates instead of XSLT?

    I’ll assume that whoever reads this article knows about creating a web part for SharePoint, SharePoint modules, JavaScript, and HTML, so I will not go into details.

    Background

    A bit about Knockout

    From the Knockout web site: “Knockout is a JavaScript library that helps you to create rich, responsive display and editor user interfaces with a clean underlying data model.”

    From Wikipedia:

    Knockout is a standalone JavaScript implementation of the Model-View-ViewModel pattern with templates. The underlying principles are therefore:

    • a clear separation between domain data, view components and data to be displayed
    • the presence of a clearly defined layer of specialized code to manage the relationships between the view components

    Knockout includes the following features:

    • Declarative bindings
    • Automatic UI refresh (when the data model’s state changes, the UI updates automatically)
    • Dependency tracking
    • Templating (using a native template engine although other templating engines can be used, such as jquery.tmpl)

    So what’s the deal?

    First you have your view model:

     var myViewModel = {
         personName: 'Bob',
         personAge: 123
    };

    Then you have a view:

    The name is <span data-bind="text:personName"></span>

    In the end, just bind your view to the model:

     ko.applyBindings(myViewModel);

    We’ll talk about model later.

    Using the code

    Proof of concept

    I’ve created an HTML mock of our web part. This is useful because we can prepare JavaScript, CSS files, models, and views in advance and test them without SharePoint and Visual Studio.

    You can download the proof of concept as a separate download from the link above.

    References

    There are only two file references.

    One is the Knockout library itself:

    <script type='text/javascript' src="http://knockoutjs.com/downloads/knockout-3.0.0.js"></script>

    and the other is the CSS file I’ve added to this project:

    <link href="css/controls.css" rel="stylesheet" type="text/css" />

    Model 

    I’ve designed the model as an Item class. Here it is:

    // Item class definition
    var Item = function (id, title, datecreated,url,description,thumbnail) {
       this.id = id;
       this.title = title;
       this.datecreated = datecreated;
       this.url=url;
       this.description=description;
       this.thumbnail=thumbnail;
    }

    It’s called Item and it has six properties:

    1. id – ID of the item
    2. title – Title of the item
    3. datecreated – Creation date of the item
    4. url – Url of the item
    5. description – Description of the item
    6. thumbnail – Thumbnail of the item

     

    View model

    Here is the view model

    function viewModel1 (){
        var self = this;
        self.items = [
         new Item(2, 'News1 title','21.10.2013','javascript:OpenDialog(2);'
                   ,'Description News 1','img/pic1.jpg'), 
         new Item(1, 'News 2 title','21.02.2013','javascript:OpenDialog(1);',
                   'Description News 2','img/pic2.jpg')
        ];
    }

    The view model has a property, items, which is in fact a collection of Item objects. For mocking purposes we’ve added two Item objects to this collection (News 1 and News 2).

     

    View

    Here is the view:

    <div class="glwp glwp-central" id="k1">
      <div class="glwpLine"></div>
      <h5><img src="PublishingImages/siteIcon.png" 
              width="28" height="28" align="absmiddle" />
          News</h5>
      <div class="glwpLineGrey"></div>
        <ul data-bind="foreach:items">
          <li>
           <div class="glwpDate"><span data-bind="text: datecreated" ></span>
           <img class="glwpImage" data-bind="attr: { src: thumbnail }" />         
           </div>
           <div class="glwpText glwpText-central" >
            <a data-bind="attr: { href: url, title: title }" style="min-height:70px;">
             <span class="glwpTextTTL" data-bind="text:title"></span><br />
             <span data-bind="text: description"></span>
            </a>
           </div>
           <div class="glwpSep"></div>
          </li>
        </ul>
    </div>

    What we have here:

    It’s pretty simple. We have an unordered list bound to our model. One <li> element is created for every item in our items collection (data-bind="foreach: items").

     

     

    Property binding: 

    • <span data-bind="text: datecreated"></span> – This is the simplest data binding. It writes the datecreated property of the Item object as the text of the span element (like <span>11/11/2013</span>).
    • <img class="glwpImage" data-bind="attr: { src: thumbnail }" /> – This is a bit more complicated binding. It takes the thumbnail property of the Item object and writes it to the src attribute of the img element.
    • <a data-bind="attr: { href: url, title: title }" style="min-height:70px;"> – It takes the url property and writes it as the href attribute of the a element, and the title property as the title attribute.
    • <span class="glwpTextTTL" data-bind="text:title"></span> – The title property is written as the text of the span element.
    • <span data-bind="text: description"></span> – The description property is written as the text of the span element.

    So anyone with a little knowledge of html and css can customize this template any way (s)he likes, as long as (s)he provides the required properties.

     

    Binding

    ko.applyBindings(viewModel1,document.getElementById('k1'));

    Note the second parameter in the applyBindings method: document.getElementById('k1'). The same id is on the first div in our view (id="k1"). This is helpful if you want to have more than one view model on one page; it tells Knockout to bind this specific model (viewModel1) to a specific template on our page (k1).

     

    What do we get from this? We are going to create a web part from this code, and one of the web part’s features is that you can put the same web part on a page several times. So it is possible to put one web part on a SharePoint page to display news and another to display projects or documents, and they will coexist.

    If you look at the source, you will notice that we have two view models (viewModel1 and viewModel2), two templates (k1 and k2), and, of course, two bindings. One binding is for news (with images and descriptions) and one is for files (no images and no descriptions). The templates are slightly different.

    Final result

    Here is the final result

    SharePoint Part

    As I said, I will assume that you have some experience with SharePoint development, so I will not explain how to create the project and add project items. The project type is the standard Visual Studio 2010 SharePoint Empty Project template.

    The SharePoint part consists of the following items:

    • Web part item – KnockoutWp. A standard SharePoint Visual Web Part project item.
    • Assets module. A SharePoint module project item. We are going to use it to deploy the images and css files (0.png – an empty container for images, and controls.css – the css file for our projects).
    • Layouts mapped folder. We’ll put the template editor page here.

    And here is the solution explorer for project:

    Assets

    We are going to deploy two files:

    • 0.png – 1×1 pixel transparent image aka placeholder
    • Controls.css – css file for our template

    Both of these items are going to be deployed to the Style Library of the SharePoint site collection, so content editors may change them later without redeploying the solution.

    The elements.xml file of the Assets module deploys both files to a wp folder in the Style Library, so our assets end up at http://oursitecollectionurl/Style Library/wp.

    KnockoutWp

    This is a Visual Studio 2010 Visual Web Part.

    It consists of four items:

    • KnockoutWp.cs – web part class
    • KnockoutWpUserControl – User control of our web part
    • KnockoutWp.webpart – web part xml file
    • Elements.xml – manifest file

    Properties

    The web part has the following properties:

    • ListUrl (string, required) – url of the list we are displaying.
    • TitleField (string, optional) – display name of the field that would be displayed as Title. If it’s blank Title field would be used.
    • DateField (string, optional) – display name of the field that would be displayed as date. If it’s blank Created field would be used.
    • DescriptionField (string, optional) – display name of the field that would be displayed as Description. If it’s blank it would be omitted.
    • ImageField (string, optional) – display name of the field that would be displayed as Thumbnail picture. If it’s blank it would be omitted.
    • NoOfItems (int) – how many items from the list would be displayed
    • ItemTemplate (string) – html template of the web part. Defines the look of our web part.
    • WpPosition (enum) – used for three-column layouts. The web part has styles for three zones: right, central, and left. The difference is in width, padding, and margin. Everything is set in css, so you can adapt it to your environment.
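    To give a sense of how such properties surface in the web part edit panel, here is a minimal sketch of how two of them might be declared in KnockoutWp.cs. The attribute values and property shapes are illustrative assumptions, not code from the original project:

    using System.ComponentModel;
    using System.Web.UI.WebControls.WebParts;

    public partial class KnockoutWp : WebPart
    {
        // Hypothetical declarations; the real KnockoutWp defines
        // all of the properties listed above.
        [WebBrowsable(true), WebDisplayName("List Url"),
         WebDescription("Url of the list we are displaying."),
         Personalizable(PersonalizationScope.Shared), Category("Settings")]
        public string ListUrl { get; set; }

        [WebBrowsable(true), WebDisplayName("No. of items"),
         WebDescription("How many items from the list are displayed."),
         Personalizable(PersonalizationScope.Shared), Category("Settings")]
        public int NoOfItems { get; set; }
    }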

    In the picture below, you can see the mapping between the web part’s field properties and list item fields.

     

    EditorPart

    I’ve added one more thing to this web part: an EditorPart class, GenericListPartEditorPart. I’m not going deep into editor parts, but here is some quick info: when you create a public property for a web part, it is automatically displayed in the web part edit panel.

    That is a great concept when you need simple properties such as strings, numbers, and short lists. For a more complicated scenario (as we have here for our web part), it’s not enough.

    What I wanted here is a template editor. The template could be reasonably large, so the idea was to have a button in the web part edit panel that opens a large dialog window with the editor. The user works with the template, clicks Apply, and the ItemTemplate web part property is updated.
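    As a rough sketch of that wiring (my approximation under the same assumptions as above, not the project’s actual code): the web part hands back the editor part from CreateEditorParts, and the editor part moves the ItemTemplate value between its UI and the web part in SyncChanges and ApplyChanges.

    using System.Collections.Generic;
    using System.Web.UI.WebControls.WebParts;

    public partial class KnockoutWp : WebPart
    {
        // Register our custom editor part with the web part edit panel.
        public override EditorPartCollection CreateEditorParts()
        {
            var parts = new List<EditorPart>
            {
                new GenericListPartEditorPart { ID = this.ID + "_templateEditor" }
            };
            return new EditorPartCollection(parts);
        }
    }

    public class GenericListPartEditorPart : EditorPart
    {
        // Copy the current ItemTemplate value from the web part into the editor UI.
        public override void SyncChanges()
        {
            EnsureChildControls();
            var wp = (KnockoutWp)WebPartToEdit;
            // e.g. load wp.ItemTemplate into the dialog-based template editor here.
        }

        // Push the edited template back into the web part.
        public override bool ApplyChanges()
        {
            EnsureChildControls();
            var wp = (KnockoutWp)WebPartToEdit;
            // e.g. assign the value returned by the template editor to wp.ItemTemplate.
            return true; // true means the values were applied successfully
        }
    }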

    KnockoutWpUserControl

    This is the user control created by Visual Studio when we added the Visual Web Part project item to the project. It consists of a markup .ascx file and a code-behind .ascx.cs file. We will put our markup and our C# code here.

    Markup

    Here is the complete markup:

    <script type='text/javascript' src="http://knockoutjs.com/downloads/knockout-3.0.0.js">
    </script>
    <style type="text/css">  @import url("/Style Library/wp/controls.css");  </style>  
    <div class="glwp glwp-<%=PositionClass %>" id="k<%=WpId %>">
      <div class="glwpLine"></div>      
      <h5><img src="<%=Icon %>" width="28" 
        height="28" align="absmiddle"><%=Title %></h5>
        <div class="glwpLineGrey"></div>      
      <asp:Literal ID="LitLayout" runat="server"></asp:Literal>
    </div>  
    
    <script type="text/javascript">    
      function OpenDialog(Url) {
        var options = SP.UI.$create_DialogOptions();        
        options.resizable = 1;        
        options.scroll = 1;        
        options.url = Url;
        SP.UI.ModalDialog.showModalDialog(options);    
    }         
    // Item class         
      var Item = function (id, title, datecreated,url,description,thumbnail) {            
         this.id = id;            
         this.title = title;
         this.datecreated = datecreated;
         this.url=url;
         this.description=description;
         this.thumbnail=thumbnail;
      }         
     //ViewModel goes here (It's created on the server)        
     <asp:Literal runat="server" ID="LitItems"></asp:Literal>
     
    //Function that opens Template editor. Used only in edit mode of web part       
     function portal_openTemplateEditor(wpid) {       
      var val="";              
      var options = SP.UI.$create_DialogOptions();              
      options.width = 600;             
      options.height = 500;                
      options.url = "/_layouts/KnockoutTemplate/TemplateEditor.aspx?c="+wpid;//"";
      options.dialogReturnValueCallback =
               Function.createDelegate(null,portal_openTemplateEditorClosedCallback);
      SP.UI.ModalDialog.showModalDialog(options);
    }
    </script>

    The first section of the markup (picture below) has the script reference (Knockout, on a remote server) and the style reference (controls.css in the local Style Library). Below that is HTML markup that defines the container of the web part (top and bottom borders, width, icon, and title). The markup is not the cleanest, because I was a little lazy and left some public properties in it. Note <%=PositionClass%>, <%=WpId%>, and so on.

    These are all public properties of the user control, used for presentation:

    • PositionClass – depending on the WpPosition web part property (right, central, or left), adds the appropriate css class to the markup, and that way defines the width, padding, and margin of the web part.
    • WpId – the guid of the web part. It is used to uniquely identify the web part, because we can put several web parts of the same type on a page, and everything would crash without this identifier.
    • Icon – the url of the icon displayed on the web part. The web part property Title Icon Image URL is used here (this is an OOB property).
    • Title – the title text of the web part, i.e. the text entered in the title area of the web part. The web part property Title is used here (this is an OOB property).

    The last interesting thing here is the Literal control LitLayout. This control holds our ItemTemplate property (the html template of our web part).

    The second section is a JavaScript function that opens a list item in a dialog window. It is used when the underlying list is not a document library.

    The third section contains the Knockout view model (JavaScript). The Item class definition is self-explanatory (it defines only six properties). The rest of the model is created on the server side, so for now there is only the LitItems Literal control there.

    The fourth section is a JavaScript function used when editing web part properties; it opens the template editor in a dialog window.

    Code

    Properties:

    • Properties from web part
      • Icon – url to the icon
      • Title – title of the web part
      • ListUrl – url to the list
      • TitleField – Title field in the list
      • DateField – Date field in the list
      • ImageField – Image field in the list
      • DescriptionField – Description field in the list
      • NoOfItems – number of items to return
      • Position – position of the web part (right, left or central)
      • ItemTemplate – html template of the web part
      • WpId – guid id of the web part
    • UC’s properties
      • PositionClass – css class based on position
      • ColumnMap – dictionary that holds internal names of the list item fields.

    Methods: the file has only one method, Page_Load. The code executes with elevated privileges.

    In that method we:

    1. Resolve the list by the supplied URL (the ListUrl property): SPList annList = annWeb.GetList(ListUrl);
    2. Get the internal names of the list columns by their display names: SpHelper.GetFieldsInternals(annWeb, annList.Title, TitleField, DateField, DescriptionField, ImageField, columnMap);
    3. Create the CAML query: SpHelper.GetGenericQuery(annList, q, NoOfItems);
    4. Execute it.
    5. Iterate over the SPListItemCollection (coll) and create the required JavaScript, as sketched below.
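    Putting those five steps together, a condensed sketch of Page_Load could look like the following. The SpHelper signatures and the string building are approximated from the step list above (ListUrl, TitleField, and the other web part properties are assumed to be copied down onto the user control, and GetFieldValue is the helper described in the next section); treat it as an outline of the flow, not the project’s actual code.

    using System;
    using System.Collections.Generic;
    using System.Text;
    using System.Web.UI;
    using Microsoft.SharePoint;

    public partial class KnockoutWpUserControl : UserControl
    {
        // Assumed to be copied down from the web part properties above.
        public string ListUrl, TitleField, DateField, DescriptionField, ImageField, WpId;
        public int NoOfItems;

        // ColumnMap: display name -> internal name (see Helper class below).
        private readonly Dictionary<string, string> columnMap =
            new Dictionary<string, string>();

        // LitItems is declared in the .ascx markup in the real project.
        protected System.Web.UI.WebControls.Literal LitItems;

        protected void Page_Load(object sender, EventArgs e)
        {
            SPSecurity.RunWithElevatedPrivileges(delegate()
            {
                using (SPSite site = new SPSite(SPContext.Current.Site.ID))
                using (SPWeb annWeb = site.OpenWeb(SPContext.Current.Web.ID))
                {
                    // 1. Resolve the list from the ListUrl web part property.
                    SPList annList = annWeb.GetList(ListUrl);

                    // 2. Map the configured display names to internal names.
                    SpHelper.GetFieldsInternals(annWeb, annList.Title, TitleField,
                        DateField, DescriptionField, ImageField, columnMap);

                    // 3 and 4. Build the CAML query and execute it.
                    SPQuery q = new SPQuery();
                    SpHelper.GetGenericQuery(annList, q, NoOfItems);
                    SPListItemCollection coll = annList.GetItems(q);

                    // 5. Emit one JavaScript "new Item(...)" per list item into
                    //    the LitItems literal (GetFieldValue signature assumed).
                    StringBuilder sb = new StringBuilder();
                    sb.Append("function viewModel" + WpId + "() {");
                    sb.Append("var self = this; self.items = [");
                    foreach (SPListItem item in coll)
                    {
                        sb.AppendFormat("new Item({0},'{1}','{2}','{3}','{4}','{5}'),",
                            item.ID,
                            SpHelper.GetFieldValue(item, columnMap[TitleField]),
                            SpHelper.GetFieldValue(item, columnMap[DateField]),
                            "javascript:OpenDialog(" + item.ID + ");",
                            SpHelper.GetFieldValue(item, columnMap[DescriptionField]),
                            SpHelper.GetFieldValue(item, columnMap[ImageField]));
                    }
                    sb.Append("];}");
                    LitItems.Text = sb.ToString();
                }
            });
        }
    }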
    Helper class

    SPHelper is a helper class; you can find it in the Helpers directory.

    It has three responsibilities:

    1. To retrieve the internal names of list columns based on the supplied display names (the web part properties TitleField, DateField, ImageField, and DescriptionField) – the GetFieldsInternals method (sketched below)
    2. To create the CAML query for retrieving list items – the GetGenericQuery method
    3. To retrieve values from SharePoint columns based on their types – the GetFieldValue method
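    A possible shape for the first of these helpers, assuming it simply fills the supplied dictionary with display-name-to-internal-name pairs (the real implementation lives in the Helpers directory and may differ):

    using System.Collections.Generic;
    using Microsoft.SharePoint;

    internal static class SpHelper
    {
        // Resolve the internal name of each configured column by its display name.
        internal static void GetFieldsInternals(SPWeb web, string listTitle,
            string titleField, string dateField, string descriptionField,
            string imageField, Dictionary<string, string> columnMap)
        {
            SPList list = web.Lists[listTitle];
            foreach (string displayName in
                new[] { titleField, dateField, descriptionField, imageField })
            {
                // Optional fields may be left blank in the web part properties.
                if (string.IsNullOrEmpty(displayName)) continue;
                columnMap[displayName] = list.Fields[displayName].InternalName;
            }
        }
    }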

     

    System Center Virtual Machine Manager (VMM) 2012 as Private Cloud Enabler (3/5): Deployment with Service Template

    By this time, I assume we all have some clarity that virtualization is not cloud; there are indeed many significant differences between the two. A main departure is the approach to deploying applications. In this third article of the 5-part series listed below, I examine the service-based deployment introduced in VMM 2012 for building a private cloud.

    • Part 1. Private Cloud Concepts
    • Part 2. Fabric, Oh, Fabric
    • Part 3. Deployment with Service Template (This article)
    • Part 4. Working with Service Templates
    • Part 5. App Controller

    VMM 2012 can carry out both traditional virtual machine (VM)-centric and emerging service-based deployments. The former is virtualization-focused and operates at a VM level, while the latter is a service-centric approach intended for private cloud deployment.

    This article is intended for those with some experience administering VMM 2008 R2 infrastructure. Notice that in cloud computing, “service” is a critical, must-understand concept, which I have discussed elsewhere. And just to be clear: in the context of cloud computing, a “service” and an “application” mean the same thing, since in the cloud everything is delivered to the user as a service, for example SaaS, PaaS, and IaaS. Throughout this article, I use the terms service and application interchangeably.

    VM-Centric Deployment

    In virtualization, deploying a server conceptually becomes building, shipping, and booting from a virtual hard disk (VHD) file. Those who would like to refresh their knowledge of virtualization are invited to review the 20-Part Webcast Series on Microsoft Virtualization Solutions.

    Virtualization has brought many opportunities for IT to improve processes and operations. With system management software such as System Center Virtual Machine Manager 2008 R2 (VMM 2008 R2), we can deploy VMs and install an OS in a target environment with few or no operator interventions. From an application point of view, however, with or without automation the associated VMs are essentially deployed and configured individually. For instance, a multi-tier web application like the one shown above is typically deployed with a pre-determined number of VMs, followed by installing and configuring the application among the deployed VMs individually, based on application requirements. Particularly when there is a back-end database involved, a system administrator typically must follow a particular sequence to first bring a target database server instance online by configuring specific login accounts with specific db roles, securing specific ports, and registering in AD before proceeding with subsequent deployment steps. These operator interventions are likely required due to the lack of a cost-effective, systematic, and automatic way to streamline and manage the concurrent, event-driven inter-VM dependencies that become relevant at various moments during an application deployment.

    Although a system management infrastructure like VMM 2008 R2 integrated with other System Center members may be in place, at an operational level VMs are largely managed and maintained individually in a VM-centric deployment model. Perhaps more significantly, in a VM-centric deployment it is too often labor-intensive, and carries a relatively high TCO, to deploy a multi-tier application on demand (in other words, as a service), deploy it multiple times, and run multiple releases concurrently in the same IT environment, if it is technically feasible at all. In VMM 2012, deploying services on demand, deploying them multiple times, and running multiple releases concurrently in the same environment become noticeably straightforward and amazingly simple with a service-based deployment model.

    Service-Based Deployment

    A VM-centric model lacks an effective way to address event-driven, inter-VM dependencies during a deployment, nor does it have a concept of fabric, which is an essential abstraction of cloud computing. In VMM 2012, a service-based deployment means all the resources encompassing an application, i.e. the configurations, installations, instances, dependencies, etc., are deployed and managed as one entity on fabric. The integration of fabric in VMM 2012 is a key deliverable, clearly illustrated in the VMM 2012 admin console as shown on the left. And the precondition for deploying services to a private cloud is all about first laying out the private cloud fabric.

    Constructing Fabric

    To deploy a service, the process normally employs administrator and service accounts to carry out the tasks of installing and configuring the infrastructure and application on servers, networking, and storage, based on application requirements. Here, servers collectively act as a compute engine providing a target runtime environment for executing code. Networking interconnects all relevant application resources and peripherals to support all management and communication needs, while storage is where code and data actually reside and are maintained. In VMM 2012, the server, networking, and storage infrastructure components are collectively managed as a single concept: private cloud fabric.

    There are three resource pools/nodes encompassing fabric: Servers, Networking, and Storage. Servers contain various types of servers, including virtualization host groups, PXE, Update (i.e. WSUS), and other servers. Host groups are containers that logically group servers with virtualization hosting capabilities, and they ultimately represent the physical boxes where VMs can be deployed, either with specific network settings or dynamically selected by VMM Intelligent Placement, as applicable, based on defined criteria. VMM 2012 can manage Hyper-V, VMware, and other virtualization solutions. When adding a host into a host group, VMM 2012 installs an agent on the target host, which then becomes a managed resource of the fabric.

    A Library Server is a repository where the resources for deploying services and VMs are available via network shares. As a Library Server is added into fabric, by specifying the network shares defined in the Library Server, file-based resources like VM templates, VHDs, iso images, service templates, scripts, Server App-V packages, etc. become available to be used as building blocks for composing VM and service templates. As various types of servers are brought into the Server pool, the coverage expands and the capabilities increase, as if additional fibers are woven into the fabric.

    Networking provides the wiring among resource repositories, running instances, deployed clouds, and VMs, as well as the intelligence for managing and maintaining the fabric. It essentially forms the nervous system that filters noise, isolates traffic, and establishes interconnectivity among VMs, based on how Logical Networks and Network Sites are put in place.

    Storage reveals the underlying storage complexities and how storage is virtualized. In VMM 2012, a cloud administrator can discover, classify, and provision remote storage on supported storage arrays through the VMM 2012 console. VMM 2012 fully automates the assignment of storage to a Hyper-V host or host cluster, and tracks the storage that it manages.

    Deploying Private Cloud

    A leading feature of VMM 2012 is the ability to deploy a private cloud, or more specifically to deploy a service to a private cloud. The focus of this article is to depict the operational aspects of deploying a private cloud, with the assumption that an intended application has been well tested, signed off, and sealed for deployment, and that the application resources including code, service template, scripts, Server App-V packages, etc. are packaged and provided to a cloud administrator for deployment. In essence, this package has all the intelligence, settings, and contents needed to deploy the application as a service. This self-contained package can then be easily deployed on demand by validating instance-dependent global variables and repeating the deployment tasks on a target cloud. The following illustrates the concept, where a service is deployed in update releases and various editions with specific feature compositions, all running concurrently on VMM 2012 fabric. Not only is this relatively easy to do by streamlining and automating all deployment tasks with a service template, but the service template can also be configured and deployed to different private clouds.

    [Image: a service deployed in multiple releases and editions, running concurrently on VMM 2012 fabric]

    The secret sauce is a service template, which includes all the where, what, how, and when of deploying the resources of an intended application as a service. The skill sets and amount of effort needed to develop a solid service template are not trivial, because a service template needs to include not only intimate knowledge of an application, but also best practices of Windows deployment, system and network administration, Server App-V, and system management of Windows servers and workloads. The following is a sample service template of StockTrader imported into VMM 2012 and viewed with Designer; StockTrader is a sample application for cloud deployment downloaded from Windows Connect.

    [Image: the StockTrader service template viewed in Designer]

    Here are the logical steps I follow to deploy StockTrader with VMM 2012 admin console:

    • Step 1: Acquire the Stock Trader application package from Windows Connect.
    • Step 2: Extract and place the package in a designated network share of a target Library Server of VMM 2012 and refresh the Library share. By default, the refresh cycle of a Library Server is every 60 minutes. To make the newly added resources immediately available, refreshing an intended Library share will validate and re-index the resources in added network shares.
    • Step 3: Import the service templates of Stock Trader and follow the step-by-step guide to remap the application resources.
    • Step 4: Identify/create a target cloud with VMM 2012 admin console.
    • Step 5: Open Designer to validate the VM templates included in the service template. Make sure SQLAdminRAA is correctly defined as RunAs account.
    • Step 6: Configure deployment of the service template and validate global variables in specialization page.
    • Step 7: Deploy Stock Trader to a target cloud and monitor the progress in Job panel.
    • Step 8: Troubleshoot the deployment process as needed, restart the deployment job, and repeat this step as necessary.
    • Step 9: Upon successful deployment of the service, test the service and verify the results.

    A successful deployment of Stock Trader with minimal instances in my all-in-one-laptop demo environment (running on a Lenovo W510 with sufficient RAM) took about 75 to 90 minutes, as reported in the Job Summary shown below.

    [Image: Job Summary of the StockTrader deployment]

    Once the service template is successfully deployed, Stock Trader becomes a service in the target private cloud supported by VMM 2012 fabric. The following two screen captures show a Pro Release of Stock Trader deployed to a private cloud in VMM 2012 and the user experience of accessing a trader’s home page.

    [Image: a Pro Release of Stock Trader deployed to a private cloud in VMM 2012]

    [Image: the user experience of accessing a trader’s home page]

    Not If, But When

    Witnessing the way the IT industry has been progressing, I envision that private cloud will soon become, just like virtualization, a core IT competency and no longer a specialty. While private cloud is still a topic that is being actively debated and shaped, the upcoming release of VMM 2012 presents, just in time, a methodical approach for constructing a private cloud based on service-based deployment with fabric. It is a high-speed train and the next logical step for enterprises to accelerate private cloud adoption.

    Closing Thoughts

    Here I forecast that the future is mostly cloudy with scattered showers. In the long run, I see a clear cloudy day coming.

    Be ambitious and opportunistic is what I encourage everyone to be. When it comes to the Microsoft private cloud, the essentials are Windows Server 2008 R2 SP1 with Hyper-V and VMM 2012. Those who master these skills first will stand out, become the next private cloud subject matter experts, and lead the IT pro communities. Recognizing that private cloud adoption is not a technology issue, but a culture shift and an opportunity for career progression, IT pros must make the first move.

    In an upcoming series of articles tentatively titled “Deploying StockTrader as Service to Private Cloud with VMM 2012,” I will walk through the operations of the above steps and detail the process of deploying a service template to a private cloud.

    Developing a Real Outlook Social Connector

    This section contains a set of four Visual How Tos that show how to develop a real provider for the Microsoft Outlook Social Connector (OSC) by using the OSC Provider Proxy Library.


    An OSC provider allows Outlook users to view, in the People Pane, an aggregation of social information updates that are applied on a professional or social network site. An OSC provider is a Component Object Model (COM) DLL. The OSC provider extensibility interfaces form the medium through which the OSC and an OSC provider communicate. OSC provider extensibility consists of a set of interfaces that is available as an open platform. These interfaces allow the OSC to access social network data in a way that is independent of the APIs of each social network. An OSC provider obtains social network data from the corresponding social network and, through implementing the extensibility interfaces, feeds that social network data to the OSC.

    The OSC Provider Proxy Library simplifies the implementation of the OSC provider extensibility interfaces. Instead of a provider explicitly implementing these interfaces, the proxy library implements them and routes the calls to a set of abstract and virtual methods in the proxy library.

    A provider, in turn, overrides this set of abstract and virtual methods with the business logic specific to the social network, to return social network data that the OSC requires.

    To show how a provider can use the OSC Provider Proxy Library, this set of Visual How Tos describes a real provider for OfficeTalk. OfficeTalk is a social network in a private corporate environment and is not publicly available.

    Nonetheless, it is a good example of the kind of social network that you might want to develop a custom OSC provider for. You can use the procedures for creating the OSC provider for OfficeTalk to create a custom OSC provider for any social network.

    Developing a Real Outlook Social Connector Provider by Using a Proxy Library

     

    Overview

    The Microsoft Outlook Social Connector (OSC) provides a communication hub for personal and professional communications. Just by selecting an Outlook item such as an email or meeting request and clicking the sender or a recipient of that item, users can see, in the People Pane, activities, photos, and status updates for the person on their favorite social networks.

    The OSC obtains social network data by calling an OSC provider, which behaves like a translation layer between Outlook and the social network. The OSC provider model is open, and you can develop a custom OSC provider by implementing the required OSC provider extensibility interfaces. To retrieve social network data, the OSC makes calls to the OSC provider through these interface members. The OSC provider communicates with the social network and returns the social network data to the OSC as a string or as XML that conforms to the Outlook Social Connector XML schema. Figure 1 shows the various components of the sample OfficeTalk OSC provider reviewed in this Visual How To.

    Figure 1. Relationships of the sample OfficeTalk OSC provider with related components


    This Visual How To shows the procedures to create a custom OSC provider for OfficeTalk. OfficeTalk is not publicly available and is being used as an example of the kind of social network you might want to develop a custom OSC provider for. You can use the procedures for creating the OSC provider for OfficeTalk to create a custom OSC provider for any social network.

    The OfficeTalk provider uses the Outlook Social Connector Provider Proxy Library to simplify the implementation of the OSC provider extensibility interfaces. The OSC Provider Proxy Library implements all of the OSC provider extensibility interface members. These interface members, in turn, call a consolidated set of abstract and virtual methods that provide the social network data that the OSC requires. To create a custom OSC provider that uses the OSC Provider Proxy Library, a developer overrides these abstract and virtual methods with the business logic to communicate with the social network.

    Code It

    The sample solution for this article includes all of the code for a custom OSC provider for OfficeTalk. However, this Visual How To does not show all of the code in the sample solution. Instead, it focuses on creating a custom OSC provider by using the OSC Provider Proxy Library.

    The sample solution contains two projects:

    • OSCProvider—This project is an unmodified version of the OSC Provider Proxy Library that is used to simplify the creation of the OfficeTalk OSC provider.
    • OfficeTalkOSCProvider—This project includes the source code files that are specific to the OfficeTalk OSC provider.

    The OfficeTalkOSCProvider project includes the following source code files:

    • OfficeTalkHelper—This class contains helper methods that are used throughout the sample solution.
    • OTProvider—This is a partial class that contains the OSC Provider Proxy Library override methods that return information about the OSC provider, information about the social network, and information for the current user.
    • OTProvider_Activities—This is a partial class that contains the OSC Provider Proxy Library override methods that return activity information.
    • OTProvider_Friends—This is a partial class that contains the OSC Provider Proxy Library override methods that return friends information.

    Creating the OfficeTalk OSC Provider Solution

    The following sections show the procedures to create the OfficeTalk OSC provider sample solution, and add OSC Provider Proxy Library override methods to return information about the OSC provider, the social network, and the current user.

    You must create the OSC provider as a class library. For this Visual How To, the solution was created with a name of OfficeTalkOSCProvider.

    Adding the OSC Provider Proxy Library Project

    You must download the Outlook Social Connector Provider Proxy Library from MSDN Code Gallery, and then extract it to the local computer.

    To add the OSC Provider Proxy Library to the OfficeTalkOSCProvider solution

    1. Copy the OSCProvider project to the OfficeTalkOSCProvider directory.
    2. On the File menu in Visual Studio 2010, point to Add, and then click Existing Project.
    3. Select the OSCProvider.csproj project that you copied in step 1.

    Adding References

    Add the following references to the OfficeTalkOSCProvider:

    • Outlook Social Provider COM component. The name in the COM tab is Microsoft Outlook Social Provider Extensibility. If there are multiple versions, select TypeLib Version 1.1.
    • System.Drawing

    Adding Social Network Specific References and Files

    Add other appropriate references and files for the social network. The sample solution does not include the OfficeTalk API assembly. To support the social network for which you are developing an OSC provider, replace the OfficeTalk API references and files with the references and files that are specific to your social network.

    The sample solution for OfficeTalk contains the following references and files:

    • The OfficeTalk API assembly.
    • The OfficeTalk icon file.

    Creating a Subclass of the OSC Provider Proxy Library OSCProvider

    Use the OSC Provider Proxy Library to create a subclass of the OSCProvider class, OTProvider, which represents the sample OSC provider. Add a class named OTProvider to the OfficeTalkOSCProvider project. OTProvider is defined as a partial class so that logic for OSC provider core methods, friends, and activities can be defined in separate source code files.

    Replace the class definition with the code in the following section. The code example starts with the using statements for the OSC Provider Proxy Library and OfficeTalk API. The OTProvider partial class then inherits from the OSCProvider class. Note that the OTProvider class has the ComVisible attribute so that the Outlook Social Connector can call it.

    using System;
    using System.Globalization;
    using System.Collections.Generic;
    using System.IO;
    using System.Reflection;
    using System.Drawing;
    using System.Drawing.Imaging;
    
    // Using statements for the OSC Provider Proxy Library.
    using OSCProvider;
    using OSCProvider.Schema;
    
    // Using statements for the social network.
    using OfficeTalkAPI;
    
    namespace OfficeTalkOSCProvider
    {
        // SubClass of the OSC Provider Proxy Library OSCProvider
        // used to create a custom OSC provider.
        [System.Runtime.InteropServices.ComVisible(true)]
        public partial class OTProvider : OSCProvider.OSCProvider
        {
        ...
    
    

    After the OTProvider class is defined, add the following code for constants used throughout the OfficeTalkOSCProvider solution.

    // Constants for the OfficeTalk OSC provider.
    internal static string NETWORK_NAME = @"OfficeTalk";
    internal static string NETWORK_GUID = @"YourNetworkGuid";
    internal static string API_VERSION = @"YourApiVersion";
    internal static string API_URL = @"YourApiUrl";
    internal static OSCProvider.ProviderSchemaVersion SCHEMA_VERSION =
        ProviderSchemaVersion.v1_1;
    
    

    Allowing for Debugging

    To debug the OfficeTalkOSCProvider, you must modify the OfficeTalkOSCProvider project to start using Outlook and register the OfficeTalkOSCProvider as an Outlook Social Connector.

    To set up the OfficeTalkOSCProvider project for debugging

    1. Right-click the OfficeTalkOSCProvider project, and then click Properties.
    2. Select the Debug tab.
    3. Under Start Action, select Start External Program.
    4. Specify the full path to the version of Outlook that is installed on your computer. The default path for 32-bit Outlook on 32-bit Windows is C:\Program Files\Microsoft Office\Office14\OUTLOOK.EXE.

    The Outlook Social Connector will not call the OfficeTalkOSCProvider until it is registered as an OSC provider. The sample solution includes a file named RegisterProvider.reg that updates the registry with the entries that are required to register the OfficeTalkOSCProvider as an OSC provider. You can update the registry by opening the RegisterProvider.reg file in Windows Explorer.

    The RegisterProvider.reg file assumes that the sample solution is located in the C:\temp directory. If the sample solution is located in a different directory, update the CodeBase entry in the RegisterProvider.reg file to point to the correct location.

    Adding Helper Methods

    The OfficeTalkHelper class contains helper methods, including the GetOfficeTalkClient and ConvertUserToPerson methods, that are used throughout the sample solution.

    The following GetOfficeTalkClient method returns an OfficeTalkClient object that is used to communicate with OfficeTalk. If the OfficeTalkClient has not been initialized, GetOfficeTalkClient creates and configures a new OfficeTalkClient by using the API_URL and API_VERSION constants that are defined in OTProvider.

    // Returns a reference to the OfficeTalk client.
    private static OfficeTalkClient officeTalkClient = null;
    internal static OfficeTalkClient GetOfficeTalkClient()
    {
        if (officeTalkClient == null)
        {
            officeTalkClient =
              new OfficeTalkClient(OTProvider.API_URL);
            OfficeTalkClient.UserAgent =
              @"OfficeTalkOSC/" + OTProvider.API_VERSION;
        }
        return officeTalkClient;
    }
    
    

The ConvertUserToPerson method converts an OfficeTalk User object to an OSC Provider Proxy Library Person object. It creates a new Person and then maps the User properties to the related Person properties.

    // Converts an Office Talk User to an OSC Provider Proxy Library Person.
    internal static Person ConvertUserToPerson(OfficeTalkAPI.OTUser user)
    {
        // Create the OSC Provider Proxy Library Person.
        Person person = new Person();
    
        // Map the User properties to the Person properties.
        person.FullName = user.name;
        person.Email = user.email;
        person.Company = user.department;
        person.UserID = user.id.ToString(CultureInfo.InvariantCulture);
        person.Title = user.title;
        person.CreationTime = user.created_atAsDateTime;
    
        // FriendStatus is based on whether the user is being followed 
        // by the currently logged-on user.
        person.FriendStatus = 
            user.following ? FriendStatus.friend : FriendStatus.notfriend;
    
        // Set the PictureUrl if a profile picture is loaded in OfficeTalk.
        if (user.image_url != null)
        {
            person.PictureUrl = new Uri(OTProvider.API_URL + user.image_url);
        }
    
        // WebProfilePage is set to the user's home page in OfficeTalk.
        person.WebProfilePage = 
            OTProvider.API_URL + @"/Home/index/" + user.alias + "#User";
    
        return person;
    }
    
    

    Overriding the GetProviderData Method

    The OSC ISocialProvider interface contains members that return information about the OSC provider. This includes the capabilities of the social network, how to communicate with the social network, and general information about the social network. The OSC Provider Proxy Library provides the GetProviderData abstract method, which you can override to return OSC provider information. The GetProviderData abstract method returns the OSC Provider Proxy Library ProviderData object, which encapsulates the provider information.

    The following section of the GetProviderData override method initializes a ProviderData object and sets the properties for the OfficeTalk provider.

    // The ProviderData contains information about the social network and is 
    // used by the OSC ISocialProvider members to return information.
    ProviderData providerData = new ProviderData();
    
    // Friendly name of the social network to display in Outlook.
    providerData.NetworkName = NETWORK_NAME;
    
    // GUID that represents the social network.
    // This GUID should not change between versions.
    providerData.NetworkGuid = new Guid(NETWORK_GUID);
    
    // Version of the social network provider.
    providerData.Version = API_VERSION;
    
    // Array of URLs that the social network provider uses.
    // The default URL should be the first item in the array.
    providerData.Urls = new string[] { API_URL };
    
    // The icon of the social network to display in Outlook.
    Byte[] icon = null;
    Assembly assembly = Assembly.GetExecutingAssembly();
    using (Stream imageStream =
        assembly.GetManifestResourceStream("OfficeTalkOSCProvider.OTIcon16.bmp"))
    {
        using (MemoryStream memoryStream = new MemoryStream())
        {
            using (Image socialNetworkIcon = Image.FromStream(imageStream))
            {
                socialNetworkIcon.Save(memoryStream, ImageFormat.Bmp);
                icon = memoryStream.ToArray();
            }
        }
    }
    providerData.Icon = icon;
    
    

The following section of the GetProviderData override method uses the OSC Provider Proxy Library Capabilities class to identify the capabilities and requirements for the OfficeTalk OSC provider. The Capabilities class defines capabilities by setting the CapabilityFlags property. The CapabilityFlags property uses a bitmask and is set by using the bitwise OR operator to combine constants that the OSC Provider Proxy Library has defined for each capability.

    // Define the capabilities for the provider.
    // The Capabilities object will generate the appropriate XML string.
    Capabilities capabilities = new Capabilities(SCHEMA_VERSION);
    capabilities.CapabilityFlags =
        // OSC should call the GetAutoConfiguredSession method to get a 
        // configured session for the user.
        Capabilities.CAP_SUPPORTSAUTOCONFIGURE |
    
        // OSC should hide all links in the Account configuration dialog box.
        Capabilities.CAP_HIDEHYPERLINKS |
        Capabilities.CAP_HIDEREMEMBERMYPASSWORD |
    
        // The following activity settings identify that Activities uses
        // hybrid synchronization.
        // OSC will store activities for friends in a hidden folder and 
        // activities for non-friends in memory.
        Capabilities.CAP_GETACTIVITIES |
        Capabilities.CAP_DYNAMICACTIVITIESLOOKUP |
        Capabilities.CAP_DYNAMICACTIVITIESLOOKUPEX |
        Capabilities.CAP_CACHEACTIVITIES |
    
        // The following Friends settings identify that friend information
        // uses hybrid synchronization.
        // OSC will call the GetPeopleDetails method every time the People Pane 
        // is refreshed to ensure the latest user information is displayed.
        Capabilities.CAP_GETFRIENDS |
        Capabilities.CAP_DYNAMICCONTACTSLOOKUP |
        Capabilities.CAP_CACHEFRIENDS |
    
    // The following Friends settings identify that OfficeTalk supports
    // the FollowPerson and UnFollowPerson calls.
        Capabilities.CAP_DONOTFOLLOWPERSON |
        Capabilities.CAP_FOLLOWPERSON;
    
    // Set the email HashFunction.
    // Setting the EmailHashFunction is required if CAP_DYNAMICCONTACTSLOOKUP
    // or CAP_DYNAMICACTIVITIESLOOKUPEX are set.
    capabilities.EmailHashFunction = HashFunction.SHA1;
    
    // Set the capabilities property on the providerData object.
    providerData.ProviderCapabilities = capabilities;
    
    

    The capabilities and requirements defined in the preceding code example are specific to OfficeTalk. A custom OSC provider that is developed for a different social network must define a set of capabilities and requirements that are specific to that social network.

    The following list shows the CapabilityFlag constants that are available in the OSC Provider Proxy Library Capabilities class.

    CAP_SUPPORTSAUTOCONFIGURE
    The provider supports calling the ISocialProvider.GetAutoConfiguredSession method to attempt automatic configuration of the network for the user.
    CAP_GETFRIENDS
    The provider supports the ISocialPerson.GetFriendsAndColleagues or ISocialSession2.GetPeopleDetails method. The OSC uses the CAP_CACHEFRIENDS and CAP_DYNAMICCONTACTSLOOKUP settings to determine whether friends are stored as Outlook contact items or are stored in memory.
    CAP_CACHEFRIENDS
    The provider supports storing friends as Outlook contact items in a social-network-specific contacts folder.
    CAP_DYNAMICCONTACTSLOOKUP
    The provider supports the ISocialSession2.GetPeopleDetails method for on-demand synchronization of friends and non-friends. If CAP_DYNAMICCONTACTSLOOKUP is set, the OSC calls the ISocialSession2.GetPeopleDetails method every time the People Pane is refreshed.
    CAP_SHOWONDEMANDCONTACTSWHENMINIMIZED
    Indicates that the OSC should carry out on-demand synchronization for friends and non-friends when the People Pane is minimized.
    CAP_FOLLOWPERSON
    The provider supports the ISocialSession.FollowPerson method for adding the person as a friend on the social network.
    CAP_DONOTFOLLOWPERSON
    The provider supports the ISocialSession.UnFollowPerson method for removing the person as a friend on the social network.
    CAP_GETACTIVITIES
    The provider supports the ISocialPerson.GetActivities or ISocialSession2.GetActivitiesEx method. The OSC uses the CAP_CACHEACTIVITIES and CAP_DYNAMICACTIVITIESLOOKUPEX settings to determine whether activities are stored as Outlook RSS items or are stored in memory.
    CAP_CACHEACTIVITIES
    The provider supports storing activities as Outlook RSS items in a hidden News Feed folder. To support cached synchronization of activities CAP_CACHEACTIVITIES should be set and CAP_DYNAMICACTIVITIESLOOKUPEX should not be set. With cached synchronization of activities, the OSC stores all activities as Outlook RSS items in a hidden News Feed folder. To support hybrid synchronization of activities, both CAP_CACHEACTIVITIES and CAP_DYNAMICACTIVITIESLOOKUPEX should be set. With hybrid synchronization of activities, the OSC stores activities for friends as Outlook RSS items in a hidden News Feed folder and caches activities for non-friends in memory. To support on-demand synchronization of activities, CAP_CACHEACTIVITIES should not be set and CAP_DYNAMICACTIVITIESLOOKUPEX should be set. With on-demand synchronization of activities, the OSC caches all activities in memory.
    CAP_DYNAMICACTIVITIESLOOKUP
    Deprecated in OSC 1.1. Use the CAP_DYNAMICACTIVITIESLOOKUPEX setting instead.
    CAP_DYNAMICACTIVITIESLOOKUPEX
    The provider supports the ISocialSession2.GetActivitiesEx method for on-demand or hybrid synchronization of activities. To support on-demand synchronization of activities, CAP_DYNAMICACTIVITIESLOOKUPEX should be set and CAP_CACHEACTIVITIES should not be set. With on-demand synchronization of activities, the OSC calls ISocialSession2.GetActivitiesEx every time the People Pane is refreshed. To support hybrid synchronization of activities, both CAP_DYNAMICACTIVITIESLOOKUPEX and CAP_CACHEACTIVITIES should be set. With hybrid synchronization of activities, the OSC calls ISocialSession2.GetActivitiesEx every 30 minutes to refresh activities information. When CAP_DYNAMICACTIVITIESLOOKUPEX is not set, the OSC does not call ISocialSession2.GetActivitiesEx.
    CAP_SHOWONDEMANDACTIVITIESWHENMINIMIZED
    Indicates that the OSC should carry out on-demand synchronization for activities when the People Pane is minimized.
    CAP_DISPLAYURL
    Indicates that the OSC should display the network URL in the account configuration dialog box.
    CAP_HIDEHYPERLINKS
    Indicates that the OSC should hide the “Click here to create an account” and the “Forgot your password?” hyperlinks in the account configuration dialog box.
    CAP_HIDEREMEMBERMYPASSWORD
    Indicates that the OSC should hide the Remember my password check box in the account configuration dialog box.
    CAP_USELOGONWEBAUTH
    Indicates that the OSC should use forms-based authentication. When CAP_USELOGONWEBAUTH is set, the OSC uses forms-based authentication and calls the ISocialSession.LogonWeb method. When CAP_USELOGONWEBAUTH is not set, the OSC uses basic authentication and calls the ISocialSession.Logon method.
    CAP_USELOGONCACHED
    The provider supports the ISocialSession2.LogonCached method to log on with cached credentials. When CAP_USELOGONCACHED is set, the OSC ignores the CAP_USELOGONWEBAUTH setting and calls ISocialSession2.LogonCached for authentication.

    Overriding the GetMe Method

    Many of the OSC interface members and OSC Provider Proxy Library override methods require information about the current user. The OSC Provider Proxy Library provides the GetMe abstract method, which you can override to return information about the current user from the social network. The GetMe abstract method returns a Person object, which contains all social network data for the current user.

    The GetMe override method shown in the following example gets an OfficeTalkClient object to communicate with OfficeTalk. The GetMe override method then calls the OfficeTalk GetUser method by using the user name that is used to log on to Windows. After obtaining the OfficeTalk User, the GetMe override method calls the OfficeTalkHelper ConvertUserToPerson method to convert the OfficeTalk User to a Person that can be used within the OSC Provider Proxy Library.

After the conversion is complete, the GetMe override method sets the Person.UserName property, which supports the ISocialSession.LoggedOnUserName interface member. The UserName property is set only on the Person object that the GetMe method returns, because GetMe is the method that returns information about the current user.

    // OSC Proxy Library override method used to return information 
    // for the current user.
    public override Person GetMe()
    {
        // Get a reference to the OfficeTalk client.
        OfficeTalkClient officeTalkClient =
            OfficeTalkHelper.GetOfficeTalkClient();
    
        // Look up the user based on credentials used to log on to Windows.
        OTUser user =
            officeTalkClient.GetUser(System.Environment.UserName, Format.JSON);
    
        // Convert the OfficeTalk User to an OSC Provider Proxy Person.
        Person p = OfficeTalkHelper.ConvertUserToPerson(user);
    
        // Set the UserName property.
        // This is used only by the Person that the GetMe method returns to
        // support the OSC ISocialSession.LoggedOnUserName property.
        p.UserName = System.Environment.UserName;
    
        return p;
    }
    
    

    Overriding OSC Provider Proxy Library Friends Methods

    A custom OSC provider that uses the OSC Provider Proxy Library must override the abstract and virtual methods for returning friends social network data. In the sample solution, the overrides for these OTProvider methods are located in the OTProvider_Friends source file.

    The abstract and virtual methods for friends are as follows:

    • GetPeopleDetails—Returns detailed user information for the email addresses that are passed into the method.
    • GetFriends—Returns a list of friends for the current user.
    • FollowPersonEx—Adds the person who is identified by the email address as a friend on the social network.
    • UnFollowPerson—Removes the person who is identified by the user ID as a friend on the social network.

    Reviewing these methods is outside of the scope of this Visual How To. For more information about returning friends social network data, see Part 2: Getting Friends Information by Using the Proxy Library for Outlook Social Connector Provider Extensibility.
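Although the full implementations are covered in Part 2, these overrides follow the same pattern as the GetMe override shown earlier. The following is a minimal sketch only: the GetFriends signature is assumed from the proxy library description above, and GetFollowedUsers is a hypothetical OfficeTalk API call, not the actual sample code.

// Inside the OTProvider partial class (OTProvider_Friends source file).
// Sketch only: GetFollowedUsers is a hypothetical OfficeTalk API method.
public override List<Person> GetFriends()
{
    // Get a reference to the OfficeTalk client.
    OfficeTalkClient client = OfficeTalkHelper.GetOfficeTalkClient();

    // Hypothetical call: fetch the users that the current user follows.
    List<OTUser> followed =
        client.GetFollowedUsers(System.Environment.UserName, Format.JSON);

    // Reuse the existing helper to map each OfficeTalk User to a Person.
    List<Person> friends = new List<Person>();
    foreach (OTUser user in followed)
    {
        friends.Add(OfficeTalkHelper.ConvertUserToPerson(user));
    }
    return friends;
}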

    Overriding OSC Provider Proxy Library Activity Methods

    A custom OSC provider that uses the OSC Provider Proxy Library must override the abstract and virtual methods for returning activity social network data. In the sample solution, the overrides for these OTProvider methods are located in the OTProvider_Activities source file.

    There is only one method to override for activities:

    • GetActivities—Returns activities for all users who are identified by the email addresses that are passed into the method.

Covering this method in detail is outside of the scope of this Visual How To. For more information about returning activities social network data, see Part 3: Getting Activities Information by Using the Proxy Library for Outlook Social Connector Provider Extensibility.

    Read It

    Creating a custom Outlook Social Connector (OSC) provider for a social network is a straightforward process of implementing the OSC Provider extensibility interfaces to return social network data.

    The OSC Provider Proxy Library simplifies this process by removing the requirement to implement each individual interface member. Instead the OSC Provider Proxy Library defines a consolidated set of abstract and virtual methods to provide social network data. The developer of the OSC provider can focus on overriding these methods with the business logic required to interface with the social network API.

    The sample solution for this article includes all of the code required for a custom OSC provider for OfficeTalk. This Visual How To does not cover all of the code in the sample solution. This Visual How To focuses on creating a custom OSC provider solution, and returning information about the OSC provider, the social network capabilities, and the current user. The social network data that the OfficeTalk provider returns is shown in Figure 2.

    Figure 2. OSC showing OfficeTalk social network data in the People Pane


    For more information about returning friends social network data, see Part 2: Getting Friends Information by Using the Proxy Library for Outlook Social Connector Provider Extensibility.

    For more information about returning activities social network data, see Part 3: Getting Activities Information by Using the Proxy Library for Outlook Social Connector Provider Extensibility.

    System Center Virtual Machine Manager (VMM) 2012 as Private Cloud Enabler (2/5): Fabric, Oh, Fabric

Aside from public cloud, private cloud, and something in between, the essence of cloud computing is fabric. This second article of the 5-part series annotates the concept and methodology of forming a private cloud fabric with VMM 2012. Notice that throughout this article, I use the following pairs of terms interchangeably:

    • Application and service
    • User and consumer

    And this series includes:

    • Part 1. Private Cloud Concepts
    • Part 2. Fabric, Oh, Fabric (This article)
    • Part 3. Deployment with Service Template
    • Part 4. Working with Service Templates
    • Part 5. App Controller 

Fabric in Windows Azure Platform: A Simplistic, Yet Remarkable View of Cloud

In cloud computing, fabric is a frequently used term. It is nevertheless not a product, nor a packaged solution that we can simply unwrap and deploy. Fabric is an abstraction, an architectural concept, and a state of manageability that denotes the ability to discover, identify, and manage the lifecycle of instances and resources of a service. In an oversimplified analogy, fabric is the collection of hardware, software, wiring, configurations, profiles, instances, diagnostics, connectivity, and everything else that together form the datacenter(s) where a cloud is running. Fabric Controller (FC, a term coined by Windows Azure Platform) is likewise an abstraction, signifying the ability and designating the authority to manage the fabric in a datacenter and all instances and associated resources supported by the fabric. As far as a service is concerned, FC is the quintessential owner of fabric, datacenters, and the world, so to speak.

Hence, there is no need to explain the underlying physical and logical complexities in a datacenter: how hardware is identified and allocated, how a virtual machine (VM) is deployed to and remotely booted from bare metal, how application code is loaded and initialized, how a service is started and reports its status, how required storage is acquired and allocated, and on and on. We can instead summarize the 3,500-step process of bringing up a service instance in Windows Azure Platform by simply saying that FC deploys a service instance with fabric.

Fundamentally, what a PaaS user expects is that a subscribed runtime (or “platform,” as preferred) environment is in place so cloud applications can be developed and run. For an IaaS user, it is the ability to provision and deploy VMs on demand. How a service provider, which in a private cloud setting normally means corporate IT, makes PaaS and IaaS available is not a concern for either user. This is significantly helpful, because it allows a consumer to focus on what one really cares about: a predictable runtime to develop applications against, or the ability to provision infrastructure as needed. In other words, what happens under the hood of cloud computing is collectively abstracted and gracefully presented to users as “fabric.” This simplicity brings much clarity and elegance by shielding extraordinary, if not chaotic, technical complexities from users. The stunning beauty unveiled by this abstraction is just breathtaking.

    Fabric Concept and VMM 2012

Similar to what is in Windows Azure Platform, fabric in VMM 2012 is an abstraction that hides the underlying complexities from users and signifies the ability to define and manage resource pools as a whole. This concept is presented explicitly in the UI of the VMM 2012 admin console. There should be no mystery at all about what the fabric of a private cloud is in VMM 2012, and a major task in the process of building a private cloud is to define and configure this fabric using the VMM 2012 admin console. Specifically, there are three definable resource pools:

    • Servers (i.e. Compute)
    • Networking
    • Storage

Clearly, the magnitude and complexity of the fabric in Windows Azure Platform in the public cloud and the fabric in VMM 2012 in a private cloud are not on the same scale. There are also other implementation details, such as replicating the FC throughout a geo-dispersed fabric, that are not covered here and that complicate the FC in Windows Azure Platform even more. The idea of abstracting away the details that are not relevant to what a user is trying to accomplish is nevertheless very much the same in both technologies. In a sense, VMM 2012 is an FC (in a simplistic form) of the defined fabric consisting of the Servers, Networking, and Storage pools. And in these pools, functional components and logical constructs collectively constitute the fabric of a private cloud.

    Servers Pool

This pool embodies the containers hosting the runtime execution resources of a service. Host groups contain virtualization hosts as the destinations where virtual machines can be deployed, based on authorization and service configurations. Library servers are the repositories of building blocks, such as images, ISO files, and templates, for composing VMs. To automatically deploy images and boot a VM from bare metal remotely via networks, pre-boot execution environment (PXE) servers are used to initiate the operating system installation on a physical computer. Update servers such as WSUS are for servicing VMs automatically, based on compliance policies. For interoperability, the VMM 2012 admin console can add VMware vCenter Servers to enable the management of VMware ESX hosts. And of course, the console has visibility into all authorized VMM servers, which form the backbone of the Microsoft virtualization management solution.

    Networking Pool

In VMM 2012, the Networking pool is where you define logical networks, assign pools of static IPs and MAC addresses, integrate load balancers, and so on, to mash up the fabric. Logical networks are user-defined groupings of IP subnets and VLANs that organize and simplify network assignments. For instance, HIGH, MEDIUM, and LOW can be the definitions of three logical networks, such that real-time applications are connected with HIGH and batch processes with LOW, based on a specified class of service. Logical networks provide an abstraction of the underlying physical infrastructure and enable an administrator to provision and isolate network traffic based on selected criteria, such as connectivity properties and service-level agreements (SLAs). By default, when adding a Hyper-V host to a VMM 2012 server, VMM 2012 automatically creates logical networks that match the first DNS suffix label of the connection-specific DNS suffix on each host network adapter.

In VMM 2012, you can configure static IP address pools and static MAC address pools. This functionality enables you to easily allocate the addresses for Windows-based virtual machines that are running on any managed Hyper-V, VMware ESX, or Citrix XenServer host, and it gives much room for creativity in managing network addresses. VMM 2012 also supports adding hardware load balancers to the VMM console and creating the associated virtual IP (VIP) templates, which contain load-balancer-related configuration settings for a specific type of network traffic. Readers with networking or load-balancing interests are highly encouraged to experiment with and assess the networking features of VMM 2012.

    Storage Pool

With the VMM 2012 admin console, an administrator can discover, classify, and provision remote storage on supported storage arrays. VMM 2012 uses the new Microsoft Storage Management Service (installed by default during the installation of VMM 2012) to communicate with external arrays. An administrator must install a supported Storage Management Initiative – Specification (SMI-S) provider on an available server and then add the provider to VMM 2012. SMI-S is a storage standard for interoperating among heterogeneous storage systems. VMM 2012 automates the assignment of storage to a Hyper-V host or Hyper-V host cluster and tracks the storage that it manages. Notice that storage automation through VMM 2012 is supported only for Hyper-V hosts.

Where There Is A Private Cloud, There Are IT Pros

Aside from public cloud, private cloud, and something in between, the essence of cloud computing is fabric. And when it comes to a private cloud, it is largely about constructing and configuring fabric. VMM 2012 lays out what fabric means for a private cloud and offers prescriptive guidance on how to build it by populating the Servers, Networking, and Storage resource pools. I hope it is clear at this point that, particularly for a private cloud, forming fabric is not a programming commission, but one relying much on the experience and expertise of IT pros in building, operating, and maintaining an enterprise infrastructure. It’s about integrating the IT tasks of building images, deploying VMs, automating processes, managing certificates, hardening security, configuring networks, setting IPsec, isolating traffic, walking through traces, tuning performance, subscribing to events, shipping logs, restoring tables, etc., etc., etc. with the three resource pools. And yes, it’s about what IT professionals do every day to keep the system running. And that brings us to one conclusion.

Private cloud is the future of IT pros. And let the truth be told: “Where there is a private cloud, there are IT pros.”

    – See more at: http://blogs.technet.com/b/yungchou/archive/2011/08/29/system-center-virtual-machine-manager-vmm-2012-as-private-cloud-enabler-2-5-fabric-oh-fabric.aspx#sthash.xG3tXINR.dpuf

    New Web Part released – List Search Web Part now available!!

    The List Search Web Part reads the entries from a Sharepoint List or Library (located anywhere in the site collection) and displays the selected user fields in a grid with an optional interactive search filter.

It can be used for WSS 3.0, MOSS 2007, SharePoint 2010 and SharePoint 2013.


    The following parameters can be configured:

    • Sharepoint Site
    • List Columns to be displayed
    • Filtering, Grouping, Searching, Paging and Sorting of rows
    • AZ Index
    • optional Header text

    Installation Instructions:

1. Download the List Search Web Part Installation Instructions.
2. Either install the web part manually or deploy the feature to your server/farm as described in the instructions.
3. Security Note:
  If you get the following error message: “Only an administrator may enumerate through all user profiles”, you will need to grant the application pool account(s) for the web application(s) “Manage User Profiles” permissions within the User Profile Service (SSP in the case of MOSS 2007).
  This ensures that the application pool is able to retrieve the list of user profiles.
  To assign this permission, access your active “User Profile Service” (SP 2010 Server) or the “Shared Services Provider” (MOSS 2007) via Central Admin.
  From the “User Profiles and My Sites” group, click “Personalization services permissions”.
  Add the “Manage User Profiles” permission to your application pool account(s).
4. Configure the following Web Part properties in the Web Part Editor “Miscellaneous” pane section as needed:
  • Site Name: Enter the name of the site that contains the List or Library:
    – leave this field empty if the List is in the current site (i.e. the Web Part is placed in the same site)
    – enter a “/” character if the List is contained in the top site
    – enter a path if the List is in a subsite of the current site (e.g. in the form of “current site/subsite”)
  • List Name: Enter the name of the desired SharePoint List or Library.
    Example: Project Documents
  • View Name: Optionally enter the desired List View of the list specified above. A List View allows you to specify data filtering and sorting.
    Leave this field empty if you want to use the List default view.
  • Field Template: Enter the List columns to be displayed (separated by semicolons).
    Pictures can be attached (via File Upload) to the SharePoint List items and displayed using the symbolic “Picture” column name.
    If you want to allow users to edit their own entries, please add the symbolic “Username” column name to the Field Template. An “Edit” symbol will then be displayed to allow the user to navigate to the corresponding Edit Form.
    Example:
    Type;Name;Title;Modified;Modified By;Created By

    Friendly Header Names:
    If you would like to display a “friendly header name” instead of the default column name, please append it to the column name, separated by the “|” pipe symbol.

    Example:
    Picture;LastName|Last Name;FirstName;Department;Email|Email Address

        Hiding individual columns:
        You can hide a column by prefixing it with a “!” character. 
        The following example hides the “Department” column: 
        LastName;FirstName;!Department;WorkEmail

        Suppress Column wrapping:
        You can suppress the wrapping of text inside a column by prefixing it with a “^” character.
        LastName;FirstName;Department;^AboutMe

        Showing the E-Mail address as plain text:
        You can opt to display the plain e-mail address (instead of the envelope icon) by appending “/plain” to the WorkEmail column:
        LastName;WorkEmail/plain;Department

  • Group By: enter an optional column name to group the rows.
  • Sort By: enter the List column(s) to define the default sort order. You can add multiple columns separated by commas. Append “/desc” to sort a column in descending order.
    Examples:
    Department
    Department,LastName
    LastName/desc

    The column headings can be clicked by the users to manually define the sort order.
      • AZ Index Column: enter an optional List column to display the AZ filter in the list header. 
        If an “!” character is appended to the property name, the “A” index will be forced when visiting the page.
        Example: LastName! 

         
  • Search Box: enter one or more List columns (separated by semicolons) to allow for interactive searching.
    Example: LastName;FirstName

        If you want to display a search filter as a dropdown combo, please enter it with a leading “@” character:
        LastName;FirstName;Department;@Office

        Friendly Search Box Labels:
    If you would like to display a “friendly label” instead of the default column name, please append it to the column name, separated by the “|” pipe symbol.
        Example:
        WorkPhone|Office Phone;Office|Office Nbr

         

  • Align Search Filters vertically: allows you to align the search input boxes vertically to save horizontal space.
  • Rows per page: the List Search Web Part supports paging and lets you specify the desired number of rows per page.
      • Image Height: specify the image height in pixels if you include the “Picture” property. 
        Enter “0” if you want to use the default picture size.
  • Header Text: enter an optional header text. Please note that you can embed HTML tags if needed. You can additionally specify the text to be displayed if the “Show all entries” option is unchecked and the user has not yet performed a search, by appending a “|” character followed by that text.
        Example:
        This is the regular header text|This text is only shown if the user has not yet performed a search
      • Detail View Page: enter an optional column name prefixed by “detailview=” to link a column to the item detail view page. Append the “/popup” option if you want to open the detail page in a Sharepoint 2010/2013 dialog popup window.
        Examples:
        detailview=LastName
        detailview/popup=Title
      • Alternating Row Color: enter the optional color of the alternating row background (leave blank to use default).
    Enter either an HTML color name (e.g. “red”) or a hexadecimal RRGGBB value (e.g. “#CCFFCC”). Enter the values without the double quotes.
        You can also change the default background color of the non-alternating rows by appending a second color value separated by a semicolon.
        Example: #ffffcc;#ffff99 

        The default Header style can be changed by adding the “AESD_Headerstyle” appSettings variable to the web.config “appSettings” section:

    <appSettings>
        <add key="AESD_Headerstyle" value="background:green;font-size:10pt;color:white" />
    </appSettings>

         

      • Show Column Headers: either show or suppress the List column header row.
  • Header Row CSS Style: enter the optional header row CSS style(s) as needed.
        Example:
        color:blue;white-space:nowrap
      • Show Groups collapsed: either show the groups (if you specify a column in the “Group By” setting) collapsed or expanded when entering the page.
  • Enforce Security: hides the web part if the user has no access to the site or the list. This avoids a login prompt if the user does not have at least “View” permission on the list or site containing the list.
      • Show all entries: either show all directory entries or none when first visiting the page. 
        You can append a specific text to the “Header Text” field (see above) which is only displayed if this option is unchecked and no search has yet been performed by the user.
      • Open Links in new window: either open the links in a new window or in the same browser window.
      • Link Documents to Office365: open the Word, Excel and Powerpoint documents in the Office365 web viewer.
  • Show ‘Add New Item’ Button: either show or suppress the “Add new item” button to let users add new items to the list (this option is security-trimmed).
      • Export to CSV: Show/hide the “Export” button for Excel CSV File Export
  • CSV Separator: enter the desired CSV field separator character (default: comma). Use a semicolon in countries where the comma is used as a decimal separator.
  • Localization: enter the following 4 values (separated by semicolons) in your local language if you need to override the English strings corresponding to:
    – the Search button text,
    – the A..Z menu “View all” option,
    – the text displayed for Hyperlink columns,
    – the optional “Group By” name (if grouping is enabled).
    Default:
    Search;View all;Visit

  • License Key: enter your Product License Key (as supplied after purchase of the List Search Web Part license).
    Leave this field empty if you are using the free 30 day evaluation version.

     Contact me now at tomas.floyd@outlook.com for the List Search Web Part and other Free & Paid Web Parts and Apps for SharePoint 2010, 2013, Azure, Office 365, SharePoint Online

    Thoughts on : Customizing the Public Website of Office 365


    Recently, I attempted a migration from my ASP.NET based Azure website to Office 365. The reason was that I wanted to use SharePoint 2013 for in-page editing and simply try to get the platform to take care of all my business needs.

After a few days, I reverted to the Azure web host, as I am not satisfied that the service will fulfill my requirements. Here is a recollection of my experiences of the shortcomings in the platform and the points that should be addressed.

    Master page editing in the public Office 365 site is not much different from the rest of Office 365 and SharePoint 2013. You have access to the Design Manager and you can open the site with SharePoint Designer.


    On the up-side, you can create master pages, create page layouts and add Rich Text areas using the “Multi-Area Page” that allows up to four separate rich text areas. I managed to get the site to look virtually the same when published.

On the down side, the page contained all the scripts and CSS styles from standard SharePoint, which caused the responsive design to break for tablets and phones. I could probably have fixed some of the issues, but the difference in page weight and load time is as follows:

                                             Azure .NET   Office 365
Total page weight                            305.2K       727K
Total non-cached file size                   7.2K         54K
Total number of script files                 7            12
Average page load time during load test      1.67 s       3.46 s

I then amended the blog layout. The comments feature from blogs in standard SharePoint is not available, so the public site uses Facebook comments instead; I replaced these with a Disqus control. Later on, I started running into several issues when trying to add features.

    Issue #1: You cannot define your own content types

The site administration does not contain a link to allow modification of content types or site fields, and trying to navigate to the URL manually presents you with a 403 error. Adding custom content types for your page layouts seems like a simple request, so I tried to inject them using sandboxed solutions.

Issue #2: Sandboxed solutions are not supported

Yes, this link is also gone. You cannot navigate to “Solutions”, but you can manually enter the URL. I found a helpful and informative post by Jason Cribbet on the topic and was able to activate my feature. This is, however, not supported by Microsoft, and I am now in “not supported” land with my website.

Issue #3: You cannot create subsites

I was fairly happy until I started to create more content and restricted areas. There is no way to create subsites using the interface; you need to use SharePoint Designer. Again, this is not supported by Microsoft.

Issue #4: You cannot control feature activation

Yes, features cannot be changed either. This means that you cannot add or remove any functionality outside of apps to the site.

Issue #5: What is going on with the blog framework and managed navigation?

    I could live with the “hacks” and continued to style the blog area. This, in itself, has a number of very strange issues:

• If you remove the “Blog Tools” web part from the page, then the links to blog posts will not work.
• Pages do not seem to handle changing page layouts: I first had to change the page layout, then disconnect the page from the layout in SharePoint Designer.
• Managed navigation allows you to use the blog as “/Blog/Post/1/My-Blog-Title”, “/Blog/Date/2013/” and so on. The page configuration, however, cannot be changed: if you rename a page, the entire navigation framework will stop working. Just don’t.
• The blog and blog category lists can still be accessed using the forms URL at “/Lists/Posts/AllItems.aspx”, and you cannot change the anonymous behavior. As you cannot change features, the lockdown feature is out of bounds. I guess you could inject redirects on the pages or try to use PowerShell to reactivate the forms lockdown page feature, but I did not attempt this.

Issue #6: You cannot recreate the site

So finally, you have hacked this puppy to pieces. You want to recreate the site, so you go into SharePoint administration for Office 365 and delete the site collection. But wait… there is no option to recreate the site? This rectified itself on my test tenant after 24 hours and allowed me to create the public site. The site was, however, not fully recreated: it has no web template applied, and I get the error message “Sorry, something went wrong: There is no site in the current site subscription matching the HiddenSiteSelection control’s value.”

    Summary

Office 365 has a long way to go before it can offer any kind of enterprise solution for the public web. In a sense, it seems that they are just about there but have intentionally limited themselves to supporting basic usage only. But if that is the case, why allow SharePoint Designer and Design Manager access at all?

    I hope that the public website will be improved in upcoming releases and would really like to run my site and blog using SharePoint technology.

    Virtualization vs. Private Cloud – What exactly is the difference? Part 1

Virtualization vs. private cloud has confused many IT pros. Are they the same, or different? In what way, and how? We have already virtualized most of our computing resources; is a private cloud still relevant to us? These are questions I have been frequently asked. Before getting to the answers, in this first article of the two-part series listed below I want to set a baseline.

    • Part 1: Cloud Computing Goes Far Beyond Virtualization (This article)
    • Part 2: A Private Cloud Delivers IT as a Service

Lately, many IT shops have introduced virtualization into their existing computing environments. Consolidating servers, mimicking production environments, virtualizing test networks, securing resources with honey pots, adding disaster recovery options, and so on are just a few applications of virtualization. Some shops also run highly virtualized IT with automation provided by system management solutions. I imagine many IT pros recognize the benefits of virtualization, including better utilization of servers and the associated savings from reducing the physical footprint. Now that we are moving into a cloud era, the questions become “Is virtualization the same as a private cloud?” and “We are already running highly virtualized computing today; do we still need a private cloud?” The answers to these questions should always start with “What business problems are you trying to address?” Then assess whether a private cloud solution can fundamentally solve the problem, or whether virtualization is sufficient. This of course assumes a clear understanding of what virtualization is and what a private cloud is. The point is that virtualization and cloud computing are not the same: they address IT challenges in different dimensions, operate in different scopes, and have different levels of impact on a business.

    Virtualization

To make a long story short, virtualization in the context of IT is to “isolate” computing resources such that an object (i.e. an application, a task, a component) in a layer above can be operated without a concern for changes made in the layers below. A lengthy discussion of virtualization is beyond the scope of this article. Nonetheless, let me point out that the terms “virtualization” and “isolation” are chosen for specific reasons, since there are technical discrepancies between “virtualization” and “emulation,” and between “isolation” and “redirection.” Virtualization isolates computing resources, and hence offers an opportunity to relocate and consolidate isolated resources for better utilization and higher efficiency. Virtualization is rooted in infrastructure management, operations, and deployment flexibility. It’s about consolidating servers, moving workloads, streaming desktops, and so on, which without virtualization are not technically feasible or may simply be cost-prohibitive.

    Cloud Computing

Cloud computing, on the other hand, is a state, a concept, and a set of capabilities, with statements made on what to expect from it in general. The definition of cloud computing published in NIST SP 800-145 outlines the essential characteristics, the service models, and the deployment models required to be cloud-qualified. Chou further simplifies it and offers a plain and simple way to describe cloud computing with the 5-3-2 Principle, as illustrated below.

[Figure: the 5-3-2 Principle of cloud computing]

    Unequivocally Different

Realizing the fundamental differences between virtualization and a private cloud is therefore rather straightforward. In essence, virtualization is not based on the 5-3-2 Principle, whereas cloud computing is. For instance, a self-service model is not an essential component of virtualization, while it is essential in cloud computing. One can certainly argue that some virtualization solutions may include a self-service component. The point is that self-service is neither a necessary nor a sufficient condition for virtualization. In cloud computing, by contrast, self-service is a crucial concept for delivering anytime availability to users, which is what a service is all about. Furthermore, self-service is an effective mechanism for reducing training and support at all levels, and a crucial vehicle for accelerating the ROI of a cloud computing solution and making it sustainable in the long run.

So what, specifically, distinguishes a highly virtualized computing environment from a private cloud?

    Visio for Developers in Office 365

    In this post, I’ll introduce some of the new features of interest to developers in Visio 2013. Among these features are:

    • New file format
    • Robust updates to themes
• The Change Shape feature (which allows you to replace one shape with another while maintaining shape text)
    • New shape effects
    • Improvements to commenting
    • Coauthoring on SharePoint Server 2013
    • Customizable image clipping
    • Relative geometry
    • Support for Business Connectivity Services (BCS) data
    • Updates to Visio Services in Microsoft SharePoint Server 2013
    • Duplicate page feature

    At the end of the post, I provide you with some additional resources for both Visio and general Office development.

     

    New file format

    Visio 2013 introduces a new file format, based on the Open Packaging Conventions (OPC) standard (ISO 29500, Part 2) and the XML elements from the previous Visio XML file format (.vdx). It is a zipped, XML-based file format similar to the file formats used in other applications.

    Because the new file format is supported by both Visio 2013 and Visio Services in Microsoft SharePoint Server 2013, you can save a Visio drawing directly to a SharePoint Server library without having to first publish the file as a Visio Web Drawing (.vdw). Even so, Visio Services can still read and display Visio Web Drawing files.

    The new file format includes the following file types (by extension):

    • .vsdx (Visio drawing)
    • .vsdm (Visio macro-enabled drawing)
    • .vssx (Visio stencil)
    • .vssm (Visio macro-enabled stencil)
    • .vstx (Visio template)
    • .vstm (Visio macro-enabled template)

    By using existing support for reading and writing to the file format package (such as System.IO.Packaging) and for parsing XML (System.Xml.Linq), you can work with the new file formats programmatically.
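Because the package and XML APIs are standard .NET, a small console program can already crack open a drawing. The following is a minimal sketch: the file path is a placeholder, the /visio/document.xml part name is assumed to be the standard Visio document part, and the project needs a reference to WindowsBase.dll for System.IO.Packaging.

using System;
using System.IO;
using System.IO.Packaging; // requires a reference to WindowsBase.dll
using System.Xml.Linq;

class VsdxInspector
{
    static void Main()
    {
        // Placeholder path; point it at any Visio 2013 drawing.
        string path = @"C:\temp\Drawing1.vsdx";

        // A .vsdx file is an OPC package: a zip archive of related XML parts.
        using (Package package = Package.Open(path, FileMode.Open, FileAccess.Read))
        {
            // List every part in the package with its content type.
            foreach (PackagePart part in package.GetParts())
            {
                Console.WriteLine("{0} ({1})", part.Uri, part.ContentType);
            }

            // Load the main document part and inspect its root element.
            Uri docUri = new Uri("/visio/document.xml", UriKind.Relative);
            XDocument doc = XDocument.Load(package.GetPart(docUri).GetStream());
            Console.WriteLine(doc.Root.Name.LocalName);
        }
    }
}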

    Visio 2013 retains the ability to read the old file formats (.vsd, .vss, .vst, .vdx, .vsx, .vtx, .vdw, .vwi). Visio 2013 does not save to the previous Visio XML file format (.vdx). Solutions or tools that consume the previous Visio XML file format (.vdx) files may need to be refactored in order to read the new file format and its schemas.

    Visio Services retains the ability to display the Visio Web Drawing (.vdw) format in the browser. It now also renders the new Visio drawing (.vsdx) and Visio macro-enabled drawing (.vsdm) formats.

    For more information about the new file format, see the article How to: Manipulate the Visio 2013 file format programmatically.

    Themes

    Themes have been redesigned in Visio 2013, making use of a greater variety of effects and styles including the integration of Shape Art effects. Users can now decide on an overarching style by applying a theme, personalize the diagram with theme variants, and highlight individual shapes with Quick Styles. ShapeSheet developers can take advantage of these features with new functions and cells in the ShapeSheet.

    The user interface for applying theme variants is shown in the following figure.

     

     

You can also manipulate themes at the Page, Shape, and Selection object level. New APIs for working with themes include the Page.SetTheme and Page.SetThemeVariant methods, the Shape.SetQuickStyle method, and the Selection.SetQuickStyle method.
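From C# automation code, applying a theme is a one-line call per object. The sketch below is an assumption-laden illustration: the methods take Variant-typed arguments in the VBA documentation, so passing a theme name string and a variant index here is a guess, and it requires Visio 2013 plus a reference to Microsoft.Office.Interop.Visio.

using System;
using Visio = Microsoft.Office.Interop.Visio;

class ThemeSketch
{
    static void Main()
    {
        var app = new Visio.Application();
        Visio.Document doc = app.Documents.Add(""); // blank drawing
        Visio.Page page = app.ActivePage;

        page.SetTheme("Office Theme"); // apply a theme by name (assumed built-in name)
        page.SetThemeVariant(1);       // switch to one of the theme's variants (assumed index form)

        doc.Saved = true; // suppress the save prompt on exit
        app.Quit();
    }
}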

    For more information about new VBA objects and members in Visio 2013, see the Visio Automation reference. For more information about the new ShapeSheet cells in Visio 2013, see the article What’s new for ShapeSheet developers in Visio 2013.

    Change Shape

Visio 2013 includes a shape replacement API that enables you to swap one or more shapes for another shape contained in a stencil, while retaining some of the local values from the original shape, like the shape text, shape data, or shape formatting. Shape developers can update the ShapeSheet settings of their custom shapes to specify the Change Shape behavior for their shapes. Among the new APIs for Change Shape are the Shape.ReplaceShape and Selection.ReplaceShape methods and the ReplaceShapesEvent object.

    The Change Shape feature lets you easily change a shape (in this case, the green rectangle)…

     

    …to another shape, the green diamond.

    For more information about the Change Shape feature, see Eric Schmidt’s blog post, Change shapes in Visio 2013.

    For more information about new VBA objects and members in Visio 2013, see the Visio Automation reference. For more information about the new ShapeSheet cells in Visio 2013, see the article What’s new for ShapeSheet developers in Visio 2013.

    Shape effects

    New shape effects such as bevel, 3-D rotation, glow, reflection, and sketching have been added to Visio 2013. The ShapeSheet includes new cells for working with these effects. The following figure shows a shape to which effects have been applied.

    You can also use Office VBA objects such as TextFrame2, GlowFormat, and ReflectionFormat and their members to apply shape effects.

    For more information about the new ShapeSheet cells in Visio 2013, see the article What’s new for ShapeSheet developers in Visio 2013.

    Commenting

    Visio 2013 includes a new commenting framework. Comments can now be associated with a particular shape or page. Visio 2013 includes two new objects, Comments and Comment. New APIs for accessing comments programmatically include the Document.Comments, Page.Comments, Shape.Comments, and Page.ShapeComments properties.

    The following images show what comments looked like in Visio 2010 and what they look like in Visio 2013.

     

     

    Visio Services includes JavaScript APIs to read the comments from a page or shape in a diagram.

    Note: You can no longer access comments in the ShapeSheet.

    Coauthoring

    Visio 2013 includes the ability to co-author diagrams stored on SharePoint or OneDrive. Developers have access to the Document.AfterDocumentMerge event which provides information about diagram changes due to coauthoring. Solution developers also have the ability to disable coauthoring to suit their custom needs by using the NoCoauth cell on the Document ShapeSheet.

    Customizable image clipping

Visio 2013 supports defining a custom image clipping path to crop images to any shape. This extends the capabilities of Visio 2010, which supported clipping images only in a rectangular way. This functionality is available in the ShapeSheet by using the ClippingPath cell in the Foreign Image Info section.

    Relative geometries

In previous versions of Visio, shape geometry was defined by formulas that depended on the height or width of the shape. For example, in Visio 2010 the vertices of many built-in Visio shapes were defined by multiplying the height or width of the shape by a constant. These shapes had Geometry sections that included MoveTo or LineTo rows (for example) with formulas like Width*1 and Height*0.

    Visio 2013 now supports relative geometry in the ShapeSheet. Shape developers can now use relative geometries to specify geometries as simple values or formulas, which multiply by the height or width automatically. You can now express Shape vertices by using constants, for instance—you no longer need to express vertices as multiples of the shape width or height. This makes it easier for you to create shapes that have better performance and smaller file sizes. New rows include the RelMoveTo and RelLineTo rows where the X and Y cell values are automatically multiplied by the width or height of the shape (respectively).

    Support for Business Connectivity Services (BCS) data

    Visio 2013 diagrams can now be connected to external lists on SharePoint Server 2013 servers. An external list is a content source external to SharePoint (for example, a SQL Server table) that has been connected to a SharePoint list by using Microsoft Business Connectivity Services (BCS). Visio Services supports the ability to refresh the Visio diagrams as the data updates.

    For more information about what’s new in Visio Services, see the article Visio Services in SharePoint 2013. For more information about Business Connectivity Services (BCS), see Business Connectivity Services in SharePoint 2013.

    Improvements in Visio Services

    Visio Services in Microsoft SharePoint Server 2013 includes many improvements. As mentioned previously, Visio Services supports the new Visio file format (.vsdx and .vsdm). Visio Services has expanded data refresh and recalculation, including the ability to recalculate formulas across an entire diagram.

    For more information about what’s new in Visio Services, see the article Visio Services in SharePoint 2013.

    Duplicate page

    You can now copy a page and all of its shapes within the same document in Visio 2013. Accordingly, the Page object has a new method, Duplicate, which duplicates the page and returns a new Page object.
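Since the text above states that Duplicate takes no arguments and returns the new Page object, a short interop sketch is enough to see it in action. This is a sketch with placeholder setup; it requires Visio 2013 and a reference to Microsoft.Office.Interop.Visio.

using System;
using Visio = Microsoft.Office.Interop.Visio;

class DuplicatePageExample
{
    static void Main()
    {
        var app = new Visio.Application();
        Visio.Document doc = app.Documents.Add(""); // blank drawing
        Visio.Page original = app.ActivePage;
        original.Name = "Source";

        // Duplicate copies the page and its shapes and returns the new Page.
        Visio.Page copy = original.Duplicate();
        Console.WriteLine(copy.Name);

        doc.Saved = true; // suppress the save prompt on exit
        app.Quit();
    }
}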

    Additional resources

    Brand new 3 LINQ to Office Providers Available now!!

The SPSamurai.Office.LINQ namespace contains 3 classes: OutlookProvider (LINQ to Outlook), OneNoteProvider (LINQ to OneNote) and ExcelProvider (LINQ to Excel).

The OutlookProvider is a wrapper class which provides IEnumerable collections over the data of the Outlook COM interface (appointments, contacts, mails, tasks, …).

    The OneNoteProvider provides collections of notebooks, sections and pages by manipulating the XML hierarchy tree of OneNote. And the ExcelProvider loads an Excel worksheet and provides column definition and row collections.

    All collections are IEnumerable so you can query them with LINQ. The full source code is provided.

    Check out my articles where I describe the implementation of these 3 classes and how to use them. These articles also contain a lot of LINQ query examples.
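To give a flavor of what such queries look like, here is a hypothetical sketch. The constructor, the Rows property, and the row indexer below are illustrative guesses only; the actual member names are defined in the provider source code and the linked articles.

using System;
using System.Linq;
using SPSamurai.Office.LINQ;

class ExcelQuerySample
{
    static void Main()
    {
        // Hypothetical usage: load a worksheet and query its rows with LINQ.
        var excel = new ExcelProvider(@"C:\temp\Tasks.xlsx");

        var overdue = from row in excel.Rows
                      where (DateTime)row["DueDate"] < DateTime.Today
                      orderby (string)row["Title"]
                      select (string)row["Title"];

        foreach (string title in overdue)
        {
            Console.WriteLine(title);
        }
    }
}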

Class diagrams:

Features:

• Set flag with due date from a predefined list: Today, Tomorrow, This Week, Next Week or Custom
• Different options of follow-up visualization using combinations of flag, text and date
• Support of sorting and filtering features
• Support of different calendars (Gregorian, Japanese Emperor Era, Korean Tangun Era, Hijri, etc.)
• Datasheet view supported
• Two-way conversion between ArtfulBits Follow-Up and the standard Microsoft® SharePoint® Date and Time column
• Language pack support (desired localization can be added by request)

     

    Contact me at tomas.floyd@outlook.com for these tools and more SharePoint, Azure and Office 365 Apps, Tools and Web Parts or for specialised custom SharePoint Development

    How To : Use Powershell Scripts in Office 365 through the SharePoint CSOM

When we first started to work with Office 365, I remember being quite concerned at the lack of PowerShell cmdlets – basically all the commands we’re used to using do not exist there. The numbers illustrate the point:

So yes, nearly 800 PowerShell commands in SP2013 (up from around 530 in SP2010) down to a measly 30 in SharePoint Online. And those 30 mainly cover basic operations with sites, users and permissions – no scripting of, say, Managed Metadata, user profiles, search and so on. It’s true that some of these things are now available down at site-collection scope (needed, of course, when you don’t have a true “Central Admin” site), but there are still “tenant-level” settings that you want to script rather than change manually through the UI.

    So what’s a poor developer/administrator to do?

    The answer is to write PowerShell as you always did, but embed CSOM code in there. More examples later, but here’s a small illustration:

    # get the site collection scoped Features collections (e.g. to activate one) – not showing how to obtain $clientContext here..
    $siteFeatures = $clientContext.Site.Features
    $clientContext.Load($siteFeatures)
    $clientContext.ExecuteQuery()

So we’re using the .NET CSOM, but instead of C# we are using PowerShell’s ability to call any .NET object (indeed, nearly every script will use PowerShell’s New-Object command). All the things we came to love about PowerShell are back on the table:

    • Scripts can be easily amended, no need to recompile (or open Visual Studio)
    • We can debug with PowerGui or PowerShell ISE
    • We can leverage other things PowerShell is good at e.g. easily reading from XML files, using other PowerShell modules and other APIs (including .NET) etc.

    Of course, we can only perform operations where the method exists in the .NET CSOM – that’s the boundary of what we can do.

    Getting started

    Step 1 – understand the landscape

    The first thing to understand is that there are actually 3 different approaches for scripting against Office 365/SharePoint Online, depending on what you need to do. It might just be me, but I think that when you start it’s easy to get confused between them, or not fully appreciate that they all exist. The 3 approaches I’m thinking of are:

    • SharePoint Online cmdlets
    • MSOL cmdlets
    • PowerShell + CSOM

This post focuses on the last flavor. I also wrote a short companion post about the overall landscape, with some details and examples on the other flavors, at Using SharePoint Online and MSOL cmdlets in PowerShell with Office 365.

    Step 2 – prepare the machine you will run scripts against SharePoint Online

    Option 1 – if you will NOT run scripts from a SP2013 box (e.g. a SP2013 VM):

    You need to obtain the SharePoint DLLs which comprise the .NET CSOM, and copy them to a folder on your machine – your scripts will reference these DLLs.

1. Go to any SharePoint 2013 server, and copy any DLL which starts with Microsoft.SharePoint.Client*.dll from the C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI folder.
2. Store them in a folder on your machine e.g. C:\Lib – make a note of this location. (A scripted version of this copy step is sketched below.)

    CSOM DLLs
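If you want to script the copy itself, something like this run on the SharePoint server grabs all the CSOM assemblies in one go (C:\Lib is just the example folder from above):

# copy the .NET CSOM assemblies to a local folder (C:\Lib, as per the example above)
New-Item -ItemType Directory -Path C:\Lib -Force | Out-Null
Copy-Item "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client*.dll" -Destination C:\Lib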

    Option 2 – if you WILL run scripts from a SP2013 box (e.g. a SP2013 VM):

    In this case, there is no need to copy the DLLs – your scripts will reference them in the original SharePoint install location (C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI).

    The top of your script – referencing DLLs and authentication

    Each .ps1 file which calls the SharePoint CSOM needs to deal with two things before you can use the API – loading the CSOM types, and authenticating/obtaining a ClientContext object. So, you’ll need this at the top of your script:

# replace these details (also consider using Get-Credential to enter the password securely as the script runs)..
$username = "SomeUser@SomeOrg.onmicrosoft.com"
$password = "SomePassword"
# the site URL to connect to - a placeholder, replace with yours..
$url = "https://SomeOrg.sharepoint.com/sites/SomeSite"
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
# the path here may need to change if you used e.g. C:\Lib..
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
# note that you might need some other references (depending on what your script does) for example:
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Taxonomy.dll"
# connect/authenticate to SharePoint Online and get a ClientContext object..
$clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($url)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword)
$clientContext.Credentials = $credentials
if (!$clientContext.ServerObjectIsNull.Value)
{
    Write-Host "Connected to SharePoint Online site: '$url'" -ForegroundColor Green
}

    In the scripts which follow, we’ll include this “top of script” stuff by dot-sourcing TopOfScript.ps1 in every script below – you could follow a similar approach (perhaps with a different name!) or simply paste that stuff into every script you create. If you enter a valid set of credentials and URL, running the script above should see you ready to rumble:

    PS CSOM got context

    Script examples

    Activating a Feature in SPO

Something you might want to do at some point is enable or disable a Feature using script. The script below, like the others that follow, references my TopOfScript.ps1 script above:

. .\TopOfScript.ps1
[bool]$enable = $true
[bool]$force = $false
# using the Minimal Download Strategy Feature here..
$featureId = [GUID]("87294C72-F260-42f3-A41B-981A2FFCE37A")
# ..and working with the web-scoped Features - use $clientContext.Site.Features for site-scoped Features
$webFeatures = $clientContext.Web.Features
$clientContext.Load($webFeatures)
$clientContext.ExecuteQuery()
if ($enable)
{
    $webFeatures.Add($featureId, $force, [Microsoft.SharePoint.Client.FeatureDefinitionScope]::None)
}
else
{
    $webFeatures.Remove($featureId, $force)
}
try
{
    $clientContext.ExecuteQuery()
    if ($enable)
    {
        Write-Host "Feature '$featureId' successfully activated.."
    }
    else
    {
        Write-Host "Feature '$featureId' successfully deactivated.."
    }
}
catch
{
    Write-Error "An error occurred whilst activating/deactivating the Feature. Error detail: $($_)"
}

    PS CSOM activate feature

    Enable side-loading (for app deployment)

    Along very similar lines (because it also involves activating a Feature), is the idea of enabling “side-loading” on a site. By default, if you’re developing a SharePoint app it can only be F5 deployed from Visual Studio to a site created from the Developer Site template, but by enabling “side-loading” you can do it on (say) a team site too. Since the Feature isn’t visible (in the UI), you’ll need a script like this:

. .\TopOfScript.ps1
[bool]$enable = $true
[bool]$force = $false
# this is the side-loading Feature ID..
$featureId = [GUID]("AE3A1339-61F5-4f8f-81A7-ABD2DA956A7D")
# ..and this one is site-scoped, so using $clientContext.Site.Features..
$siteFeatures = $clientContext.Site.Features
$clientContext.Load($siteFeatures)
$clientContext.ExecuteQuery()
if ($enable)
{
    $siteFeatures.Add($featureId, $force, [Microsoft.SharePoint.Client.FeatureDefinitionScope]::None)
}
else
{
    $siteFeatures.Remove($featureId, $force)
}
try
{
    $clientContext.ExecuteQuery()
    if ($enable)
    {
        Write-Host "Feature '$featureId' successfully activated.."
    }
    else
    {
        Write-Host "Feature '$featureId' successfully deactivated.."
    }
}
catch
{
    Write-Error "An error occurred whilst activating/deactivating the Feature. Error detail: $($_)"
}

    PS CSOM enable side loading

    Iterating webs

    Sometimes you might want to loop through all the webs in a site collection, or underneath a particular web:

. .\TopOfScript.ps1
$rootWeb = $clientContext.Web
$childWebs = $rootWeb.Webs
$clientContext.Load($rootWeb)
$clientContext.Load($childWebs)
$clientContext.ExecuteQuery()
function processWeb($web)
{
    $clientContext.Load($web)
    $clientContext.ExecuteQuery()
    Write-Host "Web URL is" $web.Url
}
foreach ($childWeb in $childWebs)
{
    processWeb($childWeb)
}

    PS CSOM iterate webs

(It’s worth noting that you also see SharePoint-hosted app webs in the image above, since these are just subwebs – albeit ones which get accessed on the app domain URL rather than the actual host site’s web application URL. If you want to skip them, see the sketch below.)
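One way to skip those app webs during iteration is a check at the top of processWeb. This is only a sketch: it assumes app webs report "APP" as their Web.WebTemplate value, which is worth verifying in your environment:

# inside processWeb, after $clientContext.Load($web) / ExecuteQuery()..
if ($web.WebTemplate -eq "APP")
{
    Write-Host "Skipping app web:" $web.Url
    return
}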

    Iterating webs, then lists, and updating a property on each list

Or how about extending the sample above to iterate not only webs, but also the lists in each? The property I’m updating on each list is the EnableVersioning property, but you could easily use any other property or method in the same way:

. .\TopOfScript.ps1
$enableVersioning = $true
$rootWeb = $clientContext.Web
$childWebs = $rootWeb.Webs
$clientContext.Load($rootWeb)
$clientContext.Load($childWebs)
$clientContext.ExecuteQuery()
function processWeb($web)
{
    $lists = $web.Lists
    $clientContext.Load($web)
    $clientContext.Load($lists)
    $clientContext.ExecuteQuery()
    Write-Host "Processing web with URL" $web.Url
    foreach ($list in $web.Lists)
    {
        Write-Host "--" $list.Title
        # leave the "Master Page Gallery" and "Site Pages" lists alone, since these have versioning enabled by default..
        if ($list.Title -ne "Master Page Gallery" -and $list.Title -ne "Site Pages")
        {
            Write-Host "---- Versioning enabled:" $list.EnableVersioning
            $list.EnableVersioning = $enableVersioning
            $list.Update()
            $clientContext.Load($list)
            $clientContext.ExecuteQuery()
            Write-Host "---- Versioning now enabled:" $list.EnableVersioning
        }
    }
}
foreach ($childWeb in $childWebs)
{
    processWeb($childWeb)
}

    PS CSOM iterate lists enable versioning

    Import search schema XML

In SharePoint 2013 and Office 365, many aspects of search configuration (such as Managed Properties and Crawled Properties, Query Rules, Result Sources and Result Types) can be exported and imported between environments as an XML file. The sample below shows the import operation handled with PS + CSOM:

. .\TopOfScript.ps1
# need some extra types bringing in for this script..
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Search.dll"
# TODO: replace this path with yours..
$pathToSearchSchemaXmlFile = "C:\COB\Cloud\PS_CSOM\XML\COB_TenantSearchConfiguration.xml"
# we can work with search config at the tenancy or site collection level:
#$configScope = "SPSiteSubscription"
$configScope = "SPSite"
$searchConfigurationPortability = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($clientContext)
$owner = New-Object Microsoft.SharePoint.Client.Search.Administration.SearchObjectOwner($clientContext, $configScope)
[xml]$searchConfigXml = Get-Content $pathToSearchSchemaXmlFile
$searchConfigurationPortability.ImportSearchConfiguration($owner, $searchConfigXml.OuterXml)
$clientContext.ExecuteQuery()
Write-Host "Search configuration imported" -ForegroundColor Green

    PS CSOM import search schema
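Going the other way – exporting the configuration so you can import it elsewhere – works along the same lines. The sketch below is my untested inversion of the import script; it assumes the ExportSearchConfiguration method on SearchConfigurationPortability returns the configuration XML as a client result, and the output path is a placeholder:

. .\TopOfScript.ps1
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Search.dll"
$configScope = "SPSite"
$searchConfigurationPortability = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($clientContext)
$owner = New-Object Microsoft.SharePoint.Client.Search.Administration.SearchObjectOwner($clientContext, $configScope)
# ExportSearchConfiguration is expected to return a ClientResult[string] holding the XML
$exportResult = $searchConfigurationPortability.ExportSearchConfiguration($owner)
$clientContext.ExecuteQuery()
$exportResult.Value | Out-File "C:\Temp\SearchConfigurationExport.xml"
Write-Host "Search configuration exported" -ForegroundColor Green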

    Summary

As you can hopefully see, there’s lots you can accomplish with the PowerShell and CSOM combination. Anything that can be done with the CSOM API can be wrapped into a script, and you can build up a library of useful PowerShell snippets just like the old days. There are some interesting things that you CANNOT do with CSOM (such as automating the process of uploading/deploying a sandboxed WSP to Office 365), but there ARE approaches for solving even these problems, and I’ll most likely cover this (and our experiences) in future posts.

A final thought on the PowerShell + CSOM front: you can have “hybrid” scripts which deal with both SharePoint Online and on-premises SharePoint. For example, on my current project everything we build must be deployable to both SPO and on-premises, and our scripts take a “DeploymentTarget” parameter where the values can be “Online” or “OnPremises”. There are some differences (i.e. branching) in the scripts, but for many operations the same commands can be run.
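To make that concrete, here’s a minimal sketch of the branching pattern. The DeploymentTarget parameter name comes from the description above; the on-premises branch assumes the script runs on a SharePoint server where the server-side snap-in is available:

param([ValidateSet("Online","OnPremises")][string]$DeploymentTarget = "Online")

if ($DeploymentTarget -eq "Online")
{
    # CSOM path - dot-source the authentication/context code shown earlier
    . .\TopOfScript.ps1
}
else
{
    # server-side path - the full SharePoint cmdlets are available here
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
}

# ..common operations follow, branching on $DeploymentTarget where the APIs differ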

    Client-side PowerShell for SharePoint Online and Office 365

SharePoint PowerShell is a PowerShell API for SharePoint 2010, 2013 and Online. It is very useful for Office 365 and private clouds where you don’t have access to the physical server.


The API uses the Managed .NET Client-Side Object Model (CSOM) of SharePoint 2013. It’s a library of PowerShell scripts that, at its core, talks to the CSOM DLLs.

    Examples :

Import-Module .\spps.psm1

Initialize-SPPS -siteURL "https://example.sharepoint.com/" -online $true -username "sitecollectionadmin@example.onmicrosoft.com" -password "password"

A fuller example:

# Include SPPS
Import-Module .\spps.psm1

# Setup SPPS
Initialize-SPPS -siteURL "https://example.sharepoint.com/" -online $true -username "sitecollectionadmin@example.onmicrosoft.com" -password "password"

# Activate the Publishing Site Feature
Activate-Feature -featureId "f6924d36-2fa8-4f0b-b16d-06b7250180fa" -force $false -featureDefinitionScope "Site"

# Activate the Publishing Web Feature
Activate-Feature -featureId "94c94ca6-b32f-4da9-a9e3-1f3d343d7ecb" -force $false -featureDefinitionScope "Web"

    Features

    • Site Collection
      • Test connection
    • Site
      • Manage subsites
      • Manage permissions
    • Lists and Document Libraries
      • Create list and document library
      • Manage fields
      • Manage list and list item permissions
      • Upload files to document library (including folders)
      • Add items to a list with CSV
      • Add and remove list items
      • File check-in and check-out
    • Master pages
      • Set system master page
      • Set custom master page
    • Features
      • Activate features
    • Web Parts
      • Add Web Parts to page
    • Users and Groups
      • Set site permissions
      • Set list permissions
      • Set document permissions
      • Create SharePoint groups
      • Add users and groups to SharePoint groups
    • Solutions
      • Upload sandboxed solutions
      • Activate sandboxed solutions

Contact me at tomas.floyd@outlook.com for this and more Azure, SharePoint & Office 365 Tools, Web Parts and Apps

SharePoint development roles urgently need to be filled at an MS Gold Partner – contact me now for more information (sorry, no recruiters; I am filling private positions)

    Senior SharePoint Developers needed urgently for MS Gold Partner in Sandton/Bryanston :

3 – 5 years of development experience.

2 years’ experience in SharePoint.

3 years’ experience in C#.

A minimum of 3 years’ experience with Visual Studio .NET 2005 – 2008.

A minimum of 3 years’ experience in ASP.NET and HTML web development.

A minimum of 3 years’ experience with JavaScript.

A minimum of 3 years’ experience with Windows XP, Windows 2003 and Windows Vista.

A minimum of 3 years’ experience in relational database design and implementation with SQL Server.
    Advantageous (nice-to-have):

    • Windows SharePoint Server.
    • Microsoft Office SharePoint Server.
    • BizTalk
    • Web Analytics
    • Microsoft CRM
    • K2

Getting Started with Apps for Office : The JavaScript API for Office

    This section briefly describes the subset of the JavaScript API for Office you can call from content and task pane apps. See Understanding the JavaScript API for Office for an overview of the features of the entire API, and Apps for Office code samples for additional examples.

    Before reading this section, use the links below to explore API diagrams that show the members of the API supported in content and task pane apps and the Office host applications that support these app types.

Explore by app type:

• Content apps – zoom into the Office object model for content apps
• Task pane apps – zoom into the object model for task pane apps

Explore by host application:

• Excel – zoom into the app object model for Excel
• PowerPoint – zoom into the app object model for PowerPoint
• Project – zoom into the app object model for Project
• Word – zoom into the app object model for Word

You can also download the set of maps for each app type and host application.

    You can categorize the primary objects and methods supported by content and task pane apps as follows:

    1. Common objects shared with other apps for Office

      These objects include Office, Context, and AsyncResult. The Office object is the root object of the JavaScript API for Office. The Context object represents the app’s runtime environment. Both Office and Context are the fundamental objects for any app for Office. The AsyncResult object represents the results of an asynchronous operation, such as the data returned to the getSelectedDataAsync method, which reads what a user has selected in a document.

    2. The Document object

      The majority of the API available to content and task pane apps is exposed through the methods, properties, and events of the Document object. Using this subset of the API, your content or task pane app can perform the tasks described later in this topic.

      A content or task pane app can use the Office.context.document property to access the Document object, and through it, can access the key members of the API for working with data in documents, such as the Bindings and CustomXmlParts objects, and the getSelectedDataAsync, setSelectedDataAsync, and getFileAsync methods. The Document object also provides the mode property for determining whether a document is read-only or in edit mode, the url property to get the URL of the current document, and access to the Settings object. The Document object also supports adding event handlers for the SelectionChanged event, so you can detect when a user changes his or her selection in the document.

  A content or task pane app can access the Document object only after the DOM and runtime environment have been loaded, typically in the event handler for the Office.initialize event. For information about the flow of events when an app is initialized, and how to check that the DOM and runtime are loaded successfully, see Loading the DOM and runtime environment.

    3. Objects for working with specific features

      To work with specific features of the API, your content or task pane app can work with the following objects and methods:

      • Use the methods of the Bindings object to create or get bindings, and then work with their data using the methods and properties of the Binding object.
      • Use the CustomXmlParts, CustomXmlPart and associated objects to create and manipulate custom XML parts in Word documents.
      • Use the File and Slice objects to create a copy of the entire document, break it into chunks or “slices”, and then read or transmit the data in those slices.
      • Use the Settings object to save custom data, such as user preferences, and app state.

    Important: Some of the API members described in this topic aren’t supported across all Office applications that can host content and task pane apps. To determine which members are supported, see any of the following resources:

    For a high-level summary of the JavaScript API for Office support available across Office host applications, see the API support matrix in the “Understanding the JavaScript API for Office” topic.

    The following sections highlight the fundamental concepts for creating content and task pane apps for Word, Excel, PowerPoint, and Project. For more details about a concept, see the references at the end of the concept, and also the Additional resources section.

    You can read or write to the user’s current selection in a document, spreadsheet, or presentation. Depending on the host application for your app, you can specify the type of data structure to read or write as a parameter in the getSelectedDataAsync and setSelectedDataAsync methods of the Document object. For example, you can specify any type of data (text, HTML, tabular data, or Office Open XML) for Word, text and tabular data for Excel, and text for PowerPoint and Project. You can also create event handlers to detect changes to the user’s selection. The following example gets data from the selection as text using the getSelectedDataAsync method.

    Office.context.document.getSelectedDataAsync(
        Office.CoercionType.Text, function (asyncResult) {
            if (asyncResult.status == Office.AsyncResultStatus.Failed) {
                write('Action failed. Error: ' + asyncResult.error.message);
            }
            else {
                write('Selected data: ' + asyncResult.value);
            }
        });
    
    // Function that writes to a div with id='message' on the page.
    function write(message){
        document.getElementById('message').innerText += message; 
    }

    For more details and examples, see Reading and writing data to the active selection in a document or spreadsheet.

    As described in the previous section, you can use the getSelectedDataAsync and setSelectedDataAsync methods to read or write to the user’s current selection in a document, spreadsheet, or presentation. However, if you would like to access the same region in a document across sessions of running your app without requiring the user to make a selection, you should first bind to that region. You can also subscribe to data and selection change events for that bound region.

You can add a binding by using the addFromNamedItemAsync, addFromPromptAsync, or addFromSelectionAsync methods of the Bindings object. These methods return an identifier that you can use to access data in the binding, or to subscribe to its data change or selection change events.

    The following is an example that adds a binding to the currently selected text in a document, by using the Bindings.addFromSelectionAsync method.

    Office.context.document.bindings.addFromSelectionAsync(
        Office.BindingType.Text, { id: 'myBinding' }, function (asyncResult) {
        if (asyncResult.status == Office.AsyncResultStatus.Failed) {
            write('Action failed. Error: ' + asyncResult.error.message);
        } else {
            write('Added new binding with type: ' +
                asyncResult.value.type + ' and id: ' + asyncResult.value.id);
        }
    });
    
    // Function that writes to a div with id='message' on the page.
    function write(message){
        document.getElementById('message').innerText += message; 
    }

    For more details and examples, see Binding to regions in a document or spreadsheet.

    If your task pane app runs in PowerPoint or Word, you can use the Document.getFileAsync, File.getSliceAsync, and File.closeAsync methods to get an entire presentation or document.

    When you call Document.getFileAsync, you get a copy of the document in a File object. The File object provides access to the document in “chunks” represented as Slice objects. When you call getFileAsync, you can specify the file type (text or compressed Open Office XML format), and size of the slices (up to 4MB). To access the contents of the File object, you then call File.getSliceAsync which returns the raw data in the Slice.data property. If you specified compressed format, you will get the file data as a byte array. If you are transmitting the file to a web service, you can transform the compressed raw data to a base64-encoded string before submission. Finally, when you are finished getting slices of the file, use the File.closeAsync method to close the document.

    For more details, see how to get the whole document from an app for PowerPoint or Word.

Using the Open Office XML file format and content controls, you can add custom XML parts to a Word document and bind elements in the XML parts to content controls in that document. When you open the document, Word reads and automatically populates bound content controls with data from the custom XML parts. Users can also write data into the content controls, and when the user saves the document, the data in the controls will be saved to the bound XML parts. Task pane apps for Word can use the Document.customXmlParts property and the CustomXmlParts, CustomXmlPart, and CustomXmlNode objects to read and write data dynamically to the document.

    Custom XML parts may be associated with namespaces. To get data from custom XML parts in a namespace, use the CustomXmlParts.getByNamespaceAsync method.

    You can also use the CustomXmlParts.getByIdAsync method to access custom XML parts by their GUIDs. After getting a custom XML part, use the CustomXmlPart.getXmlAsync method to get the XML data.

    To add a new custom XML part to a document, use the Document.customXmlParts property to get the custom XML parts that are in the document, and call the CustomXmlParts.addAsync method.

    For detailed information about how to work with custom XML parts with a task pane app, see Creating Better Apps for Word with Office Open XML.

    Often you need to save custom data for your app, such as a user’s preferences or the app’s state, and access that data the next time the app is opened. You can use common web programming techniques to save that data, such as browser cookies or HTML 5 web storage. Alternatively, if your app runs in Excel, PowerPoint, or Word, you can use the methods of the Document.Settings object. Data created with the Settings object is stored in the spreadsheet, presentation, or document that the app was inserted into and saved with. This data is available to only the app that created it.

    To avoid roundtrips to the server where the document is stored, data created with the Settings object is managed in memory at runtime. Previously saved settings data is loaded into memory when the app is initialized, and changes to that data are only saved back to the document when you call the Settings.saveAsync method. Internally, the data is stored in a serialized JSON object as name/value pairs. You use the get, set, and remove methods of the Settings object, to read, write, and delete items from the in-memory copy of the data. The following line of code shows how to create a setting named themeColor and set its value to ‘green’.

    Office.context.document.settings.set('themeColor', 'green');

    Because settings data created or deleted with the set and remove methods is acting on an in-memory copy of the data, you must call saveAsync to persist changes to settings data into the document your app is working with.

    For more details about working with custom data using the methods of the Settings object, see Persisting app state and settings.

If your task pane app runs in Project, your app can read data from some of the project, resource, and task fields in the active project. To do that, you use the methods and events of the ProjectDocument object, which extends the Document object to provide additional Project-specific functionality.

For examples of reading Project data, see How to: Create your first task pane app for Project 2013 by using a text editor.

Your app uses the Permissions element in its manifest to request permission to access the level of functionality it requires from the JavaScript API for Office. For example, if your app requires read/write access to the document, its manifest must specify ReadWriteDocument as the text value in its Permissions element. Because permissions exist to protect a user’s privacy and security, as a best practice your app should request the minimum level of permissions it needs for its features. The following example shows how to request the ReadDocument permission in a task pane’s manifest.

<?xml version="1.0" encoding="utf-8"?>
…
  <Permissions>ReadDocument</Permissions>
…

    Figure 1 shows the 5 levels of permissions that you can specify for a task pane app. For more information, see Requesting permissions for task pane apps.

    Figure 1. The 5-level permission model for task pane apps

    Levels of permissions for task pane apps

    Figure 2 shows the 4 levels of permissions available to a content app. For more information, see Requesting permissions for content apps.

    Figure 2. The 4-level permission model for content apps

    Levels of permissions for content apps

    Changing Site Access Request Email in SharePoint 2013 (Office 365)

The option to set the email address for any SharePoint site access requests has moved around in the last few versions, so I thought I’d post this for those searching through old posts looking for one about SharePoint 2013.

This setting determines who will receive an email when a user requests access to a particular site—usually when the user tries to access the site and is denied. The tricky part is that the email address for this request is not related to the site owner permissions; it’s just a string.

    To find the setting, navigate to:

    Site (Gear icon on top-right) > Site permissions > Access Request Settings (in the ribbon)

    First use the gear icon in the top-right corner to get to the Site Settings page. If you don’t see the Site Settings link, you probably don’t have sufficient rights to make this change.


    Once there, click on Site permissions.


    This will open the “Permissions: <site name>” page where you can access the Access Request option from the ribbon at the top of the screen. The option you’re changing is “Send all access requests to the following e-mail address.”


Simply enter the email address you’d like to use for access requests and you’re done. If you’d rather script the change, see the sketch below.
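The CSOM exposes this setting as the RequestAccessEmail property on the Web object, which is handy when you need to set it across many sites. A minimal sketch in the PowerShell + CSOM style covered earlier on this blog – it assumes a $clientContext obtained as shown there, and the address is a placeholder:

# set the access request email address via CSOM (assumes an authenticated $clientContext)
$web = $clientContext.Web
$web.RequestAccessEmail = "siteowners@contoso.com"
$web.Update()
$clientContext.ExecuteQuery()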

    Troubleshooting issues with the “Eligible MSDN Subscriber” license type

    As you adopt Visual Studio Online (VSO) and assign licenses to your users you may want to assign the “Eligible MSDN Subscriber” license type to team members. MSDN subscriptions are purchased outside of VSO and assigned to individual users. Before an MSDN subscriber can log in to VSO as an eligible MSDN subscriber, the subscription process must first be completed. The general flow looks like this:


• An MSDN subscription is purchased by or assigned to a team member. Assignment could be via the Volume Licensing Service Center, MPN, etc.
• The team member receives an email asking them to activate their subscription by signing in with a Microsoft account and registering with the subscription details provided.
• The team member activates the subscription and associates a Microsoft account (MSA) with it. The team member logs in to msdn.microsoft.com/en-us/subscriptions/manage using this MSA to manage the subscription going forward.
• A VSO admin adds the MSA (the one the team member associated with their subscription in the previous step) to the Users Hub (https://Contoso.visualstudio.com/_user) and assigns them an “Eligible MSDN Subscriber” license.
• Once 24-48 hours have passed since the subscription was activated, the team member should be able to log in to VSO as an “Eligible MSDN Subscriber” and enjoy full access to all VSO features (a benefit of this license level).

     

Sometimes, though, the team member may run into problems after this process. Specifically, they may try to log in and receive an error:

     

    Figure 1: (403 Forbidden \ …does not have license rights to access this account \ VSS012019: No MSDN subscription found for this user)

     

    The team member could be seeing this error for a number of reasons: 

    1. The specific MSDN subscription may not be eligible for Visual Studio Online access
    Not all MSDN subscription types include VSO as a benefit. Please refer to the list of eligible MSDN subscriptions to verify.

     

    2. The MSDN subscription was assigned to the user within the last 48 hours
    There is a delay between MSDN subscription assignments and when VSO will see it \ allow the user to log in. If the subscription was assigned less than 24 hours ago please wait 24-48 hours and have the user try again.

     

3. The “Software Downloads” benefit has not been provided with the MSDN subscription
VSO usage is tied to the “Software Downloads” benefit of an MSDN subscription. In order to access VSO when assigned the “Eligible MSDN Subscriber” license type, the user’s MSDN subscription must include the “Software Downloads” benefit, which can be enabled via the Volume Licensing Service Center.

    4. The wrong email address has been added to the VSO Users Hub
    This one is important. When an MSDN subscription is assigned to a user they receive an email (usually to their work email address) asking them to associate their new MSDN subscription with a Microsoft account\email address (MSA). This could be the same address where they received the mail or the user could pick a completely different MSA. The MSA they pick to associate with their new MSDN subscription is the one that needs to be added to the VSO Users Hub (https://Contoso.visualstudio.com/_user) and assigned the “Eligible MSDN Subscriber” license. It’s the one they use to manage their MSDN subscription via https://msdn.microsoft.com/subscriptions/manage.

    Check with your user to see what MSA they’ve associated with & use to manage their MSDN subscription, and ensure that’s the one on the Users Hub.

     

    If all this checks out, have the team member try this:

1. Go to msdn.microsoft.com/en-us/subscriptions/manage and log in with your Microsoft account.
2. From the “My Subscriptions” section in the top-left, copy the name, email and Subscriber ID for the MSDN subscription you want to use with VSO. Paste this into Notepad or another text editor.
3. Click “Add an existing subscription to my account” in the “My Subscriptions” section in the top-left.
4. Fill out the form with the values you copied in step 2 and then click NEXT.
5. Acknowledge and accept the license notice and then click ACCEPT.

This will not change your subscription in any way, but will essentially reactivate it – and with any luck this will allow you to log in to VSO.

     

I hope you are not having any issues with the “Eligible MSDN Subscriber” license type in VSO, but if you are, please run through this checklist to try to fix them. If you are still blocked, please open a support case and we can assist.

     

    New “Filter My ListView” SharePoint Web Part and App now available for SP 2010 & 2013 On-premise and Office 365!!

    What is it?

The “Filter My ListView” Web Part / App is a SharePoint web part that enables you to create custom filters to find information in a SharePoint list or document library.


    Why do you need it?

In working with SharePoint and with large lists or document libraries containing 100K+ items, users frequently find that there is no usable tool for filtering data.

SharePoint lets us create views, but their functionality doesn’t always meet users’ requirements. The most common complaint is that a list view is static: users can’t modify it on the fly.

On the other hand, the built-in “Filter My List” web part can only filter on data represented in the current view’s columns, and users can’t apply multiple filters or richer criteria (date ranges, filter operators, …) to a list.

All this led to a custom solution that solves these limitations.

    Usage

The “Filter My ListView” Web Part / App is a simple-to-use SharePoint list view filter. It enables you to create a custom filter form composed of all list fields (not only the fields contained in the current list view).

    Supported field types

• Simple text (jQuery UI is used for autocomplete)

• Text with options, which lets the user select the filtering type

• Date

• DateRange

• Boolean

• DropDown list representing the unique values of a field

• User or Group

• Taxonomy Term Picker

• Multi-select CheckBoxList

    The “Filter My ListView” Web Part / App builds a filter form using different types of controls:

    • TextBox. “Contains” criteria filter
    • TextBox with autocomplete
    • TextBox with options. Allows user to choose filter criteria that can be one of these:
      • Equals
      • Not equals
      • Contains
      • Begins with
    • Date
    • Date Range
    • DropDownList
    • DropDownList with multiple selection
    • People picker
    • MetaData picker

The relation between each field type and its supported filter types is documented in a compatibility matrix.

    Contact me now through my blog, https://sharepointsamurai.wordpress.com or at tomas.floyd@outlook.com for this and more SharePoint and Office 365 custom developed Web Parts and Apps

    Example of how to use the SAP NetWeaver Gateway in building a Cloud App

    Overview

SAP provides a tool called SAP NetWeaver Gateway that enables you to expose SAP application data as an OData service. This OData service can then be used by a CBA to create custom line of business apps. SAP has several sample gateway services you can use for testing and app building. For our example, we will use the SAP Enterprise Procurement Model (EPM) service. Read the SAP documentation to learn how to get access to the EPM service and other sample services from SAP. Be aware that these sample services are read-only; however, NetWeaver Gateway does support read-write services.

Our SAP CBA app will be based on a fictional company that sells computers and accessories. This company has several locations worldwide, including a distribution branch that we will be building a line of business app for, named Contoso Shipping Management. Specifically, our app will help the branch manager of Contoso Shipping Management with their daily tasks. The branch manager routinely views product information in the system and adds supplemental product information that is specific to their branch (such as the item location and whether items are out of stock).

    Define the data model

Begin by creating a CBA app in Visual Studio: choose the Cloud Business App project template under the Office/SharePoint > Apps node.

    Attach to SAP Data Source

    When you have created the app, attach it to the SAP service.

1. In the Server Explorer, under the Server project, choose Data Sources > Add Data Source.
2. In the Attach Data Source Wizard, notice the option to select SAP as a data source; after selecting it, choose Next.
  Figure 1. Select SAP in the Attach Data Source Wizard
3. On the Enter Connection Information page, enter the URL to the SAP EPM service along with the credentials that you received after signing up for access to the test feeds; choose Next. Although it is possible to select None for the authentication type, typically SAP feeds are configured to require authentication (CBA apps currently support connecting to SAP using basic authentication). For more information, see the Authentication section at the end of this post.
  Figure 2. Enter connection information in the Attach Data Source Wizard
4. On the Choose your Entities page, select the BusinessPartner and Product entities and rename the data source to SAP_EPM_Service; choose Finish.
  Figure 3. Select the BusinessPartner and Product entities in the Attach Data Source Wizard

    As a result, you will now see the SAP_EPM_Service added as a data source to your Server project including the BusinessPartner and Product entities that you selected.

Figure 4. Entity Designer showing the Product entity

You can use the Ctrl + Up Arrow/Down Arrow keys to change the order of the properties. It is useful to define the desired order on the entity so that, later, when screens are created, the fields on the screen will automatically be added in the same order. For example, you may change the order of the properties so that ProductId, Name, ProductURL, and Description appear first.

    One feature of an SAP data source within a CBA is the recognition of certain SAP-specific annotations that can adorn entity properties within the service. Specifically, the annotations that will be recognized by a CBA are those that have the sap:semantics value set to “email”, “tel”, or “url”. The BusinessPartner entity that was selected in the Attach Data Source Wizard happens to have properties that demonstrate all three of these annotations. You can view the semantic annotations by viewing the $metadata from the SAP feed.

    https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV/$metadata


    Viewing the BusinessPartner entity in the Entity Designer, observe that the EmailAddress, PhoneNumber, and WebAddress fields have their respective types set to the Email Address, Phone Number, and Web Address business types.

Figure 5. Properties on the BusinessPartner entity have been set to the appropriate business type
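If you want to scan a feed for those annotations without opening a browser, a small PowerShell sketch like the following works. The URL and credentials are the sample-feed values discussed above; the namespace URI is the standard SAP data protocol namespace, and the exact XPath is an assumption worth verifying against your feed:

# download the $metadata document and list elements carrying sap:semantics annotations
$cred = Get-Credential  # credentials for the SAP sample feeds
$uri = "https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV/`$metadata"
[xml]$metadata = (Invoke-WebRequest -Uri $uri -Credential $cred).Content
$sapNs = "http://www.sap.com/Protocols/SAPData"
$metadata.SelectNodes("//*[@*[local-name()='semantics']]") | ForEach-Object {
    "{0}: sap:semantics={1}" -f $_.GetAttribute("Name"), $_.GetAttribute("semantics", $sapNs)
}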

    Extend the Product Entity Properties

    For our line of business app, we need to track some additional product information that is specific to our branch, Contoso Shipping Management. With a CBA, we can easily extend any entity properties by relating data from the internal database of the app with data from an external data source. Furthermore, CBAs support relating data between external data sources, such as SharePoint\Office 365 and SAP.

    Add a Relationship

    In our example, we would like to track two additional pieces of product information: the item location and whether a product is out of stock. First, we need to add an entity to the internal database of our app that will be used to store this additional information. Second, we will relate this entity to the Product entity (that exists in the SAP data source) using a one-to-one or zero-to-one relationship.

1. To add an entity in the internal database, choose the Data Sources node in the Solution Explorer and select Add Table.
2. Rename the table to ProductDetail and add the following properties: OutOfStock and BackroomLocation. Clear the Required check box for these properties.
  Figure 6. The ProductDetail entity
3. Add a relationship between Product and ProductDetail. To do this, open Product in the entity designer and choose the Add: Relationship button.
  Figure 7. Adding a relationship
4. In the Add New Relationship dialog box, add a relationship so that each Product can have one ProductDetail (and a ProductDetail must have a Product). Choose OK to close the dialog box.
  Figure 8. Configuring the relationship

    Create the client screens

    Now that the data model is defined, add some screens to the app. While working with the screens, remember that the sample SAP service that we are using is read-only. As a result, the only data that we can edit is the ProductDetail entity because it is stored in the internal database of the app. If instead we were using a read-write SAP service, we would be able to edit all of the information on these screens and automatically save the data back to SAP.

    Create the Common Screen Set

1. Choose the Screens node in the Solution Explorer and select Add Screen.
2. In the Add New Screen dialog box, select the Common Screen Set template and set the Screen Data to SAP_EPM_Service.ProductCollection. Finally, choose OK to close the dialog box.
  Figure 9. Adding a new Common Screen Set

  As a result, you will now see a ProductCollection folder created in the Solution Explorer that contains three screens: AddEditProduct, BrowseProductCollection, and ViewProduct.

3. Since we defined a one-to-one or zero-to-one relationship between Product and ProductDetail, we need to add code that automatically creates a new ProductDetail entity when the AddEditProduct screen is opened. Choose the AddEditProduct screen from the Solution Explorer, choose the Write Code drop-down in the Screen Designer toolbar and choose “created”.
  Figure 10. Writing “created” code on the AddEditProduct screen

      To create a ProductDetail instance for this Product instance, use the following code.

  myapp.AddEditProduct.created = function (screen) {
      if (!screen.Product.ProductDetail) {
          var productDetail = myapp.activeDataWorkspace.ApplicationData.ProductDetails.addNew();
          productDetail.Product = screen.Product;
      }
  };

      Now when the AddEditProduct screen is opened, a related ProductDetail is created, if one does not already exist.

4. To make the fields that were added from the ProductDetail entity more prevalent on the screen, move the Out of Stock and Backroom Location controls to the top of the right Rows Layout group on this screen. Do the same on the ViewProduct screen as well. Notice that you can also change other appearance properties on controls, such as the font. There are many other properties that you can set on the screen designer to customize the screens.
5. Also on the ViewProduct screen, drag out the Supplier field under our ProductDetail controls to show data from the BusinessPartner entity. This will create a group called Supplier (with properties from the related BusinessPartner entity). For this example, remove all the fields from this group except Email Address, Phone Number, and Web Address. Notice that these controls appear respectively as an Email Viewer, Phone Viewer, and Web Address Viewer (due to the annotations feature described earlier).
  Figure 11. Control layout
6. Finally, we want to display the Product images on the ViewProduct screen. First change the Product Pic Url control to be an Image control. To do this, choose the ViewProduct screen in the Solution Explorer, then find the Product Pic Url control on the screen and change it from a Text control to an Image control.
  Figure 12. Changing Product Pic Url to an Image control

      Because this SAP service stores the image URLs in a relative format, we need to write more code to set the full URL to the image. With the Product Pic Url control still selected, choose the Edit PostRender Code link in the Properties window.

  Figure 13. Properties of the Product Pic Url control

      Add the following code to the PostRender method.

  myapp.ViewProduct.ProductPicUrl_postRender = function (element, contentItem) {
      // add the URL of our SAP server to the relative ProductPicUrl
      var totalUri = "https://sapes1.sapdevcenter.com" + contentItem.value;
      $(element).find("img").attr("src", totalUri);
  };

    Run the app

    Now, run the app (press F5).

    If prompted, enter your SharePoint credentials. When the app starts, also choose Trust It (if prompted).

    Notice the following:

    • The app home screen allows you to browse Product data from the attached SAP data source.
• When you choose a Product, the detailed Product information is displayed—this includes fields from both the attached SAP data source and the fields defined on the intrinsic ProductDetails entity.
  Figure 14. The ViewProduct screen
• On the ViewProduct screen, you can choose the Edit button to open the AddEditProduct screen. Since the particular SAP service that the app is accessing is a read-only service, the fields defined on the SAP Product entity cannot be updated, but those defined on ProductDetail can be. If your app is accessing a read-only service, it is a good idea to remove the Add button on the ViewProduct screen and make the controls on the AddEditProduct screen “view” controls instead of “edit” controls.
  Figure 15. The AddEditProduct screen

    Additional notes

    Authentication

    SAP can be configured with a variety of authentication providers. For this release, we support HTTP Basic authentication. Basic authentication is enabled on most NetWeaver Gateway installations, and is easy to configure in both test and production. If you find that CBAs aren’t working with your SAP environment, let us know how we can support you better in the future.

    For our debut of SAP support, we wanted to enable a complete read and write scenario with SAP data. Therefore, in addition to basic authentication support, we’ve also implemented the session and CSRF token handling that SAP requires to be able to modify SAP data via the NetWeaver Gateway. This means that your CBAs will be able to write changes back to SAP if your SAP feeds support it. Don’t worry—when you attach to SAP data, we negotiate the SAP sessions and tokens automatically. There is nothing for you to configure.

    Non-addressable entities

    If your data fails to load and you see a diagnostics error similar to the following, it’s because you’re attempting to navigate directly to an SAP entity that is non-addressable.

    Error: The SalesOrderLineItemCollection is not addressable. Please use the Navigation Property via the SalesOrder Collection or Entity.

    For example, in the EPM service, the SalesOrderLineItemCollection entity set is marked as non-addressable which is evident in the service root document (for example, https://sapes1.sapdevcenter.com/sap/opu/odata/sap/ZGWSAMPLE_SRV).


    This means that SalesOrderLineItemCollection is a child of SalesOrderCollection and that you can only access it by navigating to it through the parent.

    To solve this, be sure that you do not have a Browse Data Screen that is bound directly to a non-addressable entity (for example, SalesOrderLineItem). Instead, bind the Browse Data Screen to the parent (for example, SalesOrder) and create a View Details Screen that displays the SalesOrderLineItems data for the selected SalesOrder. This is done for you if you use the Common Screen Set template and select the parent as the primary entity for the Browse Data Screen.

    Free integration guide -Microsoft Dynamics CRM Online and Office 365

    Combining the online services of Office 365 with Microsoft Dynamics CRM Online empowers your teams to work where and when they want with best-of-class cloud services.

This guide is intended for Microsoft Dynamics CRM administrators and technical decision makers interested in exploring Office 365 services and how they integrate with Microsoft Dynamics CRM Online. Integration with Office 365 becomes increasingly relevant to Microsoft Dynamics CRM Online users as management of Microsoft Dynamics CRM Online shifts to the Microsoft online services environment.

For a .pdf version of this document, Integration Guide: Microsoft Dynamics CRM Online and Office 365, please visit http://download.microsoft.com/download/D/4/F/D4F5A3C3-E3CB-48C9-85DE-4ED0B7FFBD60/CRMO365Integration.pdf

    Some of what this paper covers:

    • Add an Office 365 trial subscription to Microsoft Dynamics CRM Online
    • Set up CRM Online to use Exchange Online
    • Set up CRM Online to use SharePoint Online
    • Set up CRM Online to use Lync Online
• Set up CRM Online to use Yammer

    New Office 365 Tool available to help you re-design for the App Model

    Learn about a tool that analyzes your SharePoint full-trust code solutions and Office add-ins and macros to help you redesign them for the app model. Security is important to us—your code remains private while using the tool.

    The app model is a great tool that fully embraces the benefits of moving to the cloud, but migrating to the model can be a time-consuming task. SharePoint is a complex enterprise-level collaboration system, and custom solutions built on top of the SharePoint platform using full trust code don’t easily map to a cloud-based deployment. Similarly, Office client solutions – managed add-ins and VBA macros built on individual client object models – are widely deployed on desktops and need to be ported to work in the cloud. We understand that creating these solutions required a significant investment. We want to help you translate these solutions to cloud-friendly apps as painlessly as possible.

The SharePoint and VBA Code Analyzer is a tool to help you understand how you can refactor your SharePoint and Office client solutions for Office 365. Working with Mobilize.net, one of our long-standing partners, we’ve created a web portal where you can upload your SharePoint and Office client solutions and get a complete analysis of the existing code. We’ll provide guidance and recommendations on what level of effort needs to be invested to move them to the cloud, so you can start refactoring your custom business solutions as soon as possible.

    “But, wait!” you think. “I can’t send my company’s code where external parties look at it!” No worries—we have put several security measures in place to prevent unauthorized access, and the code runs through a completely black-box process. The analysis is done with automated tools which only collect metadata about files, lines of code, ASP.NET application pages, web parts, libraries, workflows, and other platform-dependent objects. We then use this data to generate reports on how you can map your existing code to the new model.

    The tool is also hosted behind a digital certificate-enabled site, which ensures that everything that goes across the wire to our black box process is encrypted.

    Brand NEW “My Latest Documents” SharePoint Web Part and App released and available!!

In each SharePoint team site with multiple document libraries, the requirement is always there to see what the latest changes are. Unfortunately, the out-of-the-box Microsoft web part only shows documents changed by the current user.

To provide a solution that doesn’t require you to be an administrator or owner, I created a definition for a recent-documents web part. It can be deployed on Office 365, SharePoint 2010 and SharePoint 2007 sites. On SharePoint 2007 and SharePoint 2010, the only right needed is permission to modify the site.

    The goal of the web part was:

    • Show the 10 latest changed documents
    • Show a more button that displays additional 40 documents
    • Display the online status of users
    • Display the correct date format of each site
    • Display the name of the folder where the document is stored and a link to the folder.
    • Get documents recursively from all sub sites

    Example Image:

    The following instructions explain in detail how you can activate it:

     

    Activation SharePoint 2010

1. Edit your web page and add a new web part
2. Select Browse and upload the web part definition
3. Click Upload
4. Now, it’s a bit confusing, but you have to click Add new web part again
5. The uploaded web part is now available in your web part menu, and you can add it.

All these steps have to be done each time you want to add the web part. To make it available to all site owners, add it to the web part gallery.

    Activation on Office 365

On Office 365, you have to upload it to your web part gallery:

After uploading, the web part is available on your site.

You can simply edit the site and click More Web Parts.

Afterwards, you can find it in the Default Web Parts folder.

     

    Contact me NOW at tomas.floyd@outlook.com to order this brand new Web Part and/or App

    Create a new Search Service Application in SharePoint 2013 using PowerShell

The search architecture in SharePoint 2013 has changed quite a bit compared to SharePoint 2010. In fact, the Search Service in SharePoint 2013 has been completely overhauled; it is a combination of FAST Search and SharePoint Search components.


The query and crawl topologies are now merged into a single topology, simply called “Search topology”. Provisioning the search service application creates 4 databases:

    • SP2013_Enterprise_Search – This is a search administration database. It contains configuration and topology information
    • SP2013_Enterprise_Search_AnalyticsReportingStore – This database stores the result of usage analysis
    • SP2013_Enterprise_Search_CrawlStore – The crawl database contains detailed tracking and historical information about crawled items
    • SP2013_Enterprise_Search_LinksStore – Stores the information extracted by the content processing component and also stores click-through information

    # Create a new Search Service Application in SharePoint 2013

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Settings
$IndexLocation = "C:\Data\Search15Index" # Location must be empty; it will be deleted during the process!
$SearchAppPoolName = "Search App Pool"
$SearchAppPoolAccountName = "Contoso\administrator"
$SearchServerName = (Get-ChildItem env:computername).value
$SearchServiceName = "Search15"
$SearchServiceProxyName = "Search15 Proxy"
$DatabaseName = "Search15_AdminDB"

Write-Host -ForegroundColor Yellow "Checking if Search Application Pool exists"
$SPAppPool = Get-SPServiceApplicationPool -Identity $SearchAppPoolName -ErrorAction SilentlyContinue

if (!$SPAppPool)
{
    Write-Host -ForegroundColor Green "Creating Search Application Pool"
    $spAppPool = New-SPServiceApplicationPool -Name $SearchAppPoolName -Account $SearchAppPoolAccountName -Verbose
}

# Start the search service instances
Write-Host "Starting Search Service instances..."
Start-SPEnterpriseSearchServiceInstance $SearchServerName -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $SearchServerName -ErrorAction SilentlyContinue

Write-Host -ForegroundColor Yellow "Checking if Search Service Application exists"
$ServiceApplication = Get-SPEnterpriseSearchServiceApplication -Identity $SearchServiceName -ErrorAction SilentlyContinue

if (!$ServiceApplication)
{
    Write-Host -ForegroundColor Green "Creating Search Service Application"
    $ServiceApplication = New-SPEnterpriseSearchServiceApplication -Partitioned -Name $SearchServiceName -ApplicationPool $spAppPool.Name -DatabaseName $DatabaseName
}

Write-Host -ForegroundColor Yellow "Checking if Search Service Application Proxy exists"
$Proxy = Get-SPEnterpriseSearchServiceApplicationProxy -Identity $SearchServiceProxyName -ErrorAction SilentlyContinue

if (!$Proxy)
{
    Write-Host -ForegroundColor Green "Creating Search Service Application Proxy"
    New-SPEnterpriseSearchServiceApplicationProxy -Partitioned -Name $SearchServiceProxyName -SearchApplication $ServiceApplication
}

Write-Host $ServiceApplication.ActiveTopology

# Clone the default topology (which is empty), add components to it, and then activate it
Write-Host "Configuring Search Component Topology..."
$clone = $ServiceApplication.ActiveTopology.Clone()
$SSI = Get-SPEnterpriseSearchServiceInstance -Local
New-SPEnterpriseSearchAdminComponent -SearchTopology $clone -SearchServiceInstance $SSI
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $clone -SearchServiceInstance $SSI
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $clone -SearchServiceInstance $SSI
New-SPEnterpriseSearchCrawlComponent -SearchTopology $clone -SearchServiceInstance $SSI

Remove-Item -Recurse -Force -LiteralPath $IndexLocation -ErrorAction SilentlyContinue
mkdir -Path $IndexLocation -Force

New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $SSI -RootDirectory $IndexLocation
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $clone -SearchServiceInstance $SSI
$clone.Activate()

Write-Host "Your search service application $SearchServiceName is now ready"

    Update

    To configure failover server(s) for Search DBs, use the following PowerShell:

    Thanks to Marcel Jeanneau for sharing this!

# Admin database
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"
Set-SPEnterpriseSearchServiceApplication -Identity $ssa -FailoverDatabaseServer <failoverserveralias\instance>

# Crawl database
$CrawlDatabase0 = ([array]($ssa | Get-SPEnterpriseSearchCrawlDatabase))[0]
Set-SPEnterpriseSearchCrawlDatabase -Identity $CrawlDatabase0 -SearchApplication $ssa -FailoverDatabaseServer <failoverserveralias\instance>

# Links database
$LinksDatabase0 = ([array]($ssa | Get-SPEnterpriseSearchLinksDatabase))[0]
Set-SPEnterpriseSearchLinksDatabase -Identity $LinksDatabase0 -SearchApplication $ssa -FailoverDatabaseServer <failoverserveralias\instance>

# Analytics database
$AnalyticsDB = Get-SPDatabase -Identity <analytics reporting database name or ID>
$AnalyticsDB.AddFailOverInstance("failover alias\instance")
$AnalyticsDB.Update()

    You can always change the default content access account using the following command:

$password = Read-Host -AsSecureString
Set-SPEnterpriseSearchService -Identity "SSA name" -DefaultContentAccessAccountName Contoso\account -DefaultContentAccessAccountPassword $password

Look out for my PowerShell Web Part and Google Analytics Web Part and App that are under development and available soon for purchase!


    Must have Tool for BizTalk : Introduction to BizTalk 360

BizTalk 360 was announced as a public technology preview yesterday; you can read more about it at http://www.biztalk360.com.

    Why BizTalk 360?

There is one common problem across all BizTalk customers: there is no proper support tool for BizTalk. The reality is that people are more passionate about and interested in designing, architecting, and developing software, and not enough attention is given to the afterlife of an application once it reaches production. It’s very important to focus on a production application, which represents the customer’s business and credibility. BizTalk 360 is all about managing applications that are in production. There are some common problems most BizTalk Server customers face, and BizTalk 360 tries to address them.

    Introduction

BizTalk 360 is a web-based rich internet application built using Microsoft Silverlight and WCF. Because it is web based, you can deploy it centrally, with no need to install the BizTalk administration components on each support person’s desktop. It also comes with a rich authorization module that allows fine-grained authorization for support people: no more remote desktop (RDP) access to all production servers; restrict users to only a few BizTalk applications, or even provide only read-only access to the environment. The choice is yours.

Here is a brief introduction to some of the key modules of BizTalk 360.

    BizTalk Environment Dashboard


This is the home screen of the application. As soon as you access an environment, you are presented with the environment dashboard showing the various parts of the system and their health status. It shows the number of applications, hosts, message boxes, etc. in the environment. Green and red represent health status: red is error, green is healthy. If there are any suspended instances in the environment, a red bar will appear with the total count of suspended instances and the last suspended date and time.

    BizTalk Application Dashboard


The BizTalk application dashboard provides a single view into a particular BizTalk application. The artifact headings are colour coded to represent their overall health. If there are suspended instances, they will be highlighted on the dashboard.

    Diagrammatic representation of Send Port


A picture is worth more than 1,000 words. BizTalk 360 comes with a diagrammatic representation of send ports (only send ports for V1; the road map includes other artifacts like orchestrations, receive ports, etc.).

    Artifact Properties


BizTalk 360 allows you to view the properties of all the BizTalk artifacts: send ports, receive ports, receive locations, maps, schemas, applications, pipelines, etc.

    Governance/Auditing


Lack of tooling around “who did what” in the system is the number one challenge. In a production application it’s vital to log this information for various reasons. BizTalk 360 handles it seamlessly and provides out-of-the-box views that put this information at your fingertips.

    User Access Policy

If you are running one of your “AAA”-rated applications on BizTalk Server, you want to control user access in a fine-grained way. You don’t want a junior analyst supporting the application to have administrator rights on the production environment. You want to control things like:

    • Giving only read-only access to certain people,
    • Blocking certain areas of the application like, not allowing user to suspend/terminate instances,
    • Restricting users to only few BizTalk applications etc.
    • Restricting users from starting/stopping application artifacts like (send port, receive location, orchestration) and host instances etc.

BizTalk 360 takes care of all these requirements seamlessly, with a full admin panel to control and audit them.

    Topology


BizTalk Server is targeted at enterprise customers, so in most cases a typical BizTalk environment will have at least 4 servers (2 BizTalk, 2 SQL) to support high availability. It’s vital to know your topology at any time without digging through out-of-sync Visio documents. BizTalk 360 provides a graphical view of your topology, dynamically generated from your environment.

    Advanced Event Viewer


Another problem application support people face when maintaining a BizTalk production environment is probing through event logs across multiple servers in the group to diagnose problems. This is time consuming, and giving support people full access to the servers is also a security risk. BizTalk 360 tackles this issue with centralized event viewer functionality: it understands the topology of the environment, pulls in all the event log information, and presents it in a central place. The query builder is powerful, as shown in the picture above.

    Business Activity Monitoring portal


BizTalk 360 also comes with a simple BAM portal, which allows users to get to BAM data from a single UI. Users can search for activities, see permissions, and view activity time windows. BizTalk 360 utilizes the security mechanism provided by the BizTalk BAM infrastructure, so it nicely complements your existing BAM investment.

    Host Container View


A BizTalk host is a logical container for various BizTalk artifacts. When it comes to scalability, administrators normally create multiple hosts and host instances and deploy them on various BizTalk servers. BizTalk 360 provides a Host Dashboard, which allows users to see what’s running inside a host (host instance) at any point in time. In the picture above, you can see it lists the orchestrations, send ports, and receive ports that run inside a particular host.

     

    BizTalk Server Characteristics


In a typical production BizTalk environment you’ll have 2 or more servers performing various activities like sending, receiving, processing, and tracking, based on your requirements.

BizTalk 360’s BizTalk Server dashboard shows the characteristics of each server and clearly shows how the server is being utilized. Apart from this, the server dashboard shows various other things like the host instances running on the server, event viewer data, whether it is a web server, etc.

     

    Advanced Windows User/Role based Authorization

Define your own NT roles and dictate how users can access the environment.

     
Restrict users/groups to limited applications

    You may want to restrict users to certain application(s) in a shared environment.

     
Restricted view for certain users/groups

Make the environment look as simple as what’s shown here: just a couple of applications with Topology. It’s fully customizable.

     
Read-Only Access

Note that there are no buttons to start/stop anything in the Send Port view.

     
Super User View

Of course, super users and users with the correct rights can start/stop…

     
Restriction to resume/terminate instances

You don’t want all of your support staff to resume/terminate instances. What happens if someone terminates that $1 million message?

     
Restricted access to messages

If you are dealing with confidential messages (health records, multi-million-dollar deals, etc.), you don’t want all your support people to be able to view them. Then please restrict it!

     

    Query Service Instances

     

    Rich Query Builder

Build your complex queries using the user-friendly query builder.

     

    Query Results (with KB)

See the service instance details with the ability to resume/terminate the instance. A knowledge base article can be attached to any service instance with an error code.

     


    Service Instance Details

The Service Detail window shows the complete details of the service instance, like error information, referenced messages, etc.

     
     
     
Prepackaged Queries

    BizTalk360 comes with a set of useful admin queries out of the box. We will be adding more queries in the upcoming releases.

     
Add/Modify/View/Delete Custom SQL Queries

Administrators normally keep a bunch of SQL queries in their toolbelt for their regular operations. With BizTalk360, these are managed completely within the UI. Only SELECT statements are allowed; the system will reject any other statements like INSERT or UPDATE.

     
Execute SQL Queries within the UI

Users don’t need access to external SQL tools like SQL Server Management Studio to run the custom queries. They can execute them and see the results directly within the UI. This avoids having to give users rights on the various SQL servers.

     
Permission

At the flick of a button, a super user can grant or revoke permissions for users to manage and execute custom SQL queries.

       
     

Contrary to popular belief – The SAME skills are used to develop SharePoint On-premise and SharePoint Online Apps

Taking an on-premise application and deploying it on a Windows Azure Virtual Machine should be straightforward. The majority of required modifications involve configuration changes to accommodate differences in the Virtual Machine’s environment.

    Going to the cloud on a single Virtual Machine is like crossing the ocean on a row boat. You’ll probably get there, but I can’t guarantee anything.


    Once you’ve deployed your application to the cloud, things get interesting because you have opportunities that allow you to solidify your application.

    If you are building your application on the Cloud instead of building your application for the Cloud, you’re doing it wrong!

Taking advantage of services offered on Windows Azure allows you to concentrate on creating value for your business. The Windows Azure teams take care of lots of boring stuff like disaster recovery. To me, building applications for the cloud is the equivalent of going from a row boat to a fleet of navy destroyers. (Alright, I may be a little overconfident, but you get the picture.)


    Preparing applications to go web scale isn’t a trivial task. Each application is unique and requires different services. You will need to go over your objectives and build accordingly.

    Although the planning phase isn’t trivial, augmenting your applications with cloud services isn’t as complicated as some might think. The Windows Azure teams provide an amazing amount of support materials like hands on labs, tutorials and a rich collection of documentation.

    Architectural Considerations

    Throughout the planning phase of an application on the cloud there are a few architectural strategies that require some extra attention.

    Cloud Design Patterns

While designing or augmenting cloud-based applications, the following patterns should be considered. This list isn’t complete, but it should serve as a starting point.

    For more patterns, take a look at the resources listed at the bottom of this post.

• Cache-aside Pattern – This is a common technique that we can use to improve the performance and scalability of a cloud solution by temporarily copying frequently accessed data to fast storage located close to the application (see the sketch after this list).
• Queue-based Load Leveling Pattern – Cloud solutions are subjected to very unpredictable loads and require protection against their own success. By placing queues between clients and the workers who execute tasks, you protect yourself against spikes.
• Competing Consumers Pattern – Enable multiple Cloud Service instances to retrieve messages from the same source.
• Compute Resource Consolidation Pattern – Consolidate multiple tasks or operations into a single computational unit (Roles, Virtual Machines, Web Sites).
• Eventual Consistency – Cloud solutions use data that’s dispersed and duplicated across data stores; managing and maintaining data consistency can become a major bottleneck.
• Leader Election Pattern – A great way to coordinate actions performed by a group of Cloud Service instances is to elect a leader that can act as the coordinator. This is extremely useful for maintenance tasks and singleton tasks that need fallbacks.
• Materialized View Pattern – This has got to be one of my favorite cloud patterns. The solution’s data may not be formatted in a way that favors our query requirements. In order to optimize our queries, it may be desirable to generate pre-populated views whose shapes correspond with our requirements.
• Pipes and Filters Pattern – We should strive to decompose complex tasks into a series of discrete elements that can be reused.
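To make the Cache-aside pattern concrete, here is a minimal sketch in C#. It is only an illustration under assumptions of my own, not a canonical implementation: it uses the in-memory MemoryCache from System.Runtime.Caching, and the Product type and LoadProductFromDatabase helper are hypothetical stand-ins for your real data store. In a multi-instance Windows Azure deployment you would typically swap the local cache for a distributed one.

using System;
using System.Runtime.Caching;

public class Product
{
    public string Id { get; set; }
}

public static class ProductCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Cache-aside: try the cache first; on a miss, load the item from
    // the data store and populate the cache for subsequent readers.
    public static Product GetProduct(string productId)
    {
        var product = Cache.Get(productId) as Product;
        if (product == null)
        {
            product = LoadProductFromDatabase(productId); // hypothetical data-store call
            Cache.Set(productId, product,
                new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5) });
        }
        return product;
    }

    private static Product LoadProductFromDatabase(string productId)
    {
        // Placeholder for the real database or storage query.
        return new Product { Id = productId };
    }
}

The expiration window bounds how stale a cached entry can get; pick a value that matches how quickly the underlying data changes.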

Succeeding in the cloud is all about architecture and using the right service for the right reasons. This can be a challenge because cloud platforms are continuously evolving. Windows Azure is currently (Jan 2014) on a 3-week release cycle, and it can be quite a challenge for all of us to keep up. Fortunately, there are blogs, podcasts, and online courses that help us along the way.

    Learn More

    My Subscriptions Alert – New SharePoint 2013 Web Part available

This web part shows the currently logged-in user which lists or libraries he or she has subscribed to.

It shows a gold bell icon beside the list name, which means you have subscribed to that list.

If there is a silver bell beside the list name, you have not subscribed to that list. To subscribe, click the list name and the out-of-the-box SharePoint “New Alert” modal dialog will pop up.

In this dialog you also have many options for when to receive alerts and on exactly what changes.

In the tool part of the web part, the user can select which lists on the current site (among those the user has permission to see) are displayed in the web part for subscribing.

Below is a screenshot:

    Contact me at tomas.floyd@outlook.com for more information on this and other custom developed SharePoint Web Parts and Apps

    Gulp.js : A Better Alternative to Grunt.js

    Getting Started with Gulp.js

Like Grunt.js, Gulp.js is provided as an npm module, so you can install it through npm. In order to work with Gulp.js, you need to install Gulp both globally and locally for your project.

The command below installs gulp globally on your machine.

npm install -g gulp

You also need to install gulp locally for your project. If you don’t have a package.json file for your project, just create one.

The command below installs gulp locally and automatically adds the dependency to the devDependencies section of package.json.

    npm install --save-dev gulp

Now we have installed Gulp.js. The next step is to add a Gulp file named “gulpfile.js” for automating our project tasks. Let’s add a gulpfile.js file. The code block below shows a basic gulpfile.js where we just specify a task named “build” using the Node.js gulp module.

    var gulp = require('gulp');
    
    gulp.task('build', function () {
    });

    Automating Tasks with Gulp.js

There are lots of useful plugins available for Gulp.js that cover most commonly automated tasks. At the time of writing this post, there are 310 gulp plugins available. All of them can be installed by using npm.

    Minifying JavaScript Code

Let’s automate the tasks in the AngularJS SPA reference app webapi-angularjs-spa, starting with minifying our JavaScript code files. For this task, we are using the following gulp plugins:

    • gulp-uglify
    • gulp-concat
    • gulp-size

The above plugins are used for minifying, concatenating the minified files, and getting the size of the minified file, respectively.

    Let’s install these npm modules.

    npm install --save-dev gulp-uglify gulp-concat gulp-size

The code block below shows a task named “app-js-minify”, which minifies all the JavaScript files of the AngularJS app.

var filePath = {
    appjsminify: { src: './app/**/*.js', dest: './app' },
    libsjsminify: { src: ['libs/**/*.js', '!*.min.js', '!/**/*.min.js'], dest: 'libs/' },
    jshint: { src: './app/**/*.js' },
    minifycss: { src: ['../Content/themes/**/*.css', '!*.min.css', '!/**/*.min.css'], dest: '../Content/themes/' }
};
    
    gulp.task('app-js-minify', function () {
        gulp.src(filePath.appjsminify.src)
            .pipe(uglify())
            .pipe(concat('ngscripts.js'))
            .pipe(size())
            .pipe(gulp.dest(filePath.appjsminify.dest));
    });

In the above code block, we specify the files to be minified using gulp’s src method. We use the .pipe() method to pipe the source files into a gulp plugin. The concat function bundles all minified files into a file named “ngscripts.js” in the destination folder specified by the dest function. size reports the size of the minified “ngscripts.js” file. We have specified all file paths in the filePath variable. By default, tasks in Gulp.js are executed asynchronously.

Let’s run our first task, “app-js-minify”, by running the command “gulp app-js-minify” as shown in the figure below.


    Automating Unit Tests with Mocha

    Let’s automate our unit tests with Gulp. We use the gulp plugin “gulp-mocha” for automating unit tests with Mocha.

var mocha = require('gulp-mocha');

gulp.task('mocha', function () {
    gulp.src('./test/*.js')
        .pipe(mocha({ reporter: 'list' }));
});

    Minifying CSS

We use the gulp plugin “gulp-minify-css” for minifying CSS files. In this task we also use the gulp plugin “gulp-rename” to rename the file.

    gulp.task('minify-css', function () {    
        gulp.src(filePath.minifycss.src)
        .pipe(minifycss())
        .pipe(rename({ suffix: '.min' }))
        .pipe(gulp.dest(filePath.minifycss.dest));
    });

    The below gulpfile is taken from our AngularJS SPA demo app.

    var gulp = require('gulp'),
        gutil = require('gulp-util'),
        uglify = require('gulp-uglify'),
        jshint = require('gulp-jshint'),
        concat = require('gulp-concat'),
        jshintreporter = require('jshint-stylish'),
        minifycss = require('gulp-minify-css'),
        size = require('gulp-size'),
        clean = require('gulp-clean'),
        rename = require('gulp-rename');
    
var filePath = {
    appjsminify: { src: './app/**/*.js', dest: './app' },
    libsjsminify: { src: ['libs/**/*.js', '!*.min.js', '!/**/*.min.js'], dest: 'libs/' },
    jshint: { src: './app/**/*.js' },
    minifycss: { src: ['../Content/themes/**/*.css', '!*.min.css', '!/**/*.min.css'], dest: '../Content/themes/' }
};
    
    gulp.task('app-js-minify', function () {
        gulp.src(filePath.appjsminify.src)
            .pipe(uglify())
            .pipe(concat('ngscripts.js'))
            .pipe(size())
            .pipe(gulp.dest(filePath.appjsminify.dest));
    });
    
    gulp.task('libs-js-minify', function () {
        /*Excludes already minified files.*/
        gulp.src(filePath.libsjsminify.src)
        .pipe(uglify())
        .pipe(rename({ suffix: '.min' }))
        .pipe(gulp.dest(filePath.libsjsminify.dest));
    });
    
    gulp.task('jshint', function () {
        gulp.src(filePath.jshint.src)
          .pipe(jshint())
          .pipe(jshint.reporter(jshintreporter));
    });
    
    gulp.task('minify-css', function () {
        /*Excludes already minified files.*/
        gulp.src(filePath.minifycss.src)
        .pipe(minifycss())
        .pipe(rename({ suffix: '.min' }))
        .pipe(gulp.dest(filePath.minifycss.dest));
    });
    
gulp.task('clean', function () {
    gulp.src(
        [
            'app/ngscripts.js',
            'libs/angular-ui/select2.min.js',
            'libs/select2/select2.min.js',
            'libs/semantic/semantic.min.js',
            'libs/jquery-1.9.1.min.js',
            '../Content/themes/semantic/semantic.min.css',
            '../Content/themes/Site.min.css',
            '../Content/themes/select2/select2.min.css'
        ], { read: false })
    .pipe(clean({ force: true }));
});
    gulp.task('build', ['app-js-minify', 'libs-js-minify', 'minify-css']);
    gulp.task('cleanbuild', ['clean']);

You can run a gulp task by specifying the task name. At the end of the above gulpfile, we define two tasks named “build” and “cleanbuild” that group logically related tasks. You pass the grouped tasks as an array of defined tasks.

    gulp.task('build', ['app-js-minify', 'libs-js-minify',
     'minify-css']);
    gulp.task('cleanbuild', ['clean']);

When we run “gulp build”, it will run the tasks “app-js-minify”, “libs-js-minify”, and “minify-css”. When we run “gulp cleanbuild”, it will run the task “clean”. If you simply run gulp without specifying any task, it will look for a task named “default”.

The code block below provides the package.json file of the project:

    {
      "name": "webapi-angularjs-spa",
      "description": "SPA Demo app with AngularJS",
      "author": "Shiju Varghese",
      "version": "0.5.0",
      "repository": 
          "https://github.com/MarlabsInc/webapi-angularjs-spa",
      "dependencies": {},
      "devDependencies": {
        "gulp-util": "~2.2.14",
        "gulp": "~3.5.2",
        "gulp-concat": "~2.1.7",
        "gulp-uglify": "~0.2.1",
        "jshint": "~2.4.3",
        "gulp-jshint": "~1.4.2",
        "jshint-stylish": "~0.1.5",
        "gulp-minify-css": "~0.3.0",
        "gulp-rename": "~1.1.0",
        "gulp-size": "~0.1.3",
        "gulp-clean": "~0.2.4"
      }
    }

    Source Code

I have implemented the task automation in our AngularJS demo app, which is available on GitHub at https://github.com/MarlabsInc/webapi-angularjs-spa. The JavaScript client app is available from here. The gulpfile is available from here.

    Building a Cloud Business App: Kudos

    Office 365 is an ideal business app platform providing a core set of services expected in today’s business apps and a central location for installing, discovering, and managing apps. Office 365 makes these business apps available where users already spend their time – in SharePoint and Office.

    Visual Studio 2013 streamlines modern business app development for Office 365 and SharePoint 2013 with the Cloud Business App project. This walkthrough will show how you can build social, touch-centric, cross-platform Office 365 business applications that run well on modern devices.

    What we’re going to build

    In our scenario, let’s say my organization is on Office 365. The company encourages cross-team collaboration and would like to build an app that allows employees to send kudos to fellow employees.

An employee can find the app on SharePoint and launch it in a desktop browser or on different mobile devices. The app allows a user to send kudos to a coworker and shows a list of kudos the user has sent and received.

Figure 1. The kudos app we will be building in this walkthrough

    Let’s build this app with Cloud Business App!

    Create a Cloud Business App project

    Let’s create a Cloud Business App project. We are creating an app for Office 365, so you can find the Cloud Business App template under Office/SharePoint for both VB and C#. This categorization is based on the language used in the middle tier; the client is HTML and JavaScript.

    Let’s name the project KudosApp and choose OK.

Figure 2. Create a new Cloud Business App project in Visual Studio

You first need an Office 365 developer site to start building apps for Office and SharePoint. If you don’t have an account for development, you can sign up for a free 30-day trial at dev.office.com. If you are an MSDN subscriber, you receive the subscription as a benefit.

    Enter your SharePoint development site here and choose OK.

Figure 3. Enter your Office 365 developer site

Once created, you will find that a Cloud Business App is composed of four projects in the solution:

    1. Server project, which is a basic ASP.NET project used to add tables and connect to data sources
    2. Standard SharePoint project, which provides a connection to Office 365
    3. HTML client project, a JavaScript project in which you define the UI for your app
    4. A Cloud Business App project, which ties all the other projects together
Figure 4. Solution Explorer

    Define data

Let’s start by defining the data model for our app. In a Cloud Business App, you can create new tables or attach to external data sources such as SQL, OData, and SharePoint assets. In our scenario, we send and receive kudos, so let’s create a table for kudos. Choose Create new table.

Figure 5. Add a new table in the Table Designer

    Name the table Kudos and add two fields:

    • KudosTo (Person)
    • Message (String)

    The Table Designer provides a set of business types, such as PhoneNumber, Email, and Person. They include specific validation logic and visualizations both in the tooling and runtime.

Figure 6. Add some fields in the Table Designer

    There is a growing trend in integrating social features into modern business applications. Cloud Business App makes it easy by integrating with the SharePoint Newsfeed feature.

    With the title of the Kudos table selected, you can enable social under the Social category in Properties window. Select Post when Created. When a kudos is created, the app will post the activity to Newsfeed.

Figure 7. Enable Social feature in your app

    Create queries

In our app, we want to show kudos sent by me, as well as the kudos I received. We can create two queries for these. Choose the Add Query button in the toolbar of the Table Designer.

Figure 8. Choose “Query” button to add a custom query for this table

In the Query Designer, name the query KudosSent. We want the query to return all kudos created by me, so let’s filter it by setting CreatedBy equal to Current User. Let’s also sort it by the Created field.

Figure 9. Customize the query with the Query Designer

    We will create another query via the context menu of the Kudos table in Solution Explorer.

Figure 10. Add another query via the context menu

This time, we will name the query KudosReceived and filter by setting KudosTo equal to Current User.

Figure 11. Customize another query

    Create a browse screen

    Now that we’ve defined the data model, let’s design the UI for the app. Create a screen via the context menu on Screens node in Solution Explorer.

Figure 12. Add a screen via the context menu

    The Add New Screen dialog box will appear. Cloud Business App provides three screen templates that represent common UI patterns for browsing, viewing, adding, and editing data. Let’s start with a browse screen that shows all kudos sent by me.

    Select Browse Data Screen, name the screen WelcomeToKudos and select KudosSent query as the screen data. Choose OK.

Figure 13. Create a screen by choosing a screen template

A screen is created for you. In the Screen Designer, you see a Screen Content Tree in the middle that represents the visual elements in the UI. Visual elements are bound to data in the Data Members List on the left.

For example, in this screen we have a List visual showing values from the KudosSent data set.

Figure 14. Your UI elements are laid out in the Screen Designer

    We can also choose to render the data set as a Table or a Tile List visual. Let’s use Tile List.

Figure 15. Change the visual control

    The node under the Tile List indicates what fields will show up in a tile.

Figure 16. We will display kudos as a tile list

    Since this is a list of kudos sent by me, let’s delete the Created By field. We will also delete the ModifiedBy and Modified fields.

Figure 17. Customize the tile list

    You may have noticed that Cloud Business App automatically created audit fields for you (Created, CreatedBy, Modified, and ModifiedBy). It is a common requirement in business apps, so the tool handles it for you (you can turn it off in the Table Designer).

    In the tile, the Kudos To field is rendered with a Person Viewer control. We can customize what will show up in the Person Viewer via the Properties window. Change the Display Mode to Name with picture and title.

Figure 18. Customize the look-and-feel of a visual control

    Let’s also change the font and alignment of the Created field. Select Created. In Properties window, change Font Style to Small and Text Alignment to Right.

Figure 19. Customize the font and alignment of a visual control

    Create an add screen

    We have a list of kudos sent by me. Let’s create some UI to add kudos. In WelcomeToKudos screen, add a button in the Command Bar.

Figure 20. Add a button to the screen

    The Add Button dialog box will appear.

Figure 21. Add Button dialog box

You can write your own method for this button using JavaScript code or, as in our case, select from a set of commonly used methods. In the Choose an existing method dropdown menu, select KudosSent.addAndEditNew.

    We are saying that, when the button is chosen, we will add a new record to the KudosSent data set via a new screen we are about to create. Choose OK.

Figure 22. Choose an existing method

    The tool will guide us to create a new screen for adding a kudos. Name the screen SendKudos and choose OK.

Figure 23. Create a screen to add a kudo

    A new screen (SendKudos) is now created.

Figure 24. New screen created in the Screen Designer

    Let’s check what we’ve got so far! Press F5 to run the app.

Figure 25. Run the application

    We have an empty list and an add button on the screen. Let’s add a kudos. Choose Add Kudos. The Send Kudos screen (rendered as a dialog box) will appear.

Figure 26. UI to add a kudo

    Note that all layouts adapt well to different form factors. Resizing the browser window gives you an idea of how the app looks on a phone or tablet. Everything is optimized for touch, but works equally well on a desktop browser using keyboard and mouse.

Figure 27. App in a small form factor

    Customize the UI while running in the browser

In the Send Kudos dialog box, Message is rendered as a text box. We want to change it to a text area. Also, since we only have two fields on the screen, we don’t need to show two columns in bigger form factors. For these types of UI tweaks, I can quickly make the changes without closing the browser and pressing F5 again.

    Go back to the designer (without closing the browser) and change the Message fields to use Text Area control.

Figure 28. Customize the UI in Visual Studio while the app is running

    Let’s also change the KudosTo display mode to show picture and title.

Figure 29. Customize the visual control

    Now, let’s remove the columns. Drag Kudos To and Message out of the columns layout, then delete columns layout.

Figure 30. Customize the UI layout

    Choose Save All in the designer and refresh the browser. Choose Add Kudos again. All the UI changes are now reflected in the app. This provides an efficient iterative design experience.

    Let’s add a Kudos. The Kudos To value can be selected using an auto-complete text box based on Active Directory.

Figure 31. Choose a person from the auto complete text box

    Choose Save and the newly added kudos will appear in the list.

Figure 32. A kudo is created in the app

Notice that you can see additional Office 365 integration here. When you hover your mouse over the person, it shows presence information. You can send an IM or e-mail, or schedule a meeting, right here.

Figure 33. Presence information inside of the tile

    Create a screen tab

    Now we have a list of kudos sent by me. Let’s also add a list of kudos I received. We can show the two lists on the same screen using two different screen tabs.

    Close the browser and return to Visual Studio. Open the WelcomeToKudos screen. Notice our tile list is currently under a screen tab called Kudos List. By default, every screen has one screen tab. The tab UI will not show in the app unless you have more than one screen tab.

Figure 34. By default, there is one screen tab in the screen

    Let’s add another screen tab. Choose the Tabs node and select Add Tab.

Figure 35. Add a new screen tab

    A new screen tab is now added.

Figure 36. New screen tab is created

In the Properties window, change the Display Name of the first screen tab to Kudos Sent and the second screen tab to Kudos Received.

Figure 37. Change the display name of the screen tabs

    Add new data to the screen

Now, we need to add the list of kudos I received under the newly created screen tab. Recall that we created a KudosReceived query earlier. Let’s include that query on the screen. Choose the Add Data Item button in the toolbar.

Figure 38. Add a data member to the screen

    In the Add Data Item dialog box, select KudosReceived and choose OK.

Figure 39. Select a data member to add to the screen

    The query now appears in the Data Members List.

Figure 40. A new data member is added to the screen

    Drag the query under the second screen tab on the Screen Content Tree.

Figure 41. Create UI for the newly added data member

    Like we did with the first list, let’s change the Kudos Received list to a Tile List. Customize the tile to show only Created By, Message, and Created.

Figure 42. Customize the tile list

    Press F5 again to see the changes. Now, there are two screen tabs on the screen.

Figure 43. App displays 2 screen tabs

    If you send a kudos to yourself, you will see it in the second screen tab.

Figure 44. Kudos Received tab shows all kudos created by the current user

Remember that when we created the Kudos table, we enabled the social feature. Now, if we open the Newsfeed page, we will see posts from the app.

    Write business logic

So far, we have a completely functional app without writing a single line of code!

    Cloud Business App lets you focus your energy on the unique value of the app: the business logic. Let’s say we don’t want you to be able to send kudos to yourself and we want to write some validation logic for that.

    Open the Kudos table. Business logic is written on the middle tier, which is represented by the server project in your solution. The Table Designer provides you with entry points into the data pipeline of your app.

Open the Write Code dropdown menu in the toolbar; you will find a list of code entry points for business logic. Choose KudosSet_Validate.

Figure 45. Entry points for writing business logic

    It will take you to the code entry point in the Code Editor.

Figure 46. Write validation logic

    Write the following code.

    if (entity.KudosTo == Application.Current.User.Email)
    {
      results.AddPropertyError("You cannot send kudos to yourself", 
      entity.Details.Properties.KudosTo);
    }

    Now, run the app and try to send a kudos to yourself. You will get the validation error.

Figure 47. Validation logic is invoked in the running app

    Publish the app

    Finally, when I’m ready to publish this app to my organization, I can choose the Cloud Business App project and select Publish. I can follow the Publish Wizard to step through different deployment options.

Figure 48. Publish your app

    The app is in the app catalog and can be installed on one or more sites for people to use.

Figure 49. The app is published to the app catalog

    Conclusion

    To summarize, you saw a highly productive experience for defining data and screens that enable you to quickly get an app up and running. The app has a professional looking UI that blends with SharePoint and is integrated with a set of Office 365 services. This allows you to focus on your business logic and get more done.

    To learn more about Cloud Business Apps, visit Apps for Office and SharePoint Dev Center and Cloud Business Apps on MSDN.

    FREE SharePoint App – Pictures gallery with cool JQuery animations and effects

    Project Description

Galleriffic App is an app part for SharePoint 2013 that displays a picture gallery with cool jQuery animations and effects. This app is an open source tool distributed under the MIT license by Olivier Carpentier, based on the excellent Galleriffic jQuery extension by Trent Foley.

    App Screenshots

Galleriffic App part sample:

Administration page:

Download it now:

    http://1drv.ms/1f1x4vJ

    Creating a Cloud Business App with a Social Newsfeed

    The Person business type is a feature of the new Cloud Business App project introduced in Visual Studio 2013. The Cloud Business App project streamlines the way you build modern business applications for Office 365 and SharePoint 2013. The Person business type makes it easy to add and manage people-related data in your application. In this post, we will show you how to use the Person business type and what it can do for you.

    If you’re new to Cloud Business Apps in Visual Studio 2013—read Andy’s post first: Building a Cloud Business App: Kudos

    Getting Started

Business types provide declarative formatting and validation over storage types, which helps speed up the development of business applications. For example, when designing entities, in addition to all the base storage types like String, Integer, and Boolean, properties can be of type Phone Number, Email Address, Web Address, Person, Money, or Percent, and these types come with built-in validation, formatting, and controls. Let’s see how the Person business type works. Suppose you are building an application to track the mobile devices that a developer organization uses for testing the applications they write. A device can be either checked out to a specific person or in storage. Here is what a Device entity might look like in the Data Designer.

    Figure 1. The Device Entity

In this example, we set the Owner and CheckedOutTo properties to be of type Person. Person is a business type stored as a .NET Framework string. It is designed to store people identities: values that uniquely identify individuals. You can store any identity value that you want in a Person field, but we make some assumptions about the identity format. So, if you want the Person type to fully work and bring back rich information about a logged-in user, you will want the identity to be the user’s primary email address. We provide a simple API to get the identity value for the current user of the application, and you can use this API to get more information about the person represented by a given identity value. So, working with Person properties does not require you to handle the particulars of the different authentication mechanisms.

    Rich Information via Info Properties

The entity class we generate from your data model includes two properties for each Person property: the property containing the raw identity (the identity property, of type string) and a property ending with an “Info” suffix of type PersonInfo (the info property). In our example, the identity property “CheckedOutTo” has a corresponding info property named “CheckedOutToInfo”. Similarly, there is an Owner–OwnerInfo property pair. The info property exposes information about the person identified by the corresponding identity property. Info properties are read-only; the data source is a directory service. You can use the info property value in your code to write business logic. For example, the following code shows how to use the entity info property to send an email when a Device is saved.

partial void Device_Updating(Device entity)
{
    if (entity.Details.Properties.CheckedOutTo.IsChanged
        && !string.IsNullOrEmpty(entity.CheckedOutTo))
    {
        O365PersonInfo owner = entity.OwnerInfo;
        O365PersonInfo currentUser = entity.CheckedOutToInfo;
        if (string.IsNullOrEmpty(owner.FullName)
            || string.IsNullOrEmpty(currentUser.FullName))
        {
            // We could not resolve the owner or the current user of the device.
            // Continue without sending email.
            return;
        }

        string emailBody = "Hey, just FYI, your device " + entity.AssetNumber
                           + " (" + entity.Description + ") has been checked out to "
                           + entity.CheckedOutToInfo.FullName;

        SendEmail("DeviceTracker@contoso.com",
                  entity.OwnerInfo.Email,
                  "Device " + entity.AssetNumber + " checked out",
                  emailBody);
    }
}

How to: Handle Data Events discusses adding code to the update pipeline. If the identity property contains a value that the directory does not recognize, the info property returns an object that represents the unresolved raw identity, and the full person information will be unavailable. In Visual Studio 2013, Cloud Business Apps use either Active Directory or Azure Active Directory (for on-premises and cloud-based applications, respectively) to retrieve contact, organizational, and security-related information about people. If the application uses Windows or Forms authentication, only basic, security-related information is exposed via info properties.

    Current User Data

Information about the current user of the application is available via the User property of the global Application object. In Visual Studio 2013, this property returns an object of the PersonInfo type, so you can handle the “current” user and other users in the application in the same way, as shown in the following code.

partial void Devices_Inserting(Device entity)
{
    // If the Owner has not been set, assume the current user is the owner.
    if (string.IsNullOrEmpty(entity.Owner))
    {
        entity.Owner = Application.User.PersonId;
    }
}

    We use the PersonId property to retrieve the identity of the current user in a format that is appropriate for the authentication mechanism used by the application.

    Filtering Data for the Current User

A common scenario in a business application is to filter data based on the current user, showing the most relevant data for the user to consume. For example, in the device tracking application we want a screen that shows all the devices checked out to the current user. To support this case, two Global types are available in the Query Designer: Current User and Anonymous User. Current User refers to the user who is currently logged into the app; Anonymous User is one that the server is not able to authenticate. To use these Global types, use the Query Designer to add a query to the Device table. Add a filter in the Query Designer. Choose CheckedOutTo as the left operand. From the operator type dropdown, choose Global. Choose Current User as the right operand. The filter should be as shown in the following figure.

    Figure 2. Specifying a person filter in the Query Designer

Now a screen that uses this query as a data source will display all devices that are checked out to the current user logged into the app. Notice that this filtering can only happen on the server side, since the current user information is available only at the server. How to: Design a Query by Using the Query Designer discusses building queries.

    Working with People Data

There are a couple of things worth remembering when working with people data and PersonInfo types. First, you can store any value in a Person property, including values that are not resolvable. These might be identities that have become invalid, or valid identities that the directory service does not know how to resolve later. Before using any info property, check whether it was properly resolved; the presence or absence of the FullName is a good way to check this (see the first code example and the sketch below). Second, it is possible for the same person to have multiple identities, or for his or her identity (email address, login name) to change. It is even possible for the same identity to be assigned to a different person. The Person type does not have any specific support for these scenarios, so you must implement any checks or compensation logic yourself. With unrestricted read/write access to the raw identity in the Person type, you can do what you want in your business logic.
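As a small illustration of that resolution check, here is a hedged sketch of a helper you could layer on top of the O365PersonInfo type from the earlier event code. The IsResolved name and the commented usage are mine, introduced for this sketch; they are not part of the Cloud Business App API.

public static class PersonInfoExtensions
{
    // An identity resolved successfully if the directory service returned
    // a full name for it; otherwise we only have the raw identity string
    // and no rich person information.
    public static bool IsResolved(this O365PersonInfo info)
    {
        return info != null && !string.IsNullOrEmpty(info.FullName);
    }
}

// Hypothetical usage inside server-side business logic:
//
// if (!entity.CheckedOutToInfo.IsResolved())
// {
//     // Fall back to the raw identity stored on the entity
//     // (entity.CheckedOutTo), or skip the people-dependent step.
//     return;
// }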

    Person Viewer and Picker Controls

Visual Studio 2013 introduces two new controls for the Person type: Person Viewer and Person Picker. Person Viewer is a read-only control that shows the user’s full name, title, picture, and Lync presence status. The Person Viewer control uses the NameCtrl ActiveX control to display presence status, and the Lync contact card is shown on mouse hover. Choosing this control navigates to the user’s SharePoint profile site. The Person Picker control is an editable text control, used to search for a person and select one from the results shown in the dropdown menu.

To see how these controls work, let’s build some screens for the device tracker application. Let’s say we need screens that allow browsing through a list of devices available for the development team, viewing details of a selected device, and adding or editing device details. To add a screen, navigate to the Screens node located under the HTML client project in Solution Explorer. Choose Add Screen… from the context menu. This opens the Add New Screen dialog box, which has the necessary templates to simplify the creation of browse, view, and add/edit screens. To create a screen that can display a list of devices, choose the Browse Data Screen template, provide a name for the screen, and choose the Devices entity for Screen Data as shown in the following figure.

    Figure 3. Adding a Browse Data Screen template

Similarly, add a screen using the View Details Screen template to view details of a particular device, and a screen using the Add/Edit Details Screen template to add a device or edit its details. The browse screen will be the default home screen for the app. Navigation logic to the View Details and Add/Edit screens needs to be added in the Browse screen. For example, choosing a particular device listed in the Browse screen should open the View Details screen. This logic can be designed in a few steps. Open the Browse screen in the Screen Designer. The designer layout will be as shown in the following figure.

    Figure 4. Specifying a tap action for a list item in the Screen Designer

Select the Devices list in the content tree and open the Properties window. Notice that under the Actions property subgroup there is an Item Tap property. This property indicates the action that will be performed when a user chooses an item in the list. The default value of this property is None, which indicates no action. Let’s set the tap action to navigate to the View Details screen. Choose None to open the Edit ItemTap Action window, then choose viewSelected from the dropdown menu under the “Choose an existing method” option as shown in the following figure.

    Figure 5. Selecting a tap action

Another option for screen navigation is Command Bar buttons. For example, let’s add a Command Bar button that, when chosen, opens the Add/Edit Details screen. In the Browse screen, choose the Command Bar node located in the screen content tree and choose Add. This opens the Add Button window, which looks similar to the Edit ItemTap Action window. In this window, choose addAndEditNew from the dropdown menu under the “Choose an existing method” option, and choose OK to close the window. Notice that an Add Device button is added under the Command Bar node in the screen designer. In the Properties window of this button, the Tap action will be set to a built-in method called addAndEditNew.

The Picker or Viewer control is chosen automatically based on the type of screen. For example, on a screen that uses the View Details Screen template, a Person property will use the Viewer control, whereas the same property on a screen built with the Add/Edit Details Screen template will use the Picker control. You can change this in the Screen Designer. The following figure shows the View Details screen. Notice that Owner, Checked Out To, Created By, and Modified By are Person type properties and use the Person Viewer control.

    Figure 6. The person type control in the Screen Designer

The Person Viewer can display a person in two ways. The first option, which is the default, shows just the user name. The second option shows the title and picture in addition to the person’s name. This choice can be made in the Properties window of the Person property as shown in the following figure.

    Figure 7. Specifying the display mode of the person type

Similar to the Person Viewer control, the Person Picker control’s search results can be customized to show just the user name, or the user name along with title and picture; the “Name only” option shows only full names in search results. Run the app (F5) to see how these controls work. The Browse screen will be the home page. Since this is a new app, there will be no devices listed. To add a new device, choose the Add Device button on the Command Bar. This opens a dialog box to enter device details as shown in the following figure. Notice that the Person Picker control shows all the matches for “Karol Z” among the SharePoint users.

    Figure 8. Selecting a person using the person picker control

Now the browse screen will have the new device listed. A tap or click on this device shows the device details as shown in the following figure. Notice that the Person Viewer control shows the name, title, and picture for the “Owner” and “Checked Out To” properties. Presence status information for these users is also available; hovering over the Owner property shows the user’s Lync Contact Card.

    Figure 9. Viewing person information using the person viewer control

These controls depend on the underlying components used to integrate SharePoint and Lync, which leads to a few behaviors worth knowing:

• The Lync 2013 desktop application needs to be running in the background, and you need to be logged in with the same credentials that were used to log in to SharePoint.
• If the app is not able to resolve the person correctly, the Viewer control will show the person ID in plain text. Reloading the app should allow it to re-query SharePoint and resolve the person correctly. If the problem persists, it is most likely because the SharePoint Cross-Domain Library is unable to connect to SharePoint. One reason this can happen is that the SharePoint site uses “https” while the app runs on “http”, putting them in different protection levels.
• NameCtrl is a 32-bit ActiveX control. Such controls have some limitations:
  • They work only in 32-bit browsers (IE Desktop, Chrome Desktop, and Firefox Desktop). There is a workaround to make these controls work in a 64-bit browser.
  • The control must be allowed to run in the browser.
• If you are using a SharePoint server on premises (not Office 365), the user’s profile picture will not show in the Viewer control. However, the Lync contact card will have the correct picture.

    New “Focus On…” Web Part Released!!

The “Focus On…” Web Part selects a random entry from the specified SharePoint Library and displays a picture, a title and an abstract of the selected person or item.

The Web Part can be used with Windows SharePoint Services V3, MOSS 2007, SharePoint 2010 and SharePoint 2013.

    You can configure the following web part properties:

• the SharePoint Library
    • the List fields corresponding to the picture, title, abstract and detail link
    • enable or suppress the “Details..” URL.
    • show a new entry every day or on every page refresh

This allows you to display random data contained in any SharePoint List by specifying the desired SharePoint List name and the desired list column names.

How to use:

Create a new SharePoint Picture Library if you do not intend to use an existing one.
If you decide to create a new Picture Library to store the Spotlight entries, you can create it anywhere in the site collection (the Web Part is able to access any Picture Library within the site collection).
The list needs the following columns to hold the entries:
    – Title
    – Abstract
    – optional Detail Link URL

    Focus on 2

Configure the relevant Web Part properties in the Web Part Editor’s “Miscellaneous” pane as needed:

• Site Name: Enter the name of the site that contains the Spotlight Picture Library:
  – leave this field empty if the Library is in the current site (i.e. the Web Part is placed in the same site)
  – enter a “/” character if the Library is contained in the top site
  – enter a path if the Library is in a subsite of the current site (e.g. in the form “current site/subsite”)
• List Name: Enter the desired SharePoint Picture Library.
• View Name: Optionally enter the desired List View of the list specified above. A List View allows you to apply specific data filtering and sorting. Leave this field empty if you want to use the List default view.
• Title Field Name: Enter the Library Column name that contains the titles (Default=”Title”).
• Abstract Field Name: Enter the Library Column name that contains the abstracts (Default=”Abstract”). You can alternatively specify a “Field Template” by entering the desired Library fields (surrounded by curly braces). You can use HTML tags and CSS styles to freely format the text.


• Example:
  <strong>{JobTitle}</strong>
  <br>{Description}
  <div style=”padding:5px; margin-top:5px; background-color:orange”>
  <strong>Schools:</strong><br>
  {Bio}
  </div>

  The above example assumes that the SharePoint Library includes a “JobTitle”, a “Description” and a “Bio” column.

    • Details URL Field Name: (optional) Enter the desired Library Column name that contains the Detail page links (Default=”DetailURL”). Leave this field empty if you don’t want to provide a detail link.
      If you want to automatically link to the corresponding Sharepoint List Detail View page, enter the keyword “DetailView” into this field.
      If you want to automatically link to the corresponding user’s “MySite” page, enter the keyword “MySite” into this field.
    • Open Details Link in new window: opens the link in a new browser window.
• Details Caption: allows you to localize the “Details..” link displayed in the lower right part of the web part (if a “Details” link is specified).
    • Text Layout: specify the placing of the Text with respect to the Image:
      – Right
      – Wrap
      – Bottom
      – Left
      – WrapLeft
    • Image Height: specify the image height in pixels. Enter “0” if you want to use the default picture size.
• Default User Image: (optional) specify a default user picture (shown if no user picture is available) by entering a relative URL to the image. Example:
      /yoursite/yourPictureLibrary/yourDefaultUser.jpg
    • Title CSS Style: enter optional CSS styles to format the Title (default: bold)
• Text CSS Style: enter optional CSS styles to format the Body Text (default: none)
• Background Color (optional): to set the desired Web Part background color, enter either an HTML color name (e.g. “yellow”) or a hexadecimal RGB color value (e.g. “#ffcc33”). Leave this field empty if you don’t want to use a specific background color.
    • Show new Entry: shows a new entry depending on the below setting:
      – always (a new entry is displayed on every page refresh)
      – every Day
      – every Week
      – every Month
      – top Entry (the most recently added entry unless a View is used with a specific custom sorting order)
    • Show specific Entry: optionally enter the List ID of the List Item to be displayed.
• Nbr. of Items to show: optionally enter a number of two or more items to be displayed side by side.


    • Center Web Part: horizontally centers the Web Part within the available space.


    Contact me now for this awesome Web Part!

Cloud Tasks – New Office 365 & Cloud App developed – Available now!

Cloud Tasks App is a task management app that lets users efficiently manage all their active tasks in the Cloud.

Many organizations now use both Microsoft Dynamics CRM and Microsoft SharePoint, and they are looking for solutions that integrate the two technologies.

With Microsoft Office 365, Microsoft Dynamics CRM Online and SharePoint Online pair together to deliver cloud productivity.

Cloud Tasks App is one such cloud-based solution: it lets Office 365 users who actively use both a CRM solution and a SharePoint portal in the same Office 365 subscription manage their tasks effectively.

Both Microsoft Dynamics CRM and Microsoft SharePoint use the concept of Tasks, but there is no single interface in which end users can manage all of them. Users have to go to CRM or to the SharePoint portal and open individual Tasks to see the details and work on them.

Cloud Tasks App is a SharePoint 2013 App that gives users an efficient way to manage all their tasks in a single place.

Cloud Tasks App aggregates all active tasks (in both CRM and the SharePoint portal) and presents them in an easy-to-use calendar interface. This gives users a clear pictorial view of approaching deadlines, which would be difficult if they had to look up their tasks separately in the CRM solution and the SharePoint portal.

By providing a single intuitive interface and the ability to perform various actions on tasks, Cloud Tasks App helps users plan and organize their tasks efficiently and increases their productivity.

    Publishing apps for Office and SharePoint to Windows Azure Websites

This post focuses on provider-hosted apps for SharePoint and apps for Office. Provider-hosted (as opposed to SharePoint-hosted or Autohosted) means that the developer is responsible for hosting the web content – which is precisely where Azure Websites can help. At the end of the post, I will also look at advanced topics, including options for publishing to a non-Azure environment (like an on-premises server).

    Direct web publishing to Azure

    Creating a profile

    Suppose you have an app for SharePoint or an app for Office that you’re ready to publish for the first time. To begin publishing your app, choose the app for SharePoint or app for Office project, and choose “Publish”.

Figure 1. Publish menu in Solution Explorer

    A guided publishing experience will appear, as shown below.

Figure 2. Guided publishing experience

    For a new project, there is no current publishing profile. You can create one by selecting <New…> from the profile dropdown, which will open the following dialog box.

Figure 3. Creating a new publishing profile

If you’re publishing to Azure, choose the “download your publishing profile” link, and you’ll be redirected to the Azure portal. There, if you have not already done so, you can create a new website by choosing +NEW at the bottom-left corner of the portal. The bottom portion of the screen will expand, allowing you to create a new website via the Quick Create or Custom Create menu items.

Figure 4. Creating new website on Azure portal

    Once the website entity is created, choose it from the list of websites to reveal the website details. Then choose Download the publish profile and save the profile to your computer. The profile contains all the information necessary to deploy your web content, including any auxiliary information like linked database connections.

Figure 5. Downloading the publish profile from the Azure portal

    Back in the Visual Studio dialog box, and with the import publishing profile option still selected, choose the “browse” button and browse to the newly-downloaded file. Depending on the type of app:

    • For apps for Office, the profile is now complete.
• For apps for SharePoint, you can now configure the Client ID and Client Secret on the second page of the wizard. These values uniquely identify your app to SharePoint and allow the app to access SharePoint data. Client IDs and Secrets are generated and registered automatically when you debug your app, but they must be registered in a more permanent fashion when publishing the app. To do so:
  – If you are publishing the app to the Office Store: register the app through the Seller Dashboard.
  – If you are publishing the app to an organization’s corporate catalog: register the app on the AppRegNew.aspx page of the SharePoint site.

    At the completion of either registration process, you will be granted a Client ID and Client Secret. Transfer those values into the Profile-creation dialog, and then choose Finish.

    Deploying and packaging

    Once the profile is set, the publishing buttons will activate. You now have a choice of deploying the web project and/or packaging the app. When publishing for the first time, you will need to do both – but it generally makes sense to start with deploying the web project first.

    Deploy your web project

Deploying your web project is exactly what it sounds like: it will publish the entire contents of your web project (but not the SharePoint app or Office manifest) to the web. To do this, choose Deploy your web project and you will see the familiar web publishing experience – complete with Preview, deployment settings, and more. The Connection tab has been pre-filled with information from your publish profile, and you can go to Settings to customize the publish configuration and options like Remove additional files at destination. Note that if your project requires a database, you can set it on the Settings page of the wizard – and, if your publish profile came from Azure, you can simply choose the database from the dropdown list.

Figure 6. Deploying a web project to Azure

    The Preview functionality is helpful to ensure you’re publishing the right set of files. By choosing a file in the Preview list, you can see the impending changes that you’re about to commit to your live site.

Figure 7. Preview functionality in Visual Studio

    Packaging your app

    Once the web project is deployed, packaging the app is designed to be simple, and most fields should be pre-populated. If you used a publishing profile, the URL will already be pre-populated, though you’ll need to change the URL from “http” to “https”. Note that with Azure Websites, https hosting is automatically included for any website hosted on *.azurewebsites.net (for custom domains or other hosting providers, you may need to follow additional steps).

For apps for Office, that’s all you need: just choose Finish, and a manifest file that points to your live web content will be generated for you. For apps that you wish to sell on the Office Store, see the next section. Otherwise, if it’s an in-house app, you can upload the app to a file share or to a corporate catalog.

For apps for SharePoint, you will need to provide or confirm the Client ID, which you may have already entered during the profile-creation step. After that, choose Finish, and an app package will be created for you. Again, see the next section for apps headed for the Office Store. Otherwise, if you only intend to distribute the app to users of your SharePoint site, follow the documentation for uploading the app to a SharePoint corporate catalog.

    Publishing to the Office Store

    After your app package (apps for SharePoint) or manifest file (apps for Office) is created, you can use the Visit the Seller Dashboard button to get started with publishing to the Office Store.

    For apps for Office, you can also run your app through a validation utility, which will catch common mistakes (like not specifying required information in the manifest). This will save you time and hassle when submitting the app to the Store.

Figure 8. Validation utility for apps for Office

    Upgrading an app

    When it comes time to upgrade an existing app that you have already published, what steps do you need to take?

For both apps for Office and apps for SharePoint, if all of your updates are in the web project, you can simply re-publish the web content via the Deploy your web project button. These changes go live immediately, and you’re fully in control of deploying them whenever you’d like.

If you made changes to the app manifest (apps for Office and apps for SharePoint) or if you have modified any SharePoint artifacts (lists, event receivers, or anything outside the web project), you need to re-publish those artifacts instead via the Package the app button. If your app is listed on the Office Store, you will then need to re-submit the produced app package or app manifest to the Store, so it may take a few days before those changes go live – and remember that applying an update is at the discretion of the user.

    In general, remember to be considerate of the upgrade impact when modifying anything outside of the web project. Especially for apps for SharePoint, which have a more involved upgrade process, see the Apps for SharePoint update process article for an in-depth upgrade discussion and for guidance on how to avoid breaking older app packages when deploying new web artifacts.

    Advanced topics

    Specifying multiple publish environments

    One common request we heard is to publish an app to different environments. For example, one might want to publish to a “staging” environment first, ensure that the app works properly, and only then publish to “production”.

    With the new Publish experience, switching between multiple environments is only a dropdown away. Each publish profile remembers its own URL, Client ID, and Secret, so publishing to a different profile is as easy as changing the profile dropdown and choosing the appropriate “Deploy your web project” and “Package the app” buttons.

Figure 9. Publishing to multiple environments

Configuring the Client Secret (or other environment variables) in the Azure Portal

    Sometimes, the “Client Secret” for the production app might be a closely-guarded secret. As a developer, you might have the ability to publish to the website, but you might not have access to the Client Secret itself. The same thing might be true for any other such variables.

One way to solve this scenario is to have your Azure account admin manage these environment variables through the Azure Portal. For each Azure Website, it is possible to have the Client Secret – or any other variable – be set via the “app settings” section of the Configure tab. The “app settings” values take precedence over values in Web.config, so you get the best of both worlds: your local F5 scenario continues to work as before, yet your published app can make use of a Client Secret that you might not even have access to.

Figure 10. Configuring a Client Secret in the Azure portal
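In code, nothing special is required to take advantage of this. A minimal sketch, assuming a Web.config appSettings key named “ClientSecret” (the key name is illustrative):

using System.Configuration;

// Locally (F5), this resolves from Web.config; on an Azure Website,
// a portal app setting with the same key silently takes precedence.
string clientSecret = ConfigurationManager.AppSettings["ClientSecret"];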

    Deploying outside of Azure (or to a local IIS server)

    If you need to deploy to a non-Azure hosting provider – particularly if you are publishing to an on-premises machine – you can still use the many improvements to the app-publishing experience.

    During profile-creation, choose the Create new profile radio button.

    Then, once you are ready to deploy your web content, enter the Connection credentials in the “Publish Web” wizard.

    The rest of the flow should be the same. Remember to ensure that your hosting server supports the HTTPS protocol.

    Creating a Web Deploy Package

    An alternate, but similar, case for publishing to a local IIS server is when only an IT admin has the ability to publish a website. To simplify deployment, you can provide the IT admin with a Web Deploy Package – a .zip file that contains all web artifacts.

    To do this, create a new profile rather than importing one from Azure. In the case of an app for SharePoint, you will also need to fill in some dummy Client ID and Secret values.

    Now go to Deploy your web project – but be sure to choose Web Deploy Package as the publish method in the “Connection” tab.

Figure 11. Creating a web deploy package

    Choose a package location (any new folder will do) and proceed with the wizard. At the end, a set of deploy scripts and a .zip file with your web content will be generated.

Figure 12. Web deploy package files

    Your IT Admin should be able to take things from here (registering the app with SharePoint and providing the Client ID and Secret into the deployment scripts). Once the web content is deployed, ask your admin to provide you with the Client ID (the secret is not needed) and proceed with the “Package the app” step. You can then send the app package – now containing the SharePoint artifacts, and pointing at the live web content – back to the IT admin to deploy to SharePoint.


SharePoint 2013 App Available – Retrieves cross-domain data by using JSONP

This is a SharePoint-hosted app that uses JSON with Padding (JSONP) to retrieve data from a proxy page on a Windows Azure Web Site. The sample contains two solutions: one for the app itself and one for its use on a SharePoint page.

The app deploys the JSONPClient app part for SharePoint.

    You add the app part to a page and then enter the URL of the proxy page and the URL of a feed. The proxy page gets data from the feed that’s specified in the app part, and returns the data in JavaScript Object Notation (JSON) format.

    The app part gets the feed data from the proxy page by using JSONP, and then displays the data.

Figure 1. The JSONPClient SharePoint app displays data from the specified feed on the page(s) where it is placed
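To illustrate the technique (not the sample’s actual code), a minimal JSONP proxy could be written as an ASP.NET HTTP handler along these lines. The handler name, its query-string parameters, and the assumption that the feed already returns JSON are all illustrative:

using System.Net;
using System.Web;

public class JsonpProxy : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The calling page supplies the callback name and the feed to fetch.
        string callback = context.Request.QueryString["callback"];
        string feedUrl = context.Request.QueryString["feed"];

        string json;
        using (var client = new WebClient())
        {
            json = client.DownloadString(feedUrl); // assumes the feed returns JSON
        }

        // Wrap the JSON in the callback so a cross-domain <script> tag can consume it.
        context.Response.ContentType = "application/javascript";
        context.Response.Write(callback + "(" + json + ");");
    }

    public bool IsReusable
    {
        get { return false; }
    }
}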

     


     

Please contact me for this or any other SharePoint and Office 365 Web Parts / Apps: on-premises, in the Cloud, and on Azure.

Developing for the future – How to write code in VS 2010 for Web Parts that are compatible with the App model of SharePoint 2013

    The sample demonstrates how to develop code that works in SharePoint 2010 and also as a SharePoint 2013 App. The goal is to show you how to develop a SharePoint web part and event receiver that can be packaged as traditional solutions or as apps. Whether you’re ready for the new App model or not, it’s not too early to start developing in a new way that works on premises or online, today or tomorrow.

This sample focuses on a provider-hosted app that runs in an external ASP.NET site and that can be packaged to run in SharePoint 2010 as a Visual Web Part and Event Receiver, as well as in SharePoint 2013 as an app.
• The SharePoint 2010 solution is available as a separate posting.
• The posting you are viewing now is the SharePoint 2013 app.

Whether packaged as a SharePoint solution or as an app, the sample assists users in locating and creating SharePoint sites. It begins by displaying a list of child sites, and can then present a form that allows the user to create a new child site using a web template.
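To give a feel for the kind of call involved, enumerating child sites with the SharePoint client object model can look like the following sketch. The site URL is illustrative and authentication setup is omitted; the actual sample wraps this logic so it runs in both environments.

using System;
using Microsoft.SharePoint.Client;

class ChildSiteLister
{
    static void Main()
    {
        // Illustrative URL; credentials/authentication setup omitted.
        var ctx = new ClientContext("https://contoso/sites/dev");

        // Request the child webs with just the fields we display.
        var webs = ctx.Web.Webs;
        ctx.Load(webs, ws => ws.Include(w => w.Title, w => w.ServerRelativeUrl));
        ctx.ExecuteQuery();

        foreach (var web in webs)
            Console.WriteLine("{0} ({1})", web.Title, web.ServerRelativeUrl);
    }
}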

    Building the Sample

There are two related samples. MSDN Code Gallery would not allow posting them together because it allows only one posting per language, and it thinks the samples are written in C# due to the Visual Studio project type. (In reality, it’s a blend of C# and JavaScript!) The Location Mapping Solution requires a SharePoint 2010 development machine with Visual Studio 2012; the App requires a SharePoint 2013 development machine with Visual Studio 2012.

    Description

A detailed article explaining this sample is available at http://blogs.msdn.com/b/bobgerman/archive/2013/10/08/future-proof-solutions-part-2-sharepoint-2010-solutions-that-become-provider-hosted-apps.aspx.

    SharePoint Samurai