Friday, July 11, 2014

SharePoint Online Tidbits

I've been working on a project of late that uses SharePoint Online 2013 as part of the solution.  It was not a platform I chose but one that already existed and that I was asked to continue building on.  For those who might be evaluating a cloud SharePoint solution, or those who are just starting to work in it, these are my tips, tricks, and tidbits.  Please keep in mind this post is specifically about SharePoint Online 2013; on premise SharePoint 2013 differs, and many of the limitations below can be overcome there.

1. Be aware of the list view threshold.  You can't change this online (you can on premise).  Any data request that has to deal with more than 5,000 items will fail.

- If you have 5,001 items or more in your "All Items" view, it won't work.
- If you query a list by a single key that matches more than 5,000 items, it won't work.
- Same deal with an API call.

Having said that... yes, you can have lists with over 5,000 items and still work with them.  The big key is multiple indexed columns.  You can build views on those indexes, and CAML queries via the API, whose filters keep each result set under the threshold (see the sketch below).  A client has 1.5 million rows spread across several SharePoint Online lists that they query in this manner.  Do I recommend it?  No.  But it is possible.
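
To make that concrete, here is a minimal sketch (hedged: the site URL, list name "BigList", and column name "Region" are all hypothetical) of a CSOM console app that pages through a large list with a CAML query filtered on an indexed column, so no single request crosses the threshold:

using System;
using Microsoft.SharePoint.Client;

class ThresholdDemo
{
    static void Main()
    {
        // Credentials omitted; for SharePoint Online you would typically assign
        // ctx.Credentials = new SharePointOnlineCredentials(user, securePassword);
        using (var ctx = new ClientContext("https://yourtenant.sharepoint.com/sites/sitea"))
        {
            List list = ctx.Web.Lists.GetByTitle("BigList");

            var query = new CamlQuery
            {
                // Filter on an indexed column and pull 1,000 rows per request
                // so no single query crosses the 5,000-item threshold.
                ViewXml = "<View><Query><Where><Eq>" +
                          "<FieldRef Name='Region' /><Value Type='Text'>East</Value>" +
                          "</Eq></Where></Query><RowLimit>1000</RowLimit></View>"
            };

            do
            {
                ListItemCollection items = list.GetItems(query);
                ctx.Load(items);
                ctx.ExecuteQuery();

                foreach (ListItem item in items)
                    Console.WriteLine(item["Title"]);

                // The paging cookie carries the position forward; null means done.
                query.ListItemCollectionPosition = items.ListItemCollectionPosition;
            } while (query.ListItemCollectionPosition != null);
        }
    }
}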

2. Service reliability.  I'm not sure how the Office 365 cloud is set up, but our instance constantly has problems.  Some days I can't log in to the Office 365 admin site.  Some days users can't save pages with content web parts.  Some days something is in extended recovery and other things are slow or quirky.  It has caused issues, especially at inopportune times.

3. DEV/QA/PROD environments.  Out of the box this concept really doesn't exist.  You can't bundle up a site and deploy it to another site.  There are really two ways to approximate it.  One is saving a list or page as a template (with or without content); this works OK in the same site or in a subsite of it.  The other is exporting large amounts of data to Excel and importing it into other sites, but that doesn't preserve the metadata.  By that I mean a drop-down list in Site A will become a text box in Site B.  Not to mention internal field name issues (see tidbit 5 below).  You can also export an entire site as a template, but that has a 50 MB limit.

Keep in mind that SharePoint isn't really a traditional IT application, in that you don't have deliverables (EXEs, DLLs, SQL scripts, etc.) that you can deploy to various environments.  It is really a publishing model: you check out a page, you save your changes, and when you're ready for them to go "live" you check the page back in.

When you want a QA team to test your solution/changes end-to-end, you have a lot of work ahead of you to build a suitable QA site or even a DEV site.  I've done it, but it is tedious.

4. Scripting is your friend.  Angular, Knockout, jQuery, or plain old JavaScript.  Drop them into a Script Editor Web Part and fire away.  We used bxSlider, SimpleModal, Knockout, the SharePoint JS library, etc. with no issues.  We're also looking at using Bootstrap for responsive UI.

SharePoint Online does NOT want you running resource-intensive things on their cloud servers, so the more work you push to client-side scripting the happier everyone will be with this solution.  See tidbits 9 and 10 for notes on SharePoint apps and offsite content.

5. Internal field names can be an issue.  Say I have Site A and I created a list called MyList, and in my list I have "firstnnname" and "Last Name".  I realize after creating the list that I misspelled first name and didn't follow the same convention (spacing and case), so I rename "firstnnname" to "First Name".  Everything will seem OK until you use an API.

When you use the API, there are two things to be aware of...

First, you'll notice in an API call that there is no "First Name" or "Last Name" field!  "First Name" will appear in the XML as firstnnname, and "Last Name" will appear as Last_x0020_Name.

SharePoint always uses the original internal name of a field.  That's why a list's "Title" field is always Title in the API even if you rename it, and why spaces are replaced by "_x0020_", which is how SharePoint encodes them.

Oh hey... remember #3 where I talked about Excel exports?  That's the second thing.  An Excel export of this hypothetical MyList carries no metadata, so the column headers become the internal names of the imported list.  Call the API against the imported copy and the field WON'T be firstnnname!  It will be First_x0020_Name, because the display name "First Name" is what SharePoint put into the column header that the import used to create the column.
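
Here's a small hedged CSOM sketch against the hypothetical MyList above, showing that it's the internal names you have to use:

using System;
using Microsoft.SharePoint.Client;

class InternalNamesDemo
{
    static void Main()
    {
        using (var ctx = new ClientContext("https://yourtenant.sharepoint.com/sites/sitea"))
        {
            List list = ctx.Web.Lists.GetByTitle("MyList");
            ListItemCollection items = list.GetItems(CamlQuery.CreateAllItemsQuery());
            ctx.Load(items);
            ctx.ExecuteQuery();

            foreach (ListItem item in items)
            {
                // Display names "First Name"/"Last Name" don't exist in the API;
                // the rename never changed the internal name "firstnnname", and
                // the space in "Last Name" is encoded as _x0020_.
                Console.WriteLine("{0} {1}", item["firstnnname"], item["Last_x0020_Name"]);
            }
        }
    }
}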

6. The API is useful but it has limitations.  I used the SharePoint library and Knockout JS (KO) to query lists and bind them to KO templates, which allowed me to build a basically MVC-esque solution inside of SharePoint.  However, the API can't get AD groups and such; it can pull user attributes but not group membership.  My workaround was to create a Script Editor Web Part with a target (advanced -> target audience) of an AD group, which set a JavaScript variable I could use to determine which user group had logged in.  We only had one group that needed different content, but you could get very complex very quickly with this approach.

I'll also add that, luckily, only the pieces applicable to the user are rendered.  Even if you have a bunch of stuff on a page, if you're using target audiences each audience will only get the content intended for them.

7. Targeting individual list items.  You can do this with a Content Query Web Part (CQWP).  If you go into a list and enable target audiences, it is smart enough to return only the items that the logged-in user has access to.  You CAN'T add a List Web Part for a list that has target audiences set on its items and expect it to work; it will show ALL list items.

I found the CQWP limiting because it offers only four fields (title, description, link URL, and image URL [the names might be slightly different]) and basically looks like search results.  Apparently you can use CSR (Client-Side Rendering) to doll these up any way you want, but my solution didn't do that: I was doing back-end coding, we didn't have a UI/UX resource, and we didn't know, once we got one, whether he or she would be able to do this.

8. CSS.  For good and bad, SharePoint has branding/master page/master template functionality.  It is great if you want a consistent look and feel, but my team found early on, when we were redoing parts of a site, that SharePoint's styles were trumping our CSS.  The hack-and-slash solution is to put a style attribute on page elements and style away.  The better approach is to make sure the CSS you are referencing has greater specificity than SharePoint's.  To that end, anything we wanted to style got a class name, and we styled the class rather than the HTML elements themselves.

9. Custom SharePoint apps are a mixed bag.  They have some limitations, which you can see at the link below.  I also find them a little slow to load and respond.  It also stinks that an app is unavailable while a new build is uploaded; I can edit and save a page in SharePoint with the published version alive and well all day long.

http://msdn.microsoft.com/en-us/library/ff798382.aspx  

10. Cross-domain is an option.  You can set up a relying party, integrate SharePoint Online with on premise systems, and IFrame in content.  A vendor we worked with ran into issues with token lifetime, which required re-authenticating with ADFS inside the frame.

Sunday, March 30, 2014

IIS and .NET

A few things that came up during a deployment that I thought would be worth noting for anyone migrating to a new server...

1. Make sure the full .NET install is performed.  There is a "Client Profile" install that doesn't include everything, and some types live in different assemblies than in the "full" framework.  Unless your build specifically targets the Client Profile, you need the full framework installed.

2. The 2.0 and 3.0 frameworks register MIME types and handlers in IIS that you'll need for ASP.NET and WCF.  You can run aspnet_regiis (in the 2.0 framework folder) and ServiceModelReg (in the 3.0 folder) to get this done manually.  Your 4.0 framework project will have issues if these registrations were never made.
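
On a default install the commands look roughly like this (hedged: verify the paths locally, and use the Framework64 folders for 64-bit app pools):

%windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -i
"%windir%\Microsoft.NET\Framework\v3.0\Windows Communication Foundation\ServiceModelReg.exe" -i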

-JY

Thursday, February 27, 2014

WCF Concurrency and Instances

A quick tidbit regarding WCF in terms of performance.

You may be used to decorating your WCF service with the [ServiceContract] attribute without giving it a second thought.  Then, as time goes on, you may notice that as more users hit your service the performance isn't what you were expecting.

One of the drawbacks to using just [ServiceContract] is that, by default, WCF creates a new instance of your service to handle each call.  This is fine if you need to do session management, but in many cases a REST service will not.

A good practice is to decorate your service class with [ServiceBehavior], which allows you to change how the service operates.  By setting its InstanceContextMode to Single you have a service where all calling clients share the same instance, so if you are caching data the performance gain can actually be realized.  In the default mode each client gets its own instance, which takes up memory and serves no one except the same caller on subsequent calls.

Note: Thread safety becomes important in this case.

You can also change the ConcurrencyMode to determine how calls are handled.  In Single mode only one call is handled at a time.  So if you had a method that returned a continually incremented number, no two calls would get the same number, because each call waits until previous calls complete.

In Multiple concurrency mode you would have to wrap the incrementing number in a lock or another thread-safe mechanism, otherwise concurrent calls could race on the same field and produce conflicting results.

There is a third mode, Reentrant, which is a special case that I've chosen not to cover here.

So in summary: look at setting the InstanceContextMode and ConcurrencyMode properties on a ServiceBehavior attribute, instead of using just a ServiceContract attribute, to better control how your WCF service operates and to tweak performance based on how you need the service to scale.  A small sketch follows.
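
A minimal sketch of what that looks like (the contract and the counter are just illustrations, not a real service):

using System.ServiceModel;

[ServiceContract]
public interface ICounterService
{
    [OperationContract]
    int GetNextNumber();
}

// One shared instance serves every caller, and multiple calls may run at once,
// so any shared state (like _counter) has to be protected.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class CounterService : ICounterService
{
    private int _counter;
    private readonly object _lock = new object();

    public int GetNextNumber()
    {
        // Without the lock, two simultaneous calls could read the same value.
        lock (_lock)
        {
            return ++_counter;
        }
    }
}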

Wednesday, February 12, 2014

GAC Priority over Local

Something to keep in mind that came up today on a project: if you have a DLL installed in the GAC and the same DLL in the local folder of the application, the GAC copy will take priority.

In the case we ran across today there was a wrapper DLL given to multiple teams and someone installed the DLL in the GAC.  There was a bug that was found and updated in the wrapper but the DLL in the GAC wasn't updated.

This is a good reason to version your DLLs as well.  The runtime binds strong-named assemblies by exact version, so if the version in the GAC doesn't match the version your application references, the GAC copy is skipped and the local folder copy is used.
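
If you're ever unsure which copy actually loaded, here is a quick hedged diagnostic (WrapperType is hypothetical; substitute any type from the DLL in question):

using System;
using System.Reflection;

class GacCheck
{
    static void Main()
    {
        // WrapperType stands in for any type defined in the wrapper DLL.
        Assembly asm = typeof(WrapperType).Assembly;
        Console.WriteLine(asm.Location);             // GAC path vs. local bin folder
        Console.WriteLine(asm.GlobalAssemblyCache);  // true if it was loaded from the GAC
        Console.WriteLine(asm.GetName().Version);    // the version that actually loaded
    }
}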

Thursday, February 6, 2014

SQL Field Encryption

If you need to encrypt some data for security in SQL here is how you go about it...

1. Create a master key in your database if one does not already exist.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'YourPassword';

2. Create a certificate.
CREATE CERTIFICATE YourCertificateName WITH SUBJECT = 'Some Description';

3. Create a symmetric key.  (DES is shown here; AES_256 is a stronger choice if your SQL Server version supports it.)
CREATE SYMMETRIC KEY YourKeyName WITH ALGORITHM = DES
ENCRYPTION BY CERTIFICATE YourCertificateName;

4. To encrypt a field. The example below encrypts the YourField column for every row in table YourTable. NOTE: The encrypted column must be varbinary and large enough to hold the cipher text (e.g., varbinary(128) or more, depending on the field); otherwise the value may be truncated or appear blank.
OPEN SYMMETRIC KEY YourKeyName DECRYPTION BY CERTIFICATE YourCertificateName;
UPDATE YourTable SET YourField = EncryptByKey(Key_GUID('YourKeyName'), YourField);
CLOSE SYMMETRIC KEY YourKeyName;

5. To decrypt your field. This will return YourField as plain text in the query output.

OPEN SYMMETRIC KEY YourKeyName DECRYPTION BY CERTIFICATE YourCertificateName;

SELECT CONVERT(VARCHAR(MAX), DecryptByKey(YourField)) AS 'Decrypted Field'
FROM YourTable;

CLOSE SYMMETRIC KEY YourKeyName;

Wednesday, February 5, 2014

WHS 2011 Web Access Setup


In order to access your WHS 2011 server remotely via the web you'll have a couple of setup steps.

1. Enable web access.  Open the Dashboard; one of the tasks will be enabling web access.  It will try to enable it, but it may fail due to router settings.

2. Enable port forwarding on your router.  Technically you could DMZ the server, but that's neither secure nor the preferred method; port forwarding is the better option.  You'll want to record the internal IP your server is on and forward the appropriate ports to it.

Ports to forward:
443 is HTTPS which is what the remote access site included with WHS will run under.
3389 is the RDP port if you want to remotely access your server using Remote Desktop.
80 is HTTP if you're going to host a website or something.

There are other ports for mail, FTP, etc. if you're interested in enabling those for your purposes.

If you're looking for a free way to access your server with a friendly name rather than the public IP (which can change), I recommend looking at No-Ip.com.  They have an executable that can run as a service and update their DNS whenever your IP changes.  So you get a friendly name even without a static public IP, and it's free.


Sunday, February 2, 2014

Windows Home Server 2011 Client Connector Woes

I tried to fire up my original Windows Home Server (WHS) machine but one of the drives was failing and the system wouldn't boot due to file corruption.

So I built a new box running WHS 2011, which I was able to buy for $50 at the local computer parts store.  That version is basically a flavor of Windows Server and even has IIS installed, so it's great as a home development box for whatever projects you have.

Unfortunately, upgrading wasn't simple.  The new connector software complained there was already a version of the connector installed, and the installed version didn't want to uninstall.

I tried manually deleting the files and some registry entries, but the new connector still complained.  Additionally, my active window kept losing focus because an installer was constantly trying to load.

After a few hours I figured it all out.  I went through every registry entry containing "Windows Home Server" and deleted it.  This made the connector installer run, but it would fail partway through and roll back.

I found a Microsoft KB article about a service that needed to be enabled and at least set to Manual, if not Started.  The name escapes me, but it was Media Center related.  As it turned out, I had deleted some WHS registry entries related to Media Center, thinking the new connector install would replace them.

To fix this issue I went to Programs & Features, uninstalled all the "Media Features" (which required a reboot), then installed them again.  After that the new connector installed with no problems and everything works great.

Suggestions that might help you circumvent these issues:
1. Create a restore point on your system and save backups of the registry.
2. Uninstall the old WHS connector while your old server is still running, or before upgrading the software. I think part of my uninstall trouble was the uninstaller trying to remove the user from the WHS machine and failing because mine was offline.
3. Try using a different uninstall program like CCleaner.

Wednesday, January 15, 2014

WCF Service Response Time Slow?

I recently ran into an issue testing a WCF web service deployed on a server.  A call to a very light service was taking upwards of 15 seconds to return a response.

The service had logging, which revealed that the time from the call entering the service until a response was returned was a fraction of a second.

I decided to run a Fiddler2 trace against a test program to see what was happening, and suddenly the service was returning nearly instantly.  I continued to test and everything looked fine.  Then I closed Fiddler2 and the slow response time returned!

What I ultimately discovered was that Fiddler was un-checking "Automatically Detect Settings" in the Internet Options (Control Panel\Network and Internet\Internet Options\Connections\LAN Settings).  That setting had apparently been enabled as part of my work machine's configuration, and the proxy auto-detection it triggers was what was causing the slowness.

I later recalled you can also fix this in your WCF bindings by setting useDefaultWebProxy to "false", which ignores that setting.  A sketch follows.
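
Something like this on the client side (IMyService and the endpoint URL are hypothetical); the same switch also exists as useDefaultWebProxy="false" on the binding element in config:

using System.ServiceModel;

[ServiceContract]
public interface IMyService
{
    [OperationContract]
    string Ping();
}

class ProxyFreeClient
{
    static IMyService Create()
    {
        // Skipping the default web proxy avoids the auto-detection delay.
        var binding = new BasicHttpBinding { UseDefaultWebProxy = false };
        var factory = new ChannelFactory<IMyService>(
            binding,
            new EndpointAddress("http://yourserver/MyService.svc"));
        return factory.CreateChannel();
    }
}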