In this post I will cover a few tips for getting started with ProcMon (Process Monitor, from the Sysinternals Suite) for troubleshooting long-running processes. Note that I am not an expert in ProcMon by a long shot, so this is partly a selfish post to remind myself of key settings to configure.
Side note: you can run the Sysinternals tools from the web at Sysinternals Live without needing to download them to your local machine. This is useful if your customer / organization doesn't allow installing or running third-party tools, or has concerns about running them directly on a machine.
Skip down to the Tips section if you don’t want to read the back story.
At least once a month I have a customer scenario where two or more applications are not playing nice with each other: a .Net website and anti-virus software, SharePoint and server backup software, etc. Usually the problem involves one piece of software placing a write-lock or exclusive hold on a file / registry entry while the other software expects the same. In such a scenario we need to monitor a file / registry entry from a fresh start (restart application pool, process, etc.) until the file / registry access error happens, which could take hours or days. Since we are monitoring for such a long time we want to make sure that ProcMon only captures the data we need, while being mindful of memory / disk space usage.
1) Start ProcMon with no tracing
If you start ProcMon by double-clicking the executable, it will start capturing data immediately. Instead, launch it from the command line with the /noconnect parameter.
c:\sysinternals> procmon /noconnect
2) Specify a backing file
By default ProcMon will store event data in virtual memory. Since we could be capturing hours' or days' worth of data, it is preferable to store it on a disk with lots of free space (multiple GBs or more, depending on the expected duration and number of events).
Navigate to File –> Backing Files… for these settings.
Change the radio button from “Use virtual memory” to “Use file named:” and then specify the filename you would like to use as a backing file. .PML is the default extension used so I followed that convention.
3) Apply filters
By default all processes accessing any files and / or registry locations will be monitored. Instead we want to filter events based on our criteria. The most common scenarios that I run into (in the order that I see them) are: 1) filtering on a location when we don't know which process is locking it, or 2) filtering on a specific process when we know the process but not the location being locked.
Click on the filter icon in the top menu or click Filter –> Filter… to access these settings.
In the example below we filter for any path that begins with “c:\MyAppFolder”. By doing this we include any subfolders of our application folder.
4) Drop filtered events
By default ProcMon will capture all events whether they match your filter or not. To save on memory (or file space if you use a backing file) you can drop filtered events. If you created your filter incorrectly you may miss the events needed for troubleshooting, but keeping your filter broad enough avoids that risk.
Click on Filter and select the Drop Filtered Events menu item. If there is a check mark next to the menu item then you have correctly configured ProcMon to drop filtered events.
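Putting these tips together, the launch can be scripted in a single command line. This is a sketch using ProcMon's documented command-line switches; the paths and the saved configuration file (which would contain your filter and options such as Drop Filtered Events) are examples, not required names:

```powershell
# Sketch: launch ProcMon preconfigured for a long capture (paths are examples).
# /LoadConfig loads a previously saved configuration (filter plus options),
# /BackingFile writes events to disk instead of virtual memory,
# /Minimized keeps the window out of the way for a long-running trace.
& procmon.exe /AcceptEula /LoadConfig c:\traces\myfilter.pmc /BackingFile c:\traces\longrun.pml /Minimized
```

You would still want to verify the filter and Drop Filtered Events setting in the UI before leaving the capture to run for hours or days.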
In this post I walked through a few quick tips for configuring ProcMon when you need to troubleshoot a long-running process. Hopefully these tips will help you avoid out-of-memory issues or having to parse through hundreds of thousands of events.
Special thanks to my peer and teammate Ken Kilty for providing the research and background info for this blog post.
<Update 2015/7/28 2:30pm> I received clarification that the SharePoint product group does support installing .Net 4.6 onto an existing SharePoint 2013 farm server. It is the installer for SharePoint 2013 that will fail to detect .Net 4.5 if .Net 4.6 is already installed and thus throw an error. A future update should correct this scenario with the installer.
On a related note I was able to successfully uninstall .Net 4.6 from a server (remove the KB as mentioned at bottom of this post) and then install SharePoint 2013.
Quick publish on this item and I’ll update once I have more details. One of my customers is exploring Visual Studio 2015 / .Net 4.6 which was just released a week or two ago. During some testing I found out that (as of July 28 2015 when this is published) you cannot install SharePoint 2013 binaries onto a server that has .Net 4.6 (or Visual Studio 2015 which includes .Net 4.6) installed. I received the below error message.
Since .Net 4.6 is an in-place upgrade of .Net 4 / 4.5 / 4.5.1 / 4.5.2, SharePoint has an issue finding .Net 4.5.x after 4.6 is applied. I am testing whether removing the associated KB for .Net 4.6 makes this reversible should you accidentally deploy it to a dev / test farm. I'm also testing whether you can install .Net 4.6 / Visual Studio 2015 onto an existing SharePoint 2013 farm.
Removing associated KB…
- On Windows Vista SP2 / Windows 7 SP1/ Windows Server 2008 SP2 / Windows Server 2008 R2 SP1, you will see the Microsoft .NET Framework 4.6 as an installed product under Programs and Features in Control Panel.
- On Windows 8 / Windows Server 2012 you can find this as Update for Microsoft Windows (KB3045562) under Installed Updates in Control Panel.
- On Windows 8.1 / Windows Server 2012 R2 you can find this as Update for Microsoft Windows (KB3045563) under Installed Updates in Control Panel.
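If you need to script the removal, the Windows Update Standalone Installer (wusa.exe) can uninstall an update by KB number. A sketch, assuming Windows 8.1 / Server 2012 R2 and an elevated prompt; substitute the KB number from the list above for your OS:

```powershell
# Uninstall the .NET 4.6 update by KB number (run from an elevated prompt).
# KB3045563 applies to Windows 8.1 / Server 2012 R2; use KB3045562 on
# Windows 8 / Server 2012. A reboot may be needed to complete the removal.
& wusa.exe /uninstall /kb:3045563 /norestart
```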
Download .Net framework
Hopefully this helps someone before they run into issues with their farm. Feel free to leave a comment if you find out any additional details or workarounds.
This was the first year of the Microsoft Ignite conference which merged a number of previous conferences including TechEd, SharePoint Conference, Project Conference, and more. With over 23,000 attendees, a new venue, and numerous Microsoft senior leadership and product group members in attendance (including CEO Satya Nadella himself) this was definitely a huge event. Rather than re-capping the event itself I wanted to take a minute to mention a few items that I heard / saw at the conference. I am still downloading and viewing a number of sessions that I couldn't attend (same time as another session or room was at capacity) but these are highlights that I wanted to share with others.
- No “internal” FIM in SharePoint 2016 - SharePoint 2016 will not ship with a version of the Forefront Identity Manager product included. This is a fairly big deal for any customers that are using the “SharePoint Synchronization” option (allows for import and export of content to / from SharePoint) for the User Profile Sync in 2010 or 2013. Your options in 2016 will be the Active Directory Import (same as 2007 and re-introduced in 2013) or “external” FIM which is installed and managed outside of SharePoint Server. See the following resources for more details and how to install FIM 2010 R2 + SP1 with SharePoint 2013 so that you can start planning today if you do need the full features of syncing data into and out of SharePoint.
What's New for IT Professionals in SharePoint Server 2016 (session recording with announcement)
Configuring SharePoint 2013 for the Forefront Identity Manager 2010 R2 Service Pack 1 Portal (install overview)
- Project Siena – Project Siena looks like a viable alternative (not replacement) for many (smaller) custom development scenarios. Essentially it is an app that lets you build other apps. I do not see this replacing InfoPath, LightSwitch, and half a dozen other technologies that have popped up over the past few years, but I do see a promising future for this technology (HTML5 + JS based, similar to many other tech stacks that Microsoft is promoting). Note that it was still in a beta release last time I checked, but the fact that it caters to the Excel power user with similar syntax, merged with an easy drag and drop interface, feels like it could gain traction better than some other tools. If you aren't familiar with Project Siena you really need to see it to understand it.
Microsoft Project Siena: Build Apps and Create New Mobile Solutions (session recording with demos)
Microsoft Project Siena (Beta) (product site)
- New SharePoint hybrid search option - Hybrid search is receiving a huge update / upgrade later this year. In its current (May 2015) form SharePoint hybrid search involves separate search service applications / indices for on-prem farms and Office 365 / SharePoint Online. If you query one source you can federate the query to the other and get results in a separate result block. The problem though is that configuration can be fairly complex, search results aren't integrated (in-line with each other), and you likely have a large number of servers on-prem for the search service. Later this year (target timeframe, subject to change) Microsoft will release an update which will allow an on-prem "cloud search service application" to crawl and parse content but then push the metadata up to Office 365 for indexing, querying, etc. The massive benefit of this is that your on-prem content can then be used in other services like Delve, Office 365 data loss prevention (DLP), and others that currently have no expected on-prem release (or won't be supported until future releases of SharePoint). Additionally you will need a much smaller on-prem server footprint to support search (the example given was going from 10+ search servers down to 2). This is a big win in my opinion and I can't wait to test it out when it is released.
Implementing Next Generation SharePoint Hybrid Search with the Cloud Search Service Application (session recording)
- Nano Server – Nano Server is a new installation option for Windows Server 10 (Server 2016 or whatever the final name ends up being) akin to Server Core in the past. There were a number of sessions that talked about how small the footprint of Nano Server will be (400MB, yes MB, compared to 8+ GB for the server + GUI "full" edition). The changes that this introduces not only affect performance but also require re-architecting tools to work remotely (there is no local logon or UI for Nano Server; everything must be done remotely). Things like Event Viewer, Task Manager, Local Services, etc. can be accessed remotely in a web UI similar to the "new" Azure Portal UI (super slick, take a look). This may sound scary to some admins who are used to having RDP or locally logging on to a server, but listen to Jeffrey Snover's take on this. We are IT Professionals and this is a technology that will reduce the number of reboots, make servers more secure, reduce infrastructure footprint, and have numerous other benefits. You owe it to yourself and your company to learn about this and see if it will work for the services you provide.
Nano Server (session recording)
Nano Server: The Future of Windows Server Starts Now (session recording)
Remotely Managing Nano Server (session recording)
- PowerShell – Getting to see Jeffrey Snover (inventor of PowerShell) and Don Jones (first follower of PowerShell, see the link in the slide deck) geek out about PowerShell was one of the best sessions I got to see at Ignite. It is hard to describe in words, hence I recommend you go watch the recording. Jeffrey had some great advice about using PowerShell as a tool to explore and dive into problems or scenarios you are trying to solve. That sense of adventure can be a motivating force in your personal and professional career. It was really inspiring and I love the fact that Jeffrey's (and Don's) mindset is spreading to so many others these days.
Windows PowerShell Unplugged with Jeffrey Snover (session recording)
On a side note I also wanted to mention one of the obvious but not always talked about benefits of going to a conference like this in-person. During the week I was able to introduce myself to a number of presenters that I had previously not met. Some were MVPs, fellow Premier Field Engineers (PFEs), product group members, and more. The connections you make can last for years and provide an invaluable network for sharing information and getting assistance when you are in need. I even got a PowerShell sticker directly from Jeffrey Snover himself (another personal highlight).
This is just a short list of some of the sessions that I attended along with highlights or key points that I wanted to share. If I find anything else significant from the recordings I am going back to watch I’ll update this post. For now though go check out the recordings above or the hundreds of other ones that are up on Channel 9. I encourage you to attend next year when Ignite 2016 will be in Chicago again May 9-13.
This is my fourth year presenting at the SharePoint Cincy conference. As usual the crew that organizes it put on a great conference and the attendees were very engaged. Below are my slides and demo scripts for my "Running Your Dev / Test VMs in Azure for Cheap" session. Thanks to all who attended, and I hope that you got something useful out of it.
Demo PowerShell Scripts
Blogging this as a simple reminder to myself on the default (out of the box) value of the Search Service Application index location in SharePoint 2013. Invariably I have to look this up every couple of months when supporting customers and only ever find the PowerShell commands to retrieve it. Putting both on here. Hopefully this saves someone else a few minutes of their day as well.
# Get the search service instance on the local server
$ssi = Get-SPEnterpriseSearchServiceInstance
# Show the index location for each search component
$ssi.Components | Select-Object IndexLocation

The default (out of the box) index location in SharePoint 2013 is:

C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server\Applications
I was pleased to once again speak at the Dog Food Conference here in Columbus, OH. I believe this is the 3rd year that I have spoken and the 4th or 5th year that I have attended. The venue has moved to a more spacious location which definitely helped with giving attendees, speakers, and vendors more room to spread out. I was especially happy to meet up with dozens of previous customers and co-workers at the conference. This really is a great mix of audiences (developers, IT pros, and business users), customer segments, and topics (SharePoint, .Net, PowerShell, BI, ALM, and more).
Thanks to everyone who attended my session at the very end of the last day of the conference. We had a number of good side discussions and questions throughout the presentation. Below are my slides and scripts.
Demo PowerShell Scripts
Many years ago I posted How I Blog walking through my blogging process. Over the past few months many of my coworkers and customers have been talking or asking about how to use Azure IaaS for dev / test environments (especially for SharePoint). In this post I’ll walk through the configurations I use, tools that have helped me, and other tips.
Note: This is not meant to be a post on best practices for rolling out your Azure IaaS infrastructure to support SharePoint. This is just my current setup as a reference example for others to learn from. For some best practices please read Wictor Wilen's post on Microsoft Azure IAAS and SharePoint 2013 tips and tricks and listen to the Microsoft Cloud Show podcast interview Episode 040 - Talking to Wictor Wilen about Hosting SharePoint VMs in IaaS he participated in.
For over 6 months now I have been running my primary set of lab VMs in Azure Infrastructure as a Service (IaaS) VMs. Prior to using Azure VMs I had been using Hyper-V on my laptop (either dual booting into a server OS or the latest iteration on Windows 8 / 8.1) but was always limited by machine resources. Even with a secondary (or tertiary) solid state hybrid drive (this is a newer version than what I currently have in my laptop), 24GB of RAM, and quad core i7 it seemed like I was always juggling disk space for my VHDs or CPUs / RAM for my VMs. Battery life in a hulking laptop like I had is very short and the weight can easily cause strain on your back when carried in a backpack. Nowadays I carry a Lenovo T430s which cut the weight down to almost 1/3 of my old W520.
I host 4 SharePoint farms (along with a few one-off VMs) in Azure IaaS using my MSDN benefits. My MSDN benefits include $150 Azure credit per month, 10 free Azure websites, and a host of other freebies. My farms include a SharePoint 2007 farm, a 2010 farm, and two 2013 farms. I tried to make the configuration between farms as consistent as possible. As such I have a single Windows Server 2012 domain controller that also hosts DNS for all of my VMs and a similar 2 server topology for SQL Server and a SharePoint App / WFE server in each farm.
Note: The names and sizes for Azure IaaS VMs have changed since I first rolled them out (remember the days of small / medium / large / extra large for you early adopters?). The Azure folks seem to have standardized on a naming schema of “A” followed by a number now which I appreciate but there is always a chance that things could change again in the future. As such the names and sizes I list will be what is currently offered.
- Domain Controller – I use an A0 instance (shared core, 768 MB) running Server 2012 with a (mostly) scripted out configuration. This includes all of the SharePoint service accounts, OUs, and test accounts that I use. If something were to happen to my domain controller I could easily spin up a new one very quickly. I have talked with a few coworkers who host their AD infrastructure in a Windows Azure Active Directory instance. I am not as familiar with AD in general so I stick with what I know and get by with the PowerShell cmdlets I need to spin up my domain controller.
- SQL Server – I typically use an A2 instance (2 cores, 3.5 GB) running SQL Server 2012 (except for SharePoint 2007 which runs SQL Server 2008 R2). If I am trying to roll out business intelligence features like SSIS, SSAS, etc. I will increase the VM up to an A3 (4 cores, 7 GB). I also have a SQL Server 2014 VM for my secondary SharePoint 2013 farm which is running the latest and greatest of everything (OS, SQL, patch level, etc.).
- SharePoint Server – I run both SharePoint WFE and APP roles off a single A3 instance (4 cores, 7GB) running Server 2012 during normal operation. Similar to my SQL limitation on the SharePoint 2007 farm I am running Server 2008 R2 for that farm. For my 2013 farm this machine also hosts Workflow Manager 1.0 and Visual Studio 2013. If I need extra horsepower or am running all SharePoint services (or even just SharePoint search) I will increase this to an A4 (8 cores, 14GB). I also have (almost) the entire SharePoint install and configuration scripted out.
- Office Web Apps – While it is technically not supported to run Office Web Apps Server 2013 in Azure I do have a working scenario for one of my SharePoint 2013 farms. This VM is an A2 instance (2 cores, 3.5 GB) running Server 2012.
- I originally created a separate network for each farm, but after deciding to utilize a single shared domain controller for all environments I instead switched over to just a single virtual network. This happened before the announcement of being able to create connections between virtual networks and I haven’t revisited this item.
- Special tip on virtual networks. When I first configured a virtual network in my environment I removed the Microsoft DNS IP address that was automatically assigned and instead put in the "local" IP address for my domain controller. This resulted in losing outgoing internet connectivity for all of my VMs. I had to delete and recreate my virtual network with a working configuration (see below): the default Microsoft DNS entry first, followed by my domain controller's IP to handle intra-VM communication.
- Azure portal - https://portal.azure.com/. This is the new Azure portal design that presents a customizable view into your Azure components. Not all of the features are currently supported but Azure web sites, IaaS VMs, and a few others are accessible. For the older “full” portal site check out https://manage.windowsazure.com.
- Azure Commander – This is a universal app for Windows 8.x and Windows Phone 8.x from Wictor Wilen. This app costs a few bucks but is absolutely worth it in my experience. You can import your subscription file and then be able to stop, start, and view your VM instances (along with a few other Azure resources) on either your phone or PC. I find this very helpful when I am presenting at a customer or conference using my Azure VMs and then need to pack up quickly and head out the door or to my next session. I can quickly and easily stop my VMs from my phone as I head on my way.
Windows Store: http://apps.microsoft.com/windows/app/azure-commander/9833284f-a80c-45ec-8710-a5863ec44ae4
Windows Phone Store: http://windowsphone.com/s?appId=39393973-0b68-4201-8ca1-a67af69f5fca
- Portability – As mentioned previously I no longer need to lug around a laptop + power adapter that are 10+ pounds. Instead my new laptop and power adapter are closer to 4 pounds. I can also launch my VMs, kick off some processes, and then shut my laptop and go to another conference room or head home. When I get to my destination I open my laptop and my VMs are still running and ready for me to resume work quickly.
- Flexibility – When I need to tear down and rebuild farms / servers I don’t have to worry about storage or other resources. Previously when I wanted to rebuild a farm I would have to move VHDs or delete old ones in order to make space for the new set before I could delete the old set.
- Connection speed – The primary source of software and applications on my VMs comes from the MSDN subscriber downloads. There must be a mirror of all the products sitting in the rack next to my VMs because on average I get download speeds of 10-50MB/s (yes, that is megabytes, not megabits). Downloading a copy of the SQL Server 2014 ISO takes minutes instead of hours now.
- Cost – As mentioned above my MSDN benefits include $150 per month to use as I see fit on Azure. On average I run my VMs for 5-8hrs a day for 5-10 days in a month. Overall with storage, bandwidth, compute, and other costs I rarely spend more than $50 of that $150 credit. Compared with the electricity costs that I could be incurring from running local VMs in Hyper-V I’ll take my “free” MSDN VMs any day.
- Lack of snapshots – Currently there is no concept of taking a snapshot of a running VM in Azure. You can shut down a VM and copy the VHD blob to another storage account or download a local copy (quite a hefty download), but these options don't work as well when you need to set up a demo and capture it at a specific point so that you can roll back if needed. Given the pace at which the Azure team is rolling out features and new innovations, this might well make it into a future release.
- Requires an internet connection – Despite being in the age of fairly ubiquitous broadband and 4G signal there are still times that I am without a good internet connection. As a backup plan when I don't have an internet connection or good 4G signal on my MiFi device I do have Hyper-V with a local set of VMs on my laptop. I use them maybe once a month if that.
- Limit of cores (for MSDN subscribers) – This is not a limitation for me but it is worth mentioning since some of my fellow PFE coworkers have brought it up. If you are using your MSDN Azure benefits you are limited to a total of 20 cores allocated to your VMs at one time. The most I use at any time is 14 and that is when I have increased the size of SQL and SharePoint and have Office Web Apps running which is rare. For some of my coworkers though they need to run lab environments with a dozen or more VMs to replicate deployments of Lync, Exchange, AD, SharePoint, and more and can easily use upwards of 80 cores. For them their MSDN benefits are not sufficient to run their lab environments.
I was hesitant about using Azure VMs due to fears about racking up costs that I (not my employer) would have to pay along with having to learn a new platform and set of tools. Now I couldn’t imagine having to go back to running my lab environments locally full time. Hopefully some of the tips and processes I covered in this post will encourage you to check out Azure as a replacement for your on-prem dev / test lab environment. You can even get a 1 month Azure trial to try out $200 worth of services.
<Update 2014-08-18> The Office App Model Samples project has been transitioned over to the Office 365 Developer Patterns & Practices GitHub repo. Please use that location going forward for any references.</Update 2014-08-18>
During the SharePoint Conference 2014 I had the pleasure of meeting Vesa Juvonen (@vesajuvonen) and Steve Walker (Linked In) after their session “Real-world examples of FTC to CAM transformations” (video). This was a very valuable session to attend discussing examples of full trust code (FTC) solutions that were re-implemented / re-imagined as app model apps. They also mentioned a new CodePlex project gathering community app model samples called Office App Model Samples (Office AMS).
Over the past few years I've been toying around with various PowerShell scripts to enumerate permissions in an on-premises SharePoint farm (Enumerate SharePoint 2010/2013 Permissions, Enumerate SharePoint 2007 Permissions). I was curious to see if it was possible to enumerate permissions in a SharePoint Online tenant as well. I had tried using the official SharePoint Online Management Shell cmdlets, Gary LaPointe's custom SharePoint Online cmdlets, and my own client side object model (CSOM) PowerShell queries with no luck. Looking through Gary's source code though I found a way to get the permission information I needed via C# code and CSOM. This felt like a great idea to submit to the OfficeAMS project.
I’m happy to announce that my submission Core.PermissionListing is now published in the OfficeAMS project. Keep in mind this is a rough proof of concept. The sample iterates through all non-My Site site collections (something I borrowed from another OfficeAMS solution) in a SharePoint Online tenant and lists out the permissions assigned to groups or users and specifies the permission assigned. The output could definitely be cleaned up but that will be an effort for a later date. Hopefully you will find this and other app model samples useful. If you’d like to contribute or improve upon a solution you find please contact Vesa, Steve, or myself.
I was pleased to present at SharePoint Cincy again for the third year. Geoff and all the organizers do a great job. My presentation this year was “PowerShell for Your SharePoint Tool Belt”. Below are my slides and demo scripts. Thanks for all who attended, I hope you found something that will be useful for you in your work.
Demo PowerShell Scripts
Recently I had a request from a customer to find which SharePoint 2010 / 2013 lists are using InfoPath forms for their data entry (also known as enterprise forms for a SharePoint list). In this post I will show you a PowerShell script to determine if a SharePoint list is using InfoPath forms.
As you may have heard, InfoPath as a product will not be receiving any future releases (see InfoPath roadmap update blog post). Being able to find SharePoint lists using InfoPath forms may be useful to you now.
Special thanks goes out to Joe Rodgers (fellow PFE at Microsoft) who helped me narrow down the specific properties to look at. The property that we want is not at the base of the SPList properties nor on the SPList.Forms properties like I had hoped. Instead you will need to dig a few levels down. I found the property at SPList.ContentTypes.ResourceFolder.Properties["_ipfs_infopathenabled"]. If this setting is true then your list is using InfoPath forms for data entry. If it is false then it is using out of the box SharePoint forms.
$webURL = "<Your Site URL>"
$documentLibraryName = "<name of document library>"
$web = Get-SPWeb $webURL
$list = $web.Lists[$documentLibraryName]
# $true = list uses InfoPath forms; $false = out of the box SharePoint forms
$isUsingInfoPath = $list.ContentTypes.ResourceFolder.Properties["_ipfs_infopathenabled"]
$isUsingInfoPath
This script will determine if a single SharePoint list is using InfoPath forms or not. You could easily expand this to work with multiple lists or sites (similar to my PowerShell Script to Determine Number of Files in SharePoint 2010 or 2013 Document Libraries). Feel free to adapt the above snippet in this post to your needs but please attribute rights if you republish.
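As a starting point for that expansion, here is a minimal sketch that checks every list in a single site and prints the ones using InfoPath forms. It assumes the script runs on a SharePoint farm server with the SharePoint snap-in available; the site URL is a placeholder:

```powershell
# Load the SharePoint snap-in if it is not already loaded
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "<Your Site URL>"
foreach ($list in $web.Lists)
{
    # $true indicates the list uses InfoPath forms for data entry
    $isUsingInfoPath = $list.ContentTypes.ResourceFolder.Properties["_ipfs_infopathenabled"]
    if ($isUsingInfoPath -eq $true)
    {
        Write-Output ("{0} :: {1}" -f $web.Url, $list.Title)
    }
}
$web.Dispose()
```

The same loop could be wrapped in a Get-SPSite / Get-SPWeb iteration to cover an entire web application, at the cost of a longer run time.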