How to increase your code performance when creating SharePoint functionality?

One of the biggest concerns for everyone involved in the development of a software product is time.

If "How long will it take for the process to be completed?" or “What can we improve in order to get good results in the desired range of time?” are some of the questions that raise in your mind, then this article may help you.

Whether you are a SharePoint developer facing a deadline, a business owner who needs things done faster, or just someone curious about how script performance can be improved, this article is aimed at you.

Maybe you are already asking yourself, "Why would I need to speed up my scripts in order to get things done faster?" A simple answer is that time is precious nowadays, and time is money. Surely some of these things matter to you:

  • the quality of the code delivered to the client;
  • how quickly your employees will benefit from an intranet solution;
  • how soon your business will catch up with other companies;
  • ways of becoming more efficient.

One of our articles discusses PnP and how useful it is when developing SharePoint solutions. PnP is a great way to perform actions against SharePoint Online as well as on-premises, because writing and running scripts is much faster than doing the same tasks manually.

PowerShell cmdlets are also useful when it comes to automating processes. PowerShell provides commands that can easily be used to develop features for SharePoint, and it is a popular language in the Windows ecosystem. That flexibility and ease of use can sometimes come at a cost.

Developing in SharePoint by writing code is indeed faster than doing it manually, but there are ways to do it even more efficiently.

Being a SharePoint developer can lead to both easy and hard-to-manage situations. The easy part comes from the Microsoft team, which provides the entire PnP library, nicely structured and with examples of how to use each command. These commands are also easy to learn, especially after writing them a couple of times. The hard part is that SharePoint is not 100% customizable, and some features you may want to implement are no longer supported, so you need to find workarounds to achieve what you want.

Another thing you have probably noticed when running scripts is that they can sometimes be really slow. It is ironic that we often use PnP or PowerShell to accelerate a manual process, only to end up finishing it more slowly than we expected.

But there is an explanation for this. The following points are possible reasons why your script runs slowly:

1. The number of requests

The main reason why things are slow with PnP is that each command represents a request. Let's say that we want to get a list from the current web. We need a command such as:

 Get-PnPList -Identity "target-list"

Basically, the process flow looks like this:

  • a request is sent to the current web to get the list
  • a search/filter process takes place
  • the program waits for the response before moving forward
  • the response is received
  • the script continues its course

Make sure that you are not making unnecessary requests.
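One way to apply this is to batch related lookups into a single request and filter locally. A minimal sketch (the list names are hypothetical):

```powershell
# Inefficient: one network request per list
# $docs  = Get-PnPList -Identity "Documents"
# $tasks = Get-PnPList -Identity "Tasks"

# Better: a single request, then local filtering
$allLists = Get-PnPList
$docs  = $allLists | Where-Object { $_.Title -eq "Documents" }
$tasks = $allLists | Where-Object { $_.Title -eq "Tasks" }
```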

2. Available resources 

If you are working with SharePoint Server, your script may be slow because the machine doesn't have enough resources. If the script is executed on a local farm that has only 2 processor cores, that is a good place to start looking. The minimum requirement for SharePoint Server is 2 cores, but 4 is the better choice; with only two, the machine is more likely to run out of resources. The recommendation is 16 GB of RAM, a 64-bit 4-core processor, 80 GB for the system drive and 100 GB for a second drive.

3. The crawler  

Another element that might be slowing you down is the crawler. Crawling is the mechanism your site uses to scan all the documents inside it. Indexing is the process of sorting and integrating this information into its search database. Once a document has been crawled and indexed, it is added to the site's search index, making it eligible to show up when a user performs a search related to its content.

If you are adding new content while the crawler is crawling the same library, it may significantly impact performance. In the case of a migration, the best practice is to pause the crawler until the migration is complete.
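On SharePoint Server, pausing the crawler can be scripted as well. A hedged sketch using the search service application cmdlets:

```powershell
# Pause all crawls on the default Search service application (SharePoint Server)
$ssa = Get-SPEnterpriseSearchServiceApplication
Suspend-SPEnterpriseSearchServiceApplication -Identity $ssa

# ... run the migration ...

# Resume crawling once the migration is complete
Resume-SPEnterpriseSearchServiceApplication -Identity $ssa
```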

4. Loops 

Loops are commands that we use a lot when writing PnP or PowerShell scripts.

There are several ways of writing loops:

  • for      
  • foreach
  • Foreach-Object

Looping through a collection with foreach is faster than looping with the other options.
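For reference, the three forms look like this (a minimal sketch over a plain array):

```powershell
$items = 1..5

# for: classic index-based loop
for ($i = 0; $i -lt $items.Count; $i++) { $items[$i] }

# foreach: language statement, iterates the collection directly
foreach ($item in $items) { $item }

# ForEach-Object: cmdlet, receives items one by one from the pipeline
$items | ForEach-Object { $_ }
```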

5. The complexity of your command 

Each operation means a query to the database, and the most expensive is SELECT! Keep in mind that you search far more often than you insert, delete or update.
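One way to keep a query cheap is to ask the server only for what you need, for example with a CAML query instead of retrieving every item. A hedged sketch (the list and field names are hypothetical):

```powershell
# Fetch only the items whose Status field is "Open", instead of the whole list
$caml = "<View><Query><Where><Eq><FieldRef Name='Status'/><Value Type='Text'>Open</Value></Eq></Where></Query></View>"
$openItems = Get-PnPListItem -List "Tasks" -Query $caml
```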

6. Error Checking 

For example, you want to check if a list exists, and if not to create it: 

 $list = Get-PnPList -Identity <String> -ErrorAction SilentlyContinue
 if (!$list) {
     $createList = New-PnPList -Title <String> -Template <ListTemplateType>
 }

While you don't see the error, it is still thrown in the background when the list does not exist, which has some performance impact. This should also be considered when you evaluate script performance.
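If the script checks many lists, one way to avoid the hidden exceptions altogether is to fetch the lists once and test locally. A hedged sketch (the list title is hypothetical):

```powershell
$allLists = Get-PnPList   # a single request, no error thrown

if (-not ($allLists | Where-Object { $_.Title -eq "Invoices" })) {
    New-PnPList -Title "Invoices" -Template GenericList
}
```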

Having covered the possible causes, we can always improve. In the next section you will find solutions that you can try.

The following solutions may not bring a significant improvement for small scripts or scripts that iterate over a small number of objects, but they are still worth considering for both code quality and efficiency.

1. Get your script functional first 

This may seem obvious, but before you try to optimize your script, you should first clean it up and make sure it's fully functional. Keep your code clean, and while doing that, remember that somebody new to the project needs to understand it. You don't want to waste time optimizing a script that doesn't work, because that is effort without a result. Always keep in mind this essential programming rule: keep it simple. There is no need to develop more than is required, because most of the time that extra feature will not be needed. So, first of all, write a functional script with clean code.

2. Measure how long your commands take to finish

Time-tracking parts of your script, such as a loop or even a section of a loop, is essential and will give you valuable insight into where to focus your speed improvements. The solution is Measure-Command.

According to the Microsoft documentation, the Measure-Command cmdlet runs a script block or cmdlet internally, times the execution of the operation, and returns the execution time. In most cases, you can just run Measure-Command with your code inside the Expression parameter script block and get what you are looking for. Use this cmdlet to quickly learn where most of the time is being spent. You can divide a loop into multiple sections and use an instance of this cmdlet for each of them, and the result will be exactly what you want: the part that takes the longest to finish.


Also, consider that the result will not be the same every time. A solution to this is to take the average time:
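A minimal sketch of averaging several runs (the measured script block is just a placeholder workload):

```powershell
# Run the same block five times and average the elapsed milliseconds
$runs = 1..5 | ForEach-Object {
    (Measure-Command { 1..10000 | ForEach-Object { $_ * 2 } }).TotalMilliseconds
}
$average = ($runs | Measure-Object -Average).Average
Write-Output "Average: $average ms"
```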


3. Choose the most efficient for loop 

In this comparison, the script connects to a site collection and stores into the $siteCollectionLists variable all the lists found on the site. It then iterates over the lists using foreach, for and ForEach-Object, measuring the execution time in milliseconds. In our test, foreach was the most efficient, taking only 5.42 milliseconds. Keep in mind that the same command may take a different amount of time on a different machine, or even on the same machine at another time. Measure-Command helps detect which part is the most costly.
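A hedged reconstruction of such a comparison (the site URL is a placeholder, and the timings will differ on your machine):

```powershell
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/demo" -Interactive
$siteCollectionLists = Get-PnPList

$tFor = Measure-Command {
    for ($i = 0; $i -lt $siteCollectionLists.Count; $i++) { $siteCollectionLists[$i].Title }
}
$tForeach = Measure-Command {
    foreach ($list in $siteCollectionLists) { $list.Title }
}
$tForeachObject = Measure-Command {
    $siteCollectionLists | ForEach-Object { $_.Title }
}

Write-Output "for:            $($tFor.TotalMilliseconds) ms"
Write-Output "foreach:        $($tForeach.TotalMilliseconds) ms"
Write-Output "ForEach-Object: $($tForeachObject.TotalMilliseconds) ms"
```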


4. Divide the number of objects your script is using 

Scripts are used to perform repetitive tasks. Let's say you have a script that iterates over 60 web applications, all their site collections and lists. If each web application has approximately 100 site collections, this can take almost a week. A good solution is to divide the web applications. You can run the same script from three machines: the first machine runs the script for the first 20 web applications, the second machine for the next 20, and so on. This way, the job is done in less than three days, cutting the execution time by more than half.
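One way to slice the workload per machine (a hedged sketch; Get-SPWebApplication is the SharePoint Server cmdlet):

```powershell
$webApps = Get-SPWebApplication

# Machine 1 processes web applications 1-20; on machine 2 use -Skip 20, and so on
$batch = $webApps | Select-Object -Skip 0 -First 20

foreach ($webApp in $batch) {
    # ... process the site collections and lists of $webApp ...
}
```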

5. Avoid keeping in memory objects that you don’t need 

 a) $siteLists = Get-PnPList (returns all lists in the current web)

Be aware that the $siteLists object holds all the information about the site's lists; it is not just an array of list titles. This may be well known, but it is still worth mentioning.
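If all you need are the titles, keep only the titles. A minimal sketch:

```powershell
# Stores just an array of strings instead of full list objects
$listTitles = Get-PnPList | Select-Object -ExpandProperty Title
```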

 b)      Make sure that you get the objects once, before the iteration begins: 
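A hedged sketch of the idea (the list name is hypothetical):

```powershell
# One request before the loop, instead of one request per iteration
$items = Get-PnPListItem -List "Documents"

foreach ($item in $items) {
    # work with the local object; no extra request here
    Write-Output $item["Title"]
}
```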


 c)      Use the most efficient property of the object: 

For example, use $list.AddItem() instead of $list.Items.Add(). $list.Items.Add() is not the best way because it consumes SQL resources to read all the records in the list (possibly thousands of items), network bandwidth between SharePoint and SQL, and SharePoint RAM to load all these items. $list.AddItem() is definitely the way to go.
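In the server-side object model this looks roughly like the following (a sketch for SharePoint Server; the URL and list name are hypothetical):

```powershell
$web  = Get-SPWeb "http://sp-server/sites/demo"
$list = $web.Lists["Tasks"]

# Inefficient: $list.Items.Add() would load the whole item collection first

# Efficient: creates the new item without loading the existing ones
$item = $list.AddItem()
$item["Title"] = "New task"
$item.Update()

$web.Dispose()
```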

6. Dispose your objects 

More details about how to dispose of objects in SharePoint on-premises can be found here. You don't need this for the online version, because SharePoint Online handles it by itself.
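For SharePoint Server scripts, one common pattern is to let PowerShell track and dispose the objects for you. A hedged sketch (the URL is hypothetical):

```powershell
# Everything retrieved between these two calls is disposed automatically
Start-SPAssignment -Global

$web = Get-SPWeb "http://sp-server/sites/demo"
# ... work with $web ...

Stop-SPAssignment -Global
```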

7. Arrays and the += assignment operator 

It is very easy to read and understand what is going on logically. But there are some serious performance penalties lurking in this pattern.
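A typical example of the pattern (a minimal sketch):

```powershell
# Builds an array by appending with += on every iteration
$results = @()
foreach ($i in 1..1000) {
    $results += $i * 2   # creates a brand-new array each time
}
```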


This approach can be very inefficient because arrays are immutable. Each addition to the array creates a new array big enough to hold all elements of both the left and right operands, then copies the elements of both operands into the new array. For small collections, this overhead may not matter. For large collections, this can be an issue. 

So, each time your loop runs and calls += what happens is: 

1. A new array is created in memory.

2. The old array is copied to the new array with the value being appended at the end.

3. The old array is discarded in favor of the new array. 

One solution is to assign the loop's output directly to the variable:
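A minimal sketch of this approach:

```powershell
# The loop's entire output becomes the value of $results in a single assignment
$results = foreach ($i in 1..1000) {
    $i * 2
}
```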


Another solution is to use ArrayList:
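A minimal sketch using ArrayList:

```powershell
# A resizable collection avoids re-creating the array on every addition
$results = [System.Collections.ArrayList]::new()
foreach ($i in 1..1000) {
    [void]$results.Add($i * 2)   # [void] suppresses the index Add() returns
}
```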


8. Prioritize condition checks from most common to least 

This may sound obvious, but the order in which multilevel if conditions are checked can have an impact on a script's speed. This depends on how many condition checks you have and which condition is likely to occur most often. Windows PowerShell stops checking the remaining conditions as soon as one is met.
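A minimal sketch of the idea (the array and the values are hypothetical):

```powershell
$colors = @("yellow", "yellow", "yellow", "red", "yellow", "blue", "yellow")

foreach ($color in $colors) {
    # "yellow" occurs most often, so it is checked first
    if ($color -eq "yellow")   { "sun" }
    elseif ($color -eq "red")  { "apple" }
    elseif ($color -eq "blue") { "sky" }
}
```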


In this example, the script block iterates over an array of strings. Because the item "yellow" occurs most often, it is more efficient to check for it in the first condition.

9. Avoid Write-Host 

Using Write-Host to display information in the console is considered poor practice. But sometimes it makes sense to use it, for example when debugging or when you want to highlight a message. Instead of Write-Host, consider using Write-Output.

Here are other alternatives to Write-Host:

  • Verbose messages: Write-Verbose
  • Warning messages: Write-Warning
  • Error messages: Write-Error
  • Debug messages: Write-Debug 
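A quick sketch of these cmdlets in action (the messages are placeholders):

```powershell
Write-Output  "Result that goes down the pipeline"
Write-Verbose "Detailed progress information" -Verbose   # -Verbose makes it visible here
Write-Warning "Something looks suspicious"
Write-Error   "Something went wrong"
```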


We have mentioned some causes that can harm script efficiency, but we have also pointed out ways of increasing code performance. In some situations, you may not notice a difference in runtime because the dataset is small. Also, keep in mind that different environments can show different execution times.

Also, it is good to work on both code functionality and performance, even for a small change in the app. You never know when a new decision comes along and all of this will matter.

By improving your script's performance, you cut a significant amount of time, for example when testing or deploying a script.

Which of these techniques are you going to use?

For sure the one that suits your code better. 

Check out: