Intune has an integrated function to collect logs that can be really helpful for troubleshooting issues on Windows clients. But what do you do if the desired log is not part of this collection? Well, we can collect it pretty easily ourselves, using a custom on-demand remediation script in Intune and an Azure Blob Storage account to save our logs.
Azure Storage Account
Let’s start with the Azure Storage Account as the basis for our logs. If you already have an Azure Storage Account available, you can skip this part and jump directly to the preparation of the needed files and folders: Prepare storage account
In the Azure Marketplace search for “storage” and select “Storage account”
Give it a unique name and select the region where you want to deploy your storage account. Since the logs I want to collect are not critical at all, I went with the cheapest redundancy option (LRS) to save some money.
For all other settings, select whatever matches your environment; for my test lab, I went with the defaults.
Once you are happy with your settings, let us deploy the storage account:
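If you prefer scripting the deployment instead of clicking through the portal, here is a minimal sketch using the Az PowerShell module; the resource group name, account name, and region are just example values for this post, so adjust them to your environment:

# Minimal sketch with the Az PowerShell module (Az.Resources / Az.Storage).
# Resource group, account name, and region are placeholders.
Connect-AzAccount
New-AzResourceGroup -Name "rg-logcollection" -Location "westeurope"
New-AzStorageAccount -ResourceGroupName "rg-logcollection" `
    -Name "stlogcollection001" `
    -Location "westeurope" `
    -SkuName "Standard_LRS" `
    -Kind "StorageV2"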
Prepare storage account
Now that our storage account is ready, let’s prepare everything we need. We will create two containers, one for our resource files and one for our log files.
Simply click on “Add container” and create the two containers:
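The same two containers can also be created from PowerShell; a small sketch with the Az.Storage module, using the account name from the example above and the container names from this walkthrough:

# Sketch: create the "resources" and "logfiles" containers (assumes Connect-AzAccount was already run).
$key = (Get-AzStorageAccountKey -ResourceGroupName "rg-logcollection" -Name "stlogcollection001")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "stlogcollection001" -StorageAccountKey $key
New-AzStorageContainer -Name "resources" -Context $ctx
New-AzStorageContainer -Name "logfiles" -Context $ctx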
Next, we need to upload a copy of AzCopy to the resources container. We will use this tool later in the client script to upload our logs to the blob storage.
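In case you want to script this step as well, a sketch of downloading AzCopy v10 from Microsoft’s aka.ms link and pushing azcopy.exe into the resources container could look like this (local paths and account names are placeholders):

# Sketch: download AzCopy v10, extract it, and upload azcopy.exe to the "resources" container.
Invoke-WebRequest -Uri "https://aka.ms/downloadazcopy-v10-windows" -OutFile "$env:TEMP\azcopy.zip"
Expand-Archive -Path "$env:TEMP\azcopy.zip" -DestinationPath "$env:TEMP\azcopy" -Force
$exe = Get-ChildItem -Path "$env:TEMP\azcopy" -Filter "azcopy.exe" -Recurse | Select-Object -First 1

$key = (Get-AzStorageAccountKey -ResourceGroupName "rg-logcollection" -Name "stlogcollection001")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "stlogcollection001" -StorageAccountKey $key
Set-AzStorageBlobContent -File $exe.FullName -Container "resources" -Blob "azcopy.exe" -Context $ctx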
Now that AzCopy.exe is available, we need to create a SAS URL for this file.
Read permissions are fine for this file. Select the duration of the download link; for this demo, I went with one year. Modify it to match your needs, but be aware that the log collection will stop working once the token expires, and you will then need to create a new SAS token and update the script.
Once created, copy the Blob SAS URL, as we will need it in the PowerShell script:
Last but not least, we need to create a second SAS token for our logfiles container.
Here we need write permissions to be able to upload our logfiles. Again, copy the Blob SAS URL for later usage.
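Both SAS URLs can also be generated from PowerShell instead of the portal; a sketch, assuming the account and container names used above and a one-year expiry:

# Sketch: create the two SAS URLs with Az.Storage, matching the permissions described above.
$key = (Get-AzStorageAccountKey -ResourceGroupName "rg-logcollection" -Name "stlogcollection001")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "stlogcollection001" -StorageAccountKey $key

# Read-only SAS URL for the uploaded azcopy.exe, valid for one year
$AzCopySasUrl = New-AzStorageBlobSASToken -Container "resources" -Blob "azcopy.exe" `
    -Permission "r" -ExpiryTime (Get-Date).AddYears(1) -FullUri -Context $ctx

# Write SAS URL for the logfiles container, used by the client to upload the archives
$LogsSasUrl = New-AzStorageContainerSASToken -Name "logfiles" `
    -Permission "w" -ExpiryTime (Get-Date).AddYears(1) -FullUri -Context $ctx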
Remediation script
Now that we have all prerequisites fulfilled, we can start with the remediation script to collect the logs. The script itself uses the following logic:
-> Define storage account properties
-> Create Temp Folder to collect Logs
-> Copy Logs to Temp Folder (Sample script collects logs from the Windows App)
-> Compress Logs
-> Download AzCopy Tool
-> Upload Logs
-> Cleanup
You can find the sample script on my GitHub: Scripts/Log Collection.ps1 at main · mmeierm/Scripts (github.com)
You can obviously modify the script to collect whatever you want; just adjust the part from line 8 to 24 to copy the files you need into the “$Path” variable.
Once you have modified everything you want, just copy the two previously saved Blob SAS URLs into the script:
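If you just want to see the shape of the script before opening GitHub, here is a minimal sketch of that flow. It is not the exact script from the repository; the two SAS URLs and the source log path are placeholders you have to replace:

# Sketch of the log collection flow: collect, compress, upload via AzCopy, clean up.
$AzCopySasUrl       = "<Blob SAS URL of azcopy.exe in the resources container>"
$LogContainerSasUrl = "<SAS URL of the logfiles container with write permission>"

# Create a temp folder to collect the logs
$Path = Join-Path $env:TEMP "LogCollection"
New-Item -Path $Path -ItemType Directory -Force | Out-Null

# Copy the logs you are interested in into $Path (placeholder source path)
Copy-Item -Path "C:\Path\To\Your\Logs\*" -Destination $Path -Recurse -ErrorAction SilentlyContinue

# Compress the collected logs into one archive named after the device
$Zip = Join-Path $env:TEMP "$($env:COMPUTERNAME)-Logs.zip"
Compress-Archive -Path "$Path\*" -DestinationPath $Zip -Force

# Download AzCopy from the resources container and upload the archive
$AzCopy = Join-Path $env:TEMP "azcopy.exe"
Invoke-WebRequest -Uri $AzCopySasUrl -OutFile $AzCopy -UseBasicParsing
& $AzCopy copy $Zip $LogContainerSasUrl

# Cleanup
Remove-Item -Path $Path, $Zip, $AzCopy -Recurse -Force -ErrorAction SilentlyContinue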
Intune Remediation
Now that we have our script ready, we can finally upload it to Intune, so that we have it available when needed.
Give it a name:
Upload the script file as detection script and make sure to select “Run script in 64-bit PowerShell”
Assign scope tags if needed. There is no need to assign the remediation, as we are using it on demand:
Run the remediation
To run the log collection, select the device and choose Run remediation:
In the list, select our freshly created remediation:
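As a side note, the same on-demand run can also be triggered programmatically through the Graph beta API. A rough sketch with the Microsoft.Graph PowerShell module; beta endpoints can change, and both IDs are placeholders:

# Sketch: trigger the on-demand remediation via the Graph beta API (subject to change).
Connect-MgGraph -Scopes "DeviceManagementManagedDevices.PrivilegedOperations.All"

$deviceId      = "<Intune managed device id>"
$remediationId = "<id of the remediation script package>"

Invoke-MgGraphRequest -Method POST `
    -Uri "https://graph.microsoft.com/beta/deviceManagement/managedDevices/$deviceId/initiateOnDemandProactiveRemediation" `
    -Body @{ scriptPolicyId = $remediationId }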
And wait patiently for our logs to appear in our storage account.
A few minutes later, we have the expected logs from the Windows App available for troubleshooting:
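If you do not want to keep refreshing the portal, you can also list the container from PowerShell to see what has arrived; again a sketch using the names from this post:

# Sketch: list the uploaded log archives in the "logfiles" container.
$key = (Get-AzStorageAccountKey -ResourceGroupName "rg-logcollection" -Name "stlogcollection001")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "stlogcollection001" -StorageAccountKey $key
Get-AzStorageBlob -Container "logfiles" -Context $ctx |
    Sort-Object LastModified -Descending |
    Select-Object Name, LastModified, Length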
Conclusion
Having something like this available in your toolbox makes troubleshooting so much easier for me, since you don’t have to rely on the user to get you the logs for an issue they reported.