Microsoft Sentinel RBAC options
In this blog post, I’ll show you the options that you have to assign permissions to Microsoft Sentinel. There are different options available:
1.) Default Microsoft Sentinel roles
2.) Custom RBAC roles
3.) Custom RBAC roles assigned to tables
4.) Custom table permission option
I believe more options will become available shortly, and I’ll give you an update when they arrive, but let’s start with the current options.
1.) Default Microsoft Sentinel roles
In Microsoft Azure, there are many predefined RBAC (role-based access control) roles available. In this huge list of roles, there are some Microsoft Sentinel roles included, which play an important role in each of my projects. Okay, so let’s start with the list of roles, and then go a little bit more into the details:
- Microsoft Sentinel Reader
- Microsoft Sentinel Responder
- Microsoft Sentinel Contributor
- Microsoft Sentinel Automation Contributor
- Logic App Contributor
There are additional roles available, but in the list above, I’ve described the most important ones. Let’s go a bit more into the details.
Microsoft Sentinel Reader
This is the default “view” permission for Microsoft Sentinel. It includes the permissions to view data, incidents, workbooks, and other Microsoft Sentinel resources.
It’s the most common permission for Microsoft Sentinel viewers.
Microsoft Sentinel Responder
This is the default “work” permission for Microsoft Sentinel. In addition to the Microsoft Sentinel Reader role, it includes the permissions to manage incidents (assign, dismiss, and change them).
It’s the most common permission for Microsoft Sentinel Security analysts.
!IMPORTANT! To enable Security analysts to do their work, you have to assign the “Logic App Contributor” role too, which is described below.
Microsoft Sentinel Contributor
This is the default “change” permission for Microsoft Sentinel. In addition to the Microsoft Sentinel Reader/Responder roles, it includes the permissions to create and edit workbooks, analytics rules, and other Microsoft Sentinel resources.
It’s the most common permission for Microsoft Sentinel Security engineers.
!IMPORTANT! To enable Security engineers to do their work, you have to assign the “Logic App Contributor” role too, which is described below.
Microsoft Sentinel Automation Contributor
This role is a special one. It’s not meant for user accounts; with this role, you give Microsoft Sentinel the permission to add playbooks to automation rules.
It’s required for Microsoft Sentinel to run playbooks from automation rules.
!IMPORTANT! Once more, it’s not meant for user accounts!
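By the way, you can do that role assignment with the Az PowerShell module too. A minimal sketch, assuming the Microsoft Sentinel service principal is named “Azure Security Insights” in your tenant and “<PlaybookResourceGroup>” is a placeholder for the resource group that holds your playbooks:
# Assign the role to the Microsoft Sentinel service principal, not to a user account
$sentinelSp = Get-AzADServicePrincipal -DisplayName "Azure Security Insights"
New-AzRoleAssignment -ObjectId $sentinelSp.Id `
    -RoleDefinitionName "Microsoft Sentinel Automation Contributor" `
    -ResourceGroupName "<PlaybookResourceGroup>"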
Logic App Contributor
This role isn’t designed for Microsoft Sentinel only. It’s a default role for Azure Logic Apps, and you should use it in your Microsoft Sentinel architecture too. It’s required to execute playbooks.
Security analysts and engineers need it to implement SOAR (Security Orchestration, Automation, and Response).
Where can I find those roles?
Okay, the default roles are described in detail, but where can I find them? It’s pretty simple: open the “Access Control (IAM)” section on your Microsoft Sentinel resource group, and you will see all roles.
One additional IMPORTANT tip: always assign an Azure AD group to RBAC roles. It’s nothing new, but really important.
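A minimal sketch of such a group-based assignment with the Az PowerShell module; the group name “SOC-Analysts” and the resource group placeholder are just examples:
# Assign a Sentinel role to an Azure AD group instead of single users
$group = Get-AzADGroup -DisplayName "SOC-Analysts"
New-AzRoleAssignment -ObjectId $group.Id `
    -RoleDefinitionName "Microsoft Sentinel Responder" `
    -ResourceGroupName "<SentinelResourceGroup>"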
There are additional permissions available, but not only for Microsoft Sentinel; those roles are the “default” Azure roles like Owner, Contributor, and Reader.
Keep in mind, when you use one of those default Azure roles, you assign many more permissions than needed!
2.) Custom RBAC roles
In Microsoft Azure, you can always use the default roles (described above) or create your own custom RBAC roles. Meanwhile, there are different options available to create custom RBAC roles.
To implement custom RBAC role permissions, it’s required to know which Azure Resource Providers are available to create those permissions. The most important resource providers for Microsoft Sentinel are Microsoft.SecurityInsights and Microsoft.OperationalInsights.
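If you want to browse the available actions of those two resource providers yourself, here is a small sketch with the Az PowerShell module:
# List all permission actions of the Microsoft Sentinel related resource providers
Get-AzProviderOperation "Microsoft.SecurityInsights/*" | Select-Object Operation, Description
Get-AzProviderOperation "Microsoft.OperationalInsights/*" | Select-Object Operation, Description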
The steps to define custom permissions in the Azure Portal are pretty simple, but you have to know what you want, and the permission list is really huge. Here are the steps to implement a new custom RBAC role.
To define new permissions inside the Azure Portal, please select your Subscription/Resource group (or another level) and go to the section “Access Control (IAM)“. When you select “Add“, you find the option “Add custom role“.
In the first section of the wizard, you have to define the custom role name, the description, and the baseline permissions.
Okay, the first two are clear, but the last setting has different options. Let’s go into the details.
First, you can clone an existing role. Cool, and I often use that option.
The second option is much more difficult: you can start from scratch, where nothing is predefined.
The last option is the hardest one, but the most important: you can define your permissions from a JSON template. The cool thing here: it’s IaC, and you can reuse that permission set in another environment too (see the sketch below). But in our case, we start from scratch and create a completely new permission.
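To give you an idea, here is a minimal sketch of such a JSON template, deployed with the Az PowerShell module; the role name, the selected actions, and <SubscriptionID> are just examples:
# Example only: a custom role that can read Microsoft Sentinel incidents
$roleJson = @"
{
  "Name": "Sentinel Incident Reader (Custom)",
  "IsCustom": true,
  "Description": "Read-only access to Microsoft Sentinel incidents.",
  "Actions": [
    "Microsoft.OperationalInsights/workspaces/read",
    "Microsoft.SecurityInsights/incidents/read"
  ],
  "NotActions": [],
  "AssignableScopes": [ "/subscriptions/<SubscriptionID>" ]
}
"@
# Save the template and create the role
$roleJson | Set-Content -Path .\SentinelIncidentReader.json
New-AzRoleDefinition -InputFile .\SentinelIncidentReader.json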
Great, now we come to the most difficult section: we have to define the fine-granular permissions based on the Azure Resource providers. I’ll show you a “short” list of the options that you have. For Microsoft Sentinel, the two Resource providers listed above (Microsoft.SecurityInsights and Microsoft.OperationalInsights) are the most important ones, so please think about what you need, and then select the permissions.
Always keep in mind, you can define permissions for different “Actions” in Microsoft Sentinel.
The last step is also a really important one. In the “Assignable scopes” section, you have to define where the custom RBAC permission is available. My recommendation here: always assign the permission subscription-wide, or, if you need it on other subscriptions too, management-group-wide. In some situations, an Azure resource group scope could be fine, but it’s an exception.
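In the JSON template from the sketch above, that’s the “AssignableScopes” property; the two variants (IDs are placeholders) look like this:
"AssignableScopes": [ "/subscriptions/<SubscriptionID>" ]
or, management-group-wide (only one management group is allowed here):
"AssignableScopes": [ "/providers/Microsoft.Management/managementGroups/<ManagementGroupID>" ]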
3.) Custom RBAC roles assigned to tables
When you think about custom Microsoft Sentinel permissions, you can assign Azure Resource provider “actions” to Microsoft Sentinel, and also to the Azure Log Analytics tables that are automatically created by Microsoft Sentinel and its “Data Connectors”. In this section, I’ll show you where you can find those tables and permissions.
Okay, so let’s start. First, we have to find the automatically created tables. Please open the Azure Log Analytics workspace, which is the foundation of Microsoft Sentinel, and go to the “Logs” section.
All Azure Log Analytics tables are shown, and you can see different areas.
Based on the connected solutions or data sources in Microsoft Sentinel, you’ll see more and more tables. One IMPORTANT piece of information here: you can assign permissions to all tables except the custom logs this way! Damn, that’s not good: for custom logs, you can only assign permissions to the whole “Custom Logs” area, and that’s not really good in many situations, but we will discuss this in the next section.
Where can I find those permissions? Please start with the same process described in “Custom RBAC roles” above, but when it comes to defining the resource provider permissions, search for the Azure Resource provider Microsoft.OperationalInsights.
Only that Azure Resource provider includes the table permission actions; they follow the pattern Microsoft.OperationalInsights/workspaces/query/<TableName>/read.
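To make that concrete, here is a hedged sketch of a table-scoped custom role; the table names (SecurityEvent, Syslog) and <SubscriptionID> are just examples:
# Example only: read access to selected Log Analytics tables
$tableRoleJson = @"
{
  "Name": "Sentinel Table Reader (Custom)",
  "IsCustom": true,
  "Description": "Read access to selected Log Analytics tables only.",
  "Actions": [
    "Microsoft.OperationalInsights/workspaces/read",
    "Microsoft.OperationalInsights/workspaces/query/read",
    "Microsoft.OperationalInsights/workspaces/query/SecurityEvent/read",
    "Microsoft.OperationalInsights/workspaces/query/Syslog/read"
  ],
  "NotActions": [],
  "AssignableScopes": [ "/subscriptions/<SubscriptionID>" ]
}
"@
$tableRoleJson | Set-Content -Path .\SentinelTableReader.json
New-AzRoleDefinition -InputFile .\SentinelTableReader.json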
4.) Custom table permission option
I’ve described the custom table permissions in the section above, and also the “problem” with them. When you query the permissions, you can find the Azure Resource provider table permission named Microsoft.OperationalInsights/workspaces/query/Tables.Custom/read,
which indicates the permission for the whole “Custom Logs” area. Bad, really bad, but there is another option. Where do the custom logs “normally” come from? From custom solutions, and in many situations, you can change the new item before it reaches the Azure Log Analytics store.
In my description, I show you the scripting (PowerShell) option that you have, but there is more. Think about log collection based on CEF (Common Event Format) or Syslog. In that case, you have to define one or (better) more collectors which send the data from On-Prem to the cloud. Those forwarders also include the “ResourceID” of the collector, and that’s important (I’ll describe this in the next sections). This scenario is named “Log forwarding collection“.
With the “Log forwarding collection“, you have one problem! Think about the following situation:
Your company installed two or more VMs to collect Syslog or CEF events On-Prem and send those events to the Azure Log Analytics workspace. The company has different firewalls in different locations (HQ, branch office) and also has firewall admins in those locations.
Now we have a bad situation: all logs are sent over the same Azure Log Analytics collectors to Azure, which means the same resourceId for all logs in the same custom log table.
If each firewall admin should only see his own logs in that environment, we have a problem. To solve that problem, you can use the “Logstash collection” option.
But in my blog post, I want to demonstrate the option that you have with a custom-built PowerShell script. Why PowerShell? From my point of view, that option gives you the flexibility to write code without the need for a development team. For sure, you can also achieve the requirement with a .NET application, but let’s start with PowerShell first. We have the following situation:
You host an application On-Prem that holds important pieces of information for your SOC team. It’s a custom-developed application and includes the option to get the required information over an API.
You write a custom PowerShell script with a scheduled execution (e.g., Azure Automation) and write that information to Azure Log Analytics.
It’s pretty simple to achieve that requirement because Azure Log Analytics includes an HTTP Data Collector API to send data over, e.g., PowerShell. Here is a sample script (excluding the resourceID):
#region Replace with your Workspace ID
$CustomerId = "<WorkspaceID>"
#endregion
#region Replace with your Primary Key
$SharedKey = "<WorkspaceKey>"
#endregion
# Specify the name of the record type that you'll be creating
# (the table shows up in Log Analytics as CustomTablePerm_CL)
$LogType = "CustomTablePerm"
# You can use an optional field to specify the timestamp from the data. If the time field is not specified, Azure Monitor assumes the time is the message ingestion time
$TimeStampField = ""
# Create two records with the same set of properties to create
$json = @"
[{ "StringValue": "SharedWorkbook02-Data01",
"NumberValue": 42,
"BooleanValue": true,
"DateValue": "2019-09-12T20:00:00.625Z",
"GUIDValue": "9909ED01-A74C-4874-8ABF-D2678E3AE23D"
},
{ "StringValue": "SharedWorkbook02-Data01",
"NumberValue": 43,
"BooleanValue": false,
"DateValue": "2019-09-12T20:00:00.625Z",
"GUIDValue": "8809ED01-A74C-4874-8ABF-D2678E3AE23D"
}]
"@
# Create the function to create the authorization signature
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource)
{
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $customerId,$encodedHash
    return $authorization
}
# Create the function to create and post the request
Function Post-LogAnalyticsData($customerId, $sharedKey, $body, $logType)
{
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = Build-Signature `
        -customerId $customerId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}
# Submit the data to the API endpoint
Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType
Okay, and what is a resourceID? Every time you create a resource in Azure, that resource gets a unique resourceID that looks like the following:
/subscriptions/<SubscriptionID>/resourceGroups/<ResourceGroupName>/providers/<ResourceProviderNamespace>/<ResourceType>/<ResourceName>
You can use exactly that in your script, but what are useful scenarios? Think about the following:
The events of your custom On-Prem solution (described above) should be visible to your customers in Azure. You now have the option to bring that data to a separate Azure Dashboard.
Create a new Azure Dashboard, share that Dashboard, and use that resourceID in your script.
1.) Create a new Azure Dashboard
Define a custom resource group, or use the “default” Azure resource group, to save the new Azure dashboard.
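If you don’t want to copy the resourceID manually from the portal, here is a small sketch with the Az PowerShell module; the resource group and dashboard names are placeholders:
# Read the resourceID of the shared dashboard
$dashboard = Get-AzResource -ResourceGroupName "dashboards" `
    -ResourceType "Microsoft.Portal/dashboards" -Name "<DashboardName>"
$dashboard.ResourceId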
Now copy the resourceID of the newly created dashboard and add a new line “x-ms-AzureResourceId” = “<resourceid>”; to the $headers hashtable inside the Post-LogAnalyticsData function:
    $headers = @{
        "Authorization" = $signature;
        "Log-Type" = $logType;
        "x-ms-date" = $rfc1123date;
        "time-generated-field" = $TimeStampField;
        "x-ms-AzureResourceId" = "<resourceid>";
    }
The rest of the script stays exactly the same as the sample above.
When you add data to your Log Analytics workspace, you’ll find the new custom table (CustomTablePerm_CL), including the _ResourceId entry for each row.
Now you can give the customers permission to read exactly that, and only that, information from your Azure Log Analytics store. The only thing you have to do is assign the customer to the Azure Dashboard and add the Azure Log Analytics query to the shared dashboard.
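A minimal sketch of that assignment; the sign-in name and <resourceid> are placeholders. The idea, assuming the workspace uses the resource-context access mode: read access on the dashboard resource also allows the customer to query the log records tagged with that _ResourceId.
# Grant the customer read access on the dashboard resource
New-AzRoleAssignment -SignInName "customer@contoso.com" `
    -RoleDefinitionName "Reader" -Scope "<resourceid>"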
Finally
You’ve now seen a lot about Microsoft Sentinel and its RBAC permissions. Some permissions are really simple and quickly implemented; other permissions are more complex. If you want to see the differences in a live demo, have a look at my YouTube video below.
Get more information on my channels
Feel free to write me an e-mail or ping me on the social channels listed below.