Microsoft Fabric: semantic models, dynamic row level security, and DirectLake mode

Some of you will be familiar with dynamic row level security (DRLS) in Power BI. For those who aren't, DRLS lets us use the email address of the user who is logged into the service via their Entra ID account. With that email address, and a corresponding email address column in our semantic model, we can dynamically filter the semantic model so that each user only sees the data they are allowed to see.


The result: we can use one report suite to serve the same content to country, region, and global teams whilst simultaneously providing the necessary security around sensitive data. Back in the world of Power BI, that was really easy. We fired up Power BI Desktop, went into Manage roles, and added the necessary DAX:
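By way of illustration, here is a minimal sketch of what that row filter typically looks like, assuming a hypothetical 'Users' table with an [EmailAddress] column (substitute your own table and column names):

```dax
-- Minimal dynamic RLS filter on a hypothetical 'Users' table:
-- keep only the rows whose email address matches the signed-in user's UPN.
'Users'[EmailAddress] = USERPRINCIPALNAME ()
```

Provided 'Users' filters your fact tables through the model's relationships, that single expression is enough to restrict what each user sees.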



Now in Fabric, that still works exactly the same way as it always has for Import and DirectQuery semantic models. However, it's not so straightforward for DirectLake semantic models.


Why? The challenge is that a number of features break DirectLake connections and force the model back into DirectQuery mode. In fact, existing Power BI developers need to change the way they currently work to guarantee that Fabric doesn't drop back into DirectQuery mode. That change is to not use Power BI Desktop and instead do all of your implementation in the service. I know, completely counterintuitive; but trust me, it makes your life easier in the long run. I would do this at least until DirectLake mode is fully supported within Desktop and using features such as what-if parameters no longer results in DirectLake mode dropping back to DirectQuery mode.


With all these challenges, how do we implement DRLS with DirectLake mode in Microsoft Fabric? Before we begin, we have some general rules to remember:

  1. You must have a Lakehouse or Warehouse in Fabric containing one or more delta tables.
  2. Create any static data as physical delta tables within your Lakehouse. You can do this either with a maintained Notebook or with Dataflows.
  3. If you are using a Lakehouse with a medallion architecture, always create your gold layer as delta tables. This will ensure that these tables are added to your SQL endpoint automatically.
  4. Create yourself a service principal before you start. This will be used to ensure that the semantic model runs as that principal, meaning it always has the full context of the model. If we don't do this, the model will run in the context of the calling user's Entra ID and we may not be able to see the objects we need in order to apply DRLS. The rest of the blog assumes that you have the details for this service principal before we begin.
  5. Don't apply any security to the objects in the SQL endpoint. Doing so will cause all Power BI reports to drop back to DirectQuery mode. This is because the engine has to pass the query back to the SQL endpoint to evaluate the security context, and so DirectLake mode can't be used.

For the purposes of this blog, I'm going to be working with a Lakehouse. For those working with Warehouses, the process is exactly the same from the point that you click the New semantic model button on the Reporting ribbon of the Warehouse view.

Before we start looking at how we apply DRLS, one sanity point: you can't add DRLS to the default semantic model on a SQL endpoint. If you go into that semantic model, you will find that the option to manage security roles simply isn't there.


If you've built on the default semantic model, then I'm afraid you are going to have to completely rebuild your semantic model. In fact, I would go one step further and add the rule of thumb to never use the default semantic model; the disadvantages are too great (e.g. no DRLS, no calculation groups, etc.). I wish Microsoft would remove it by default, but I haven't seen anything to say they will so far.

To create the semantic model we need, go to your Lakehouse/SQL endpoint/Warehouse and click New semantic model on the Reporting ribbon.


At this point, select the tables you want to include from the SQL endpoint and give your model a name before hitting Continue. Once you've done this, build out your model and add all your measures. Next, click on Manage roles and set up your DRLS exactly as you would in Power BI Desktop:
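As a sketch, a common pattern here is a disconnected mapping table that pairs user emails with the keys they are allowed to see. The names below ('UserCountryMapping', 'Country', and their columns) are hypothetical; adapt them to your model:

```dax
-- Row filter on the 'Country' dimension, driven by a hypothetical
-- 'UserCountryMapping' delta table with one row per (UserEmail, CountryKey)
-- pair that the user is permitted to see.
'Country'[CountryKey]
    IN SELECTCOLUMNS (
        FILTER (
            'UserCountryMapping',
            'UserCountryMapping'[UserEmail] = USERPRINCIPALNAME ()
        ),
        "CountryKey", 'UserCountryMapping'[CountryKey]
    )
```

Because the mapping lives in a delta table (see rule 2 above), you can change who sees which countries by updating data rather than editing the model.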


Before we go any further, go through all your tables and make sure they are running in DirectLake mode. How can we tell? You should see the icon below against the table name:
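If you'd rather not rely on spotting the icon, one alternative check (assuming the DAX INFO functions are available to you, for example in DAX query view) is to list the model's partitions and eyeball their storage mode; any partition reporting DirectQuery means something has forced a fallback:

```dax
-- Sanity check: list each partition and its storage mode so you can
-- confirm nothing has silently fallen back to DirectQuery.
EVALUATE
SELECTCOLUMNS (
    INFO.PARTITIONS (),
    "Partition", [Name],
    "Mode", [Mode]
)
```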


Assuming this all still looks good, we're on the home straight. The last step is to get your semantic model to run as your service principal. To do that, we can follow the Microsoft Learn page explaining how to specify a fixed identity for a Direct Lake semantic model.

With that done, we should now have a semantic model with DRLS, running in DirectLake mode.



