Ignite 2024 - What's been announced for Microsoft Fabric

Given how popular my thoughts on the Fabcon EU keynote were, I thought I'd do the same with the announcements out of Ignite.


Again, we follow the theme of AI everywhere, with announcements grouped into:

  1. AI-powered Data platform
  2. Open and AI-ready data Lake
  3. AI-enabled business users
  4. Mission-critical foundation

On top of these themes, we're seeing the number of items without CI/CD support being reduced. Long overdue, but at least it's now being resolved - and the gaps should be gone by the end of the year.

AI-powered data platform

This grouping is all about providing data teams with the tools they need to do their jobs.

SQL Databases

First up, we have SQL databases as a new Fabric component - basically a SaaS version of Azure SQL Database, which has been used for years. At face value it seems like a small change, but when you think about it, this looks to be a shift from Fabric being a purely analytical platform to an analytical and operational platform.

Especially when you consider that a significant number of applications and websites are likely to be using Azure SQL databases behind the scenes. If you use this new Fabric object with the GraphQL API, we're closer to the stage of having one license cover a lot of business use cases. It'll be interesting to see how much of the Azure stack gets consumed into Fabric in the future.
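To make that concrete, here's a minimal sketch of what calling a Fabric SQL database through the GraphQL API could look like from an application. The endpoint URL shape, the `customers` field, and the placeholder IDs and token are all assumptions for illustration - the actual schema is generated from whichever tables you expose through your API for GraphQL item.

```python
import json

# Illustrative endpoint only - the real URL is generated when you create
# an API for GraphQL item in your Fabric workspace.
ENDPOINT = (
    "https://api.fabric.microsoft.com/v1/workspaces/<workspace-id>"
    "/graphqlapis/<api-id>/graphql"
)

# Hypothetical query - the `customers` type and its fields would come
# from the tables exposed in your Fabric SQL database.
query = """
query {
  customers(filter: { country: { eq: "GB" } }) {
    items {
      customerId
      name
    }
  }
}
"""

def build_request(query: str, token: str) -> dict:
    """Assemble the pieces an HTTP client (e.g. requests.post) would send:
    the endpoint, a bearer token from Entra ID, and a JSON body wrapping
    the GraphQL query."""
    return {
        "url": ENDPOINT,
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"query": query}),
    }

req = build_request(query, token="<entra-access-token>")
```

From here, a client would POST `req["body"]` to `req["url"]` with those headers; the point is that an operational app and an analytical workload can now hit the same Fabric-hosted database under one license.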

RTI going GA

It's great that Real-Time Intelligence (RTI) is now GA, and with CI/CD integration being added, it feels like Fabric is ready for production RTI workloads. It'll be interesting to see if adoption increases, and what the future roadmap looks like.

Fabric data factory

The ability to import Data Factory pipelines is going to make it so much easier for those migrating existing setups to Fabric. Reducing the migration costs is bound to encourage more businesses to move from platforms like Synapse to Fabric.

Unsurprisingly, we've also seen Copilot expanded into data pipelines. Great for US businesses, but until European customers no longer have to send data to Paris for Copilot (with the caveat it could be sent on to the US), I can't see massive adoption growth this side of the pond. Microsoft need to get more resources into the individual countries for this to make a difference. It's all too high risk for a lot of businesses in the EU at the moment.

Fabric data science

That said, they need more servers: I think AI functions are going to be great for driving up adoption once that issue is solved. Decreasing the setup and maintenance costs of frequent tasks is going to help businesses justify the investment.

But the key is that these must not be black boxes. Instead, Microsoft need to include detail on how they work behind the scenes - otherwise data scientists aren't going to want to adopt them, and they'll carry on building their own.

Fabric data engineering

On the notebook front, live versioning is definitely welcome. Whilst we have GitHub support, I know I've hit points where I've deleted something and wanted to restore it, but hadn't yet checked in my changes. So the combination of the two should give us all the versioning we could want.

I'm really interested to get hands-on with the ArcGIS integration, but at the moment the big unknown is how it's going to be licensed. Will it be free for Fabric users? Or will it require an ArcGIS license? I suspect the latter, but both Microsoft and Esri need to make that clearer.

It's great to see the GraphQL API now being GA, but Microsoft need to release details of the application testing that has been undertaken on this feature - especially if they want it to be used for operational platforms. The first questions clients will have are: who ran it? What standards was it run against? And more. At the moment, I don't feel that's clear enough to be able to say "using it will save money on application testing before go-live".

Whilst penetration testing will always need to be done, we should have some third-party validation when we have these powerful APIs that could open up an entire data estate. I get the argument that it's Microsoft, so we should trust them, but no one is infallible, and with something this critical I just don't think it can be left to chance.

It could be that this is buried down in the Azure docs, but it isn't easy to find if it is - and that's part of the problem. Hopefully Microsoft include this info alongside November's release notes.

Power BI

Great to see TMDL (Tabular Model Definition Language) being integrated into Power BI desktop. It's going to be critical to have semantic models with proper Git integration and merge support. Definitely time to start learning it.

The big one for me is the Copilot-generated report and page summaries in the email copy of Power BI subscriptions. I've seen first-hand how good Copilot is at writing exec summaries. We're now at the point where this is the ideal use for LLMs to add business value. My one concern is that they're going to struggle to understand the why in the context of the business - and that's what's needed to make it useful. For me, the next evolution of Copilot for Power BI is to be able to increase its knowledge beyond what's in the semantic model.

For example, it would be great to consume the business strategy, marketing plan, or third-party data sets you don't have (e.g. weather, the Bank of England base rate, exchange rates, EU discussion articles, etc.) as part of the grounding data without loads of integration effort. It'll be interesting to see how this develops - maybe they could lean on Snowflake to access data in the Snowflake Marketplace.

Open and AI-ready data lake

Our attention next turns to OneLake. In this space we've only really got one feature. Whilst open mirroring was also announced, I don't think it's going to be relevant for everyone - it will depend on the security restrictions in place at the moment.

OneLake Catalog

I'm a little baffled by this one. With all this new functionality now in Fabric, it removes a lot of the value of paying for Purview. Some of the things in Purview, like data quality, aren't at the same standard as its competitors. For me, it feels like this feature is cannibalising Purview sales in the SME segment.

One to keep an eye on, as either Purview is going to need to mature rapidly, or this could be the first step in Microsoft giving up on Purview as a separate product (which they should do, in my opinion).

AI-enabled business users

The expansion of AI skills to cover semantic models, Eventhouses, and multiple sources is great. Combined with the native integration with Azure AI Agent Service and Copilot Studio, it means it's going to be easier than ever to democratise data whilst lowering the technical entry point.

For me, these have far more business value than Copilot does today.

Mission-critical foundation

The big announcement in this space is workspace monitoring. Microsoft providing a monitoring solution out of the box, that can be turned on at the flick of a switch and easily extended, is a huge step forwards. It'll be interesting to see the number of accelerators for front-end reports that spring up around this addition.

I said earlier that we needed to watch what came out for Purview, and sadly I think the protection policy integration misses the mark. Given a lot of setups will be using Entra groups to manage access (if not, why not?), it's going to be a one-and-done thing that'll normally be managed at a workspace level. I can't see much, if any, value in this feature on its own.

However, the saving grace is the integration of DLP (data loss prevention). This is probably the one reason today to consider Purview. The ability to stop personal data being uploaded into the wrong workspace is worth it alone if that's a risk your business carries. The only question is whether it will stack up under the new licensing model - we'll have to wait until Jan '25 to answer that one.

Lastly, it's a bit of an edge case for some, but multi-tenant support is definitely welcome. I just hope it supports Entra B2B, to make carrying Power BI licenses over for development by consultancies in a client environment a lot easier.


Billing and consumption updates

In something that could have gone unnoticed, the announcement around sharing Copilot capabilities is massive. It now means that if you have Copilot for production, you can stand up a separate, lower-SKU capacity for dev/test, and developers can still use Copilot to increase their productivity. But, again, without solving the EU availability issue, it isn't going to have the impact it should.

Instead, it solves one of the grumbles I'd been hearing about needing multiple F64 SKUs to make the most of Copilot. Hopefully Microsoft also listen to the other grumble: not all organisations need an F64, but they still want access to Copilot.

Comments

  1. So with all the Fabric announcements, it's my understanding that Copilot in Fabric will still require an F64 capacity. Is that correct? I was speculating that they would announce new options for bringing Copilot to a lower tier, such as an F16.

    1. Sorry, I've only just seen this comment. Yes, that was the case at the time, but have a look at this week's announcements, as things have significantly changed.

