Frequently Asked Questions (FAQ)
What can I do with Semantic Link Labs?
Semantic Link and Semantic Link Labs together bring a broad set of capabilities for Power BI / Fabric admins and developers into a single tool, and those capabilities can be combined with one another. Here is a list of some of the primary features available. Much of the library focuses on semantic models and reports, but quite a few features are available for other Fabric items as well. For a more comprehensive list of scenarios, see the Featured Scenarios.
- Model Best Practice Analyzer
- View & edit model metadata via the Tabular Object Model (TOM)
- Vertipaq Analyzer
- Report Best Practice Analyzer
- View & edit report metadata via the ReportWrapper
- Run DAX, DMV, XMLA, or TMSL against a semantic model
- Run traces against semantic models
- Easy access to Power BI, Fabric, Azure, Microsoft Graph APIs
- Workspace Monitoring
- Access to AI (i.e. translations) and other python libraries
How does Semantic Link Labs relate to Semantic Link?
Semantic Link Labs is intended to be used in conjunction with Semantic Link; it does not replace it. The functions in Semantic Link Labs are net new relative to Semantic Link (at least at the time they were built). The goal is for Semantic Link Labs to serve as a continuous 'public preview': when a function is ready, it moves into Semantic Link. Therefore, you may find functions that exist in both libraries, but that is likely because they have been promoted from Semantic Link Labs into Semantic Link. Semantic Link's functions can be found here.
Do I need to know Python to use Semantic Link Labs?
Absolutely not. In fact, this library was designed to make notebooks more approachable for less technical folks. If you look at the code examples, you will find that much of the functionality in Semantic Link Labs can be used without knowing much Python at all: simply enter the parameter values into the function, run the notebook, and view the results. Naturally, if you are more adept at Python you can leverage this library to do even more, but that is not a requirement for getting value from Semantic Link Labs.
Should I use a PySpark notebook or a Python notebook?
Semantic Link Labs may be used in either a PySpark or a Python notebook in Microsoft Fabric. Generally speaking, most functions can be executed in a Python notebook, which starts up faster, costs less, and is usually the better experience. Some functions (and some parameters within functions) require Spark and therefore a PySpark notebook. For example, run_model_bpa can be executed in a Python notebook; however, if you set the 'export' parameter to True, it uses Spark, so in that case you must run the function in a PySpark notebook. Work is ongoing to increase the share of Semantic Link Labs functionality that can run in a Python notebook (while keeping all existing functionality). As of version 0.9.3, a friendly error message is shown if you attempt to use a function that requires a PySpark notebook in a Python notebook.
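The "friendly error" behavior described above can be sketched as a simple guard that checks whether Spark is available before a Spark-dependent code path runs. This is an illustrative pattern only, assuming a hypothetical helper named `_require_spark`; it is not the library's actual implementation.

```python
import importlib.util

def _require_spark(function_name: str) -> None:
    """Raise a clear, friendly error when Spark is unavailable.

    Hypothetical sketch of the kind of check introduced in 0.9.3;
    not the library's actual code.
    """
    if importlib.util.find_spec("pyspark") is None:
        raise RuntimeError(
            f"'{function_name}' requires Spark. "
            "Please run this function in a PySpark notebook."
        )

# Example: a function that only needs Spark when exporting results,
# mirroring the run_model_bpa 'export' behavior described above.
def run_analysis(export: bool = False) -> str:
    if export:
        _require_spark("run_analysis")
    # ... Spark-free analysis logic would go here ...
    return "analysis complete"
```

The key design point is that the check happens up front, so the user gets one actionable message instead of a deep Spark import traceback.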
Does Semantic Link Labs use the Microsoft APIs?
Yes. Semantic Link Labs leverages various Microsoft Fabric, Power BI, Azure, and Microsoft Graph APIs. Note that not all APIs from each of these services are covered in Semantic Link Labs - only the ones most relevant to its consumers. If you have a request for a new API to be added to Semantic Link Labs, submit an enhancement request.
Is Semantic Link Labs just a wrapper around these APIs?
Although it may seem like the answer could be yes, the answer is no. Semantic Link Labs handles authentication, long-running operations, pagination, and other API intricacies that are not so fun to handle on your own. This makes the APIs much more user-friendly and accessible, even to non-technical folks.
Additionally, Semantic Link Labs has functions that integrate multiple APIs within a single function to create a better user experience. For example, the scan_workspaces function calls 3 separate APIs behind the scenes to get you the result in a much simpler fashion than if you coded it on your own. Other functions make multiple calls in order to return more information from a single function call than a standard call to the API would provide.
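The pagination handling mentioned above follows a common pattern in Microsoft REST APIs: each response returns a page of items plus a continuation token, and the client loops until the token is exhausted. Below is a minimal, generic sketch of that loop (the `fetch_page` function is a stand-in for a real API call, not a Semantic Link Labs function):

```python
def get_all_pages(fetch_page, page_size: int = 100) -> list:
    """Collect every item from a paginated source.

    'fetch_page' is a stand-in for a real API call: it takes a
    continuation token (None on the first call) and a page size,
    and returns (items, next_token), with next_token=None at the end.
    """
    items, token = [], None
    while True:
        page, token = fetch_page(token, page_size)
        items.extend(page)
        if token is None:
            return items

# Toy paginated source: 250 items served 100 at a time.
data = list(range(250))

def fetch_page(token, page_size):
    start = token or 0
    end = min(start + page_size, len(data))
    next_token = end if end < len(data) else None
    return data[start:end], next_token

all_items = get_all_pages(fetch_page)  # gathers all 250 items in 3 calls
```

Wrapping this loop inside the library is exactly what spares users from writing it themselves for each API.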
Why is the create_pqt_file function returning more than 1 file?
Dataflows Gen2 has a limit of 50 tables. If more than 50 objects use Power Query (tables or expressions), create_pqt_file saves multiple Power Query Template files, each containing at most 50 objects.
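The file-splitting logic above amounts to chunking the list of Power Query objects into groups of at most 50. The following is an illustrative sketch of that arithmetic, not the library's actual code:

```python
MAX_OBJECTS_PER_PQT = 50  # Dataflows Gen2 table limit noted above

def split_into_files(object_names: list) -> list:
    """Split Power Query objects (tables/expressions) into groups of
    at most 50, one group per .pqt file. Illustrative sketch only."""
    return [
        object_names[i : i + MAX_OBJECTS_PER_PQT]
        for i in range(0, len(object_names), MAX_OBJECTS_PER_PQT)
    ]

# 120 objects would therefore produce 3 files (50 + 50 + 20).
objects = [f"Table{i}" for i in range(120)]
files = split_into_files(objects)
```

So a model with, say, 120 Power Query objects yields three template files rather than one.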
Does Semantic Link Labs support service principal authentication?
As of version 0.12.0, service principal authentication is supported for all functions whose underlying APIs support it. The service_principal_authentication context manager handles this for functions in both Semantic Link and Semantic Link Labs: any function placed inside the service_principal_authentication context manager will authenticate via service principal if it is supported. See here for examples.
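The context-manager approach described above temporarily swaps the active identity and restores it on exit, even if an error occurs. Here is a generic, self-contained sketch of that pattern using only the standard library; the names (`service_principal_auth`, `_current_identity`) are hypothetical and are not the real Semantic Link Labs API:

```python
from contextlib import contextmanager

# Stand-in for the notebook user's default identity (illustrative only).
_current_identity = "user"

@contextmanager
def service_principal_auth(client_id: str, tenant_id: str):
    """Temporarily run as a service principal, then restore the
    previous identity on exit (hypothetical names, pattern sketch)."""
    global _current_identity
    previous = _current_identity
    _current_identity = f"spn:{client_id}@{tenant_id}"
    try:
        yield
    finally:
        # Restored even if the body raises an exception.
        _current_identity = previous

with service_principal_auth("my-app-id", "my-tenant-id"):
    inside = _current_identity   # calls here run as the service principal
outside = _current_identity      # identity restored after the block
```

The `try/finally` is the important part: everything inside the `with` block uses the service principal, and the original identity is always restored afterward.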
Should I use %pip install or !pip install?
Use %pip install. Using !pip install will cause various issues and should not be used in the context of a Fabric notebook. The only scenario where !pip install is valid is when calling a notebook from a pipeline.
How do I know which functions are available in the released version?
The easiest way is to view the 'stable' version of the documentation on readthedocs. The 'latest' version may show functions which have been developed but not yet released to production.
Similarly, when viewing the source code, the 'main' branch may show functions which have been developed but not yet released (just like the 'latest' version of readthedocs). To view the source code of a particular version of Semantic Link Labs, select the desired tag from the 'branches/tags' dropdown. As an example, this link shows the source code as of version 0.9.10.
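If you want your installed library to match the documentation or source tag you are reading, you can pin the version when installing in the notebook (the version number below is only an example; any released tag works the same way):

```
%pip install semantic-link-labs==0.9.10
```

Without a pin, %pip install semantic-link-labs installs the newest released version, which may be ahead of older docs.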