Tag: Administration

  • Overcoming Challenges in the Center of Excellence

    Overcoming Challenges in the Center of Excellence

Starting a center of excellence (COE) can feel daunting, and political challenges are common. This article explores the challenges of a COE and offers some recommendations for handling them.

    The Importance of Attention to Detail

Microsoft does a great job of outlining the key aspects of a COE. For more details on this topic, check out the Fabric adoption roadmap found here. A summary of those items is in the list below:

I strongly feel that documenting the result of these conversations is a huge win. The documentation can be used to show leadership that you have a solid plan. Discussing these topics pushes towards a healthy data culture. Lastly, when you bring documentation to leadership, you show that you have thought through the aspects that drive success.

    Foundational Attributes for Success

The optics of the COE matter. COE performance and leadership are crucial, as they can impact the entire organization. Don’t underestimate the value of setting clear goals. Taking time to identify pain points in your current organizational structure helps with the planning process for the COE.

    • Setting clear goals
• Addressing the pain points you see, and planning how to solve them
• Just starting: don’t worry about making the COE perfect, and plan for adjustments

Sometimes I feel that people try to over-plan. Read up on the best practices in Microsoft’s documentation, write down your decisions, then get moving! I have observed that simply communicating and developing the plan creates momentum. Bear in mind it won’t be perfect in the first iteration. Plan on being flexible and adjusting the COE to the organization’s needs.

    Recommendations for Overcoming Challenges

• Attention to Detail: Pay attention to the aspects of the COE’s performance that you can control. Engage leadership so they support the development of the COE. Remember, the COE is a vote in the direction of a better data culture for your company.
    • Setting Clear Goals: Defining clear goals helps the team align towards a unified direction. Address pain points that could derail or distract from the creation of the COE. Connect the success of the COE to Objectives and Key Results (OKRs) outlined by the leadership team.
• Regular Communication with Executives: Regular communication with the executive team helps remove misaligned expectations. When you win, let leadership know so they can promote your success. Success means more buy-in from the company.
• Feedback: Gather feedback and pivot. Have empathy for the process and be willing to adjust. If something is not working within the COE, try something new. Ask others involved in the COE for recommendations; some of the smartest people are the ones you already work with.

For more thoughts on the COE and overcoming these challenges, check out our episode on the Explicit Measures podcast.

  • Does a Power BI developer need Python?

    Does a Power BI developer need Python?

In this episode of the Explicit Measures Podcast, we discuss the value of adding Python knowledge. It is well known that more organizations are moving to cloud platforms. Because of this, we should ask the following questions:

    • What will be the needed skills for Power BI professionals?
    • Is this for the Business or for a Data Engineer?

    We invite you to jump in and listen to the podcast as we discuss these topics. Enjoy!

    YouTube Explicit Measures Podcast

    Listen on the go

Don’t want to watch the video? No problem, we are on all major podcast platforms.
    Spotify
    Apple Podcasts
iHeartRadio

Ask ChatGPT

As in all our episodes, we like to end with what ChatGPT has to say. Our question for this episode is:

    Should I learn Python for Power BI development?

ChatGPT’s response

The response was quite interesting, discussing the various factors that make Python valuable. Overall, we gave ChatGPT a better-than-average score of about 6.5 out of 7.

Learning Python can be beneficial for Power BI development. Python is a versatile programming language that is widely used in data analysis, data manipulation, and data visualization tasks. While Power BI has its own built-in capabilities for data processing and visualization, Python can extend its functionality and provide additional flexibility.

    Here are a few reasons why learning Python for Power BI development can be advantageous:

    Data Manipulation

Python has a rich ecosystem of libraries, such as Pandas, NumPy, and scikit-learn, that are excellent for data manipulation, cleaning, and transformation. You can use Python to preprocess and shape your data before loading it into Power BI.
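
For example, a minimal sketch of that kind of preprocessing might look like the following (the file and column names are hypothetical placeholders):

# A minimal sketch of pre-Power BI cleanup with Pandas.
# File and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("sales_raw.csv")

# Drop fully empty rows and normalize the column names.
df = df.dropna(how="all")
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

# Parse dates and fill missing amounts before modeling.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["amount"] = df["amount"].fillna(0)

# Save a clean file that Power BI can import.
df.to_csv("sales_clean.csv", index=False)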

    Advanced Analytics

Python offers a range of libraries for advanced analytics and machine learning, such as TensorFlow, Keras, and PyTorch. By integrating Python code into Power BI, you can leverage these libraries to perform complex calculations, build predictive models, or create custom machine learning models to enhance your data analysis capabilities.
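
As a hedged sketch of that idea (the files and column names are made up for illustration), a simple predictive step with scikit-learn could look like this before the scored output is loaded into Power BI:

# Sketch: train a regression on historical data and score new rows,
# then land the predictions where Power BI can pick them up.
# File and column names ("ad_spend", "revenue") are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

history = pd.read_csv("history.csv")
model = LinearRegression()
model.fit(history[["ad_spend"]], history["revenue"])

new_rows = pd.read_csv("new_periods.csv")
new_rows["predicted_revenue"] = model.predict(new_rows[["ad_spend"]])
new_rows.to_csv("predictions_for_powerbi.csv", index=False)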

    Custom Visualizations

Power BI provides a variety of built-in visualizations, but if you have specific visualization requirements or want to create custom charts, Python can be a powerful tool for visual development. Libraries like Matplotlib and Seaborn offer extensive options for data visualization, enabling you to generate interactive and customized charts and graphs to present your insights effectively.
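
For context, a Python visual in Power BI hands the fields you add to it to your script as a pandas DataFrame named dataset. A minimal Matplotlib sketch, with hypothetical field names, could look like this:

# Script for a Power BI Python visual. Power BI exposes the selected
# fields as a pandas DataFrame named "dataset"; the field names here
# ("category", "sales") are hypothetical.
import matplotlib.pyplot as plt

grouped = dataset.groupby("category")["sales"].sum().sort_values()

fig, ax = plt.subplots()
grouped.plot.barh(ax=ax)      # horizontal bars read well for categories
ax.set_xlabel("Total sales")
ax.set_title("Sales by category")
plt.show()                    # Power BI renders the current figure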

    Automation and Integration

Python can help automate repetitive tasks and streamline your Power BI workflows. You can write Python scripts to extract data from different sources, transform it, and load it into Power BI automatically. Python’s integration capabilities enable connections with other tools and platforms, such as web APIs, databases, or cloud services, expanding the scope of your data analysis.
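
As one illustrative sketch (the URL and file name are placeholders, not a real endpoint), a small script could pull data from a web API and land it somewhere Power BI or a dataflow can ingest:

# Sketch of an automated data-prep job: pull JSON from a web API and
# write it out as a CSV for Power BI. The URL is a placeholder.
import pandas as pd
import requests

response = requests.get("https://example.com/api/orders", timeout=30)
response.raise_for_status()

df = pd.json_normalize(response.json())
df.to_csv("orders_for_powerbi.csv", index=False)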

    Community Support

Python has a large and active community, with abundant resources, tutorials, and forums available. Learning Python gives you access to a wealth of knowledge and expertise, and a large community makes it easier to find solutions to specific challenges.

In conclusion, while it is not necessary to learn Python for Power BI development, it can certainly enhance your capabilities and offer more flexibility in data manipulation, advanced analytics, custom visualizations, and automation. Consider your specific requirements and goals; your project requirements may determine whether investing time in learning Python would be beneficial.

ChatGPT – May 12 Version

Not bad, ChatGPT. I would disagree with a couple of points made about how easy it is to integrate Python into Power BI. There are two options I am aware of today for using Python in Power BI:

    Option 1 – Using Python in Power Query

    Using Python in Power Query from Microsoft Learn
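
For a quick sense of what this looks like: when you add a Run Python script step in Power Query, Power BI passes the current query’s table to your script as a pandas DataFrame named dataset, and any DataFrame you create becomes a table you can select in the next step. A tiny sketch, with a hypothetical column name:

# Runs inside Power Query's "Run Python script" step. Power BI supplies
# the current table as a pandas DataFrame named "dataset"; the "amount"
# column is a hypothetical example.
import pandas as pd

cleaned = dataset.drop_duplicates()
cleaned["amount"] = pd.to_numeric(cleaned["amount"], errors="coerce")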

    Option 2 – Using Python for Visuals

    Using Python to develop a Visual

I feel that more Python is used in cloud services than in Power BI itself. This was a major point brought up by Seth in the podcast, and a very valid one.

    Overall, we hope you enjoyed the podcast! See you next time.

  • Power BI is part of the greater data solution

    Power BI is part of the greater data solution

    Power BI is a powerful reporting tool that has been dominating the market and rapidly evolving. Yet, in many organizations people seem unaware of its true potential or core purpose. As a result, too often it is deployed to simply extract or visualize data points in an ad hoc reporting manner.

    Power BI is greater than a report

Power BI should not be thought of as a product separate from ETL, AI/ML, or overall data strategy. Rather, organizations need to include it as part of a data culture, with all of the products working in unison.

To deploy Power BI successfully, do not use it simply to design reports. Instead, design a culture and architecture that allow business users to understand, interpret, and react to rich, powerful, data-driven insights.

    The many additional products, services and capabilities that come packaged in Power BI are too frequently overlooked. As a result, people see only the top level – visuals in reports and dashboards. But there is a whole host of rich and exciting features below the surface.

    With that, here are some common mistakes I have frequently seen new users make when rolling out Power BI.

Mistakes that underutilize Power BI

• Using it for Data extraction
  Large tables with a selection of filters that you may or may not export. Instead, Power BI is designed for trends, insights, and slicing and dicing across the data. Large tables and data dumps do not give insight.
    • Using it for a data visualization to tell a single point
      Design a visual that can convey information quickly, rather than an infographic type solution. If you are looking for that pixel perfect data visualization for a news story that tells a specific point, there may be other options. Paginated reports or predesigned Excel documents are viable options. Design data pipelines that are regularly updated. Create visuals that are designed to be interactive. This will help users drill down and find insights.
    • Ad hoc only reporting
      While this can be a great tool for ad hoc reports, you may be underutilizing and doing extra work. Instead, build reusable data models that are designed for multiple reports. Write DAX business logic and KPI that can serve as a single source of truth. Be sure to document your measures inside the data models. By clearly documenting measures data consumers will understand how to use the data model to build new reports.
    • Current reporting tool / Excel replacement
      A common request is to “lift and shift” all excel reporting into Power BI. These products are different and have different uses. If you are moving to Power BI, don’t try and recreate old solutions. Instead, a better approach is to design new reports that play to Power BI’s strengths. Utilize the rich features and powerful engines that make Power BI beneficial. This is a story of it’s better together. Using just Power BI or just Excel has it’s advantages and dis-advantages. Conversely, using both Power BI and Excel can play to each tool’s strength.
    • Not building a data culture
      Matthew Roche has an amazing blog series on building a data culture with why and how to do this. Building a good data culture is vital for adoption within the organization. The data culture will start with an Executive sponsor who can push for adoption. So, first and foremost, be sure to have a leader who believes in your vision.

    Mistakes made when deploying Power BI solutions

    • Focusing on raw numbers, not business insights
      Instead of simply displaying numbers, great reports often have the following KPI, trends, drill down, interactivity and slicing capabilities. This allows business users to gain meaning information about the direction for the business.
    • Ignoring the deployment approaches
      Many business users are familiar with a typical process for reports; a user submits a ticket to IT. IT writes a bunch of SQL queries to get the data for this request. They then surface the data in tables and simple graphs. In contrast, Power BI does a great job at breaking down this long turnaround and getting the data in users hands quick. An organization should deploy a top-down, blended or bottom-up approach. As a result of utilizing this approach, they can merge the business and IT side of operations and remove silos.
    • Failing to Think like the Business and Act Like I.T.
      The I.T. organization has many strengths related to how to make data available quick and reliably. Power BI is mainly designed for business users. Thus, Power BI has features that borrow from best practices from I.T. One such best practice is the use of Deployment Pipelines.
    • Not utilizing Data Models or ignoring self-service reporting
      Data models, as described in this blog by Matt Allington, contain all the metadata needed for reporting. This includes the business logic and data transformations. However, creating and maintaining these can be time consuming. Instead, it is possible to reuse data models and keep one source of the truth for many reports. The modeling experts can own and maintain the models. Furthermore, business users can connect and build their own Power BI reports utilizing the models. This is done without even needing to write a single line of code.
    • Treating Power BI as a stand alone product, not part of the greater data or AI solution
      You should not treat Power BI should as just a visualization tool (read this blog by Gil Raviv). Instead, Power BI is a business insights tool, a way to serve and communicate the information within the organization. In addition ML and predictive analytics are baked into it, as are ETL processes, data storage and security. As a result a unified approach to a data culture should be built. Users from all business areas need to be aware of the strategy.

    Using Power BI the right way

Power BI should be unified with, and part of, the entire data platform – not a visualization layer on top of it. A modern data platform typically has 4 steps:

    • Load and Ingest – extract the data out of the source system and transform it.
    • Store – Land this data somewhere so we can run analysis on it.
    • Process (or transform) – Run analytics on your data and draw out KPIs, AI and predictions.
• Serve – present this data in an easily consumable way for stakeholders.

Power BI can be all of these steps. In a single report, you can use Power Query (Load and Ingest) to import data (Store). Next, you can build a model and DAX measures (Process). Lastly, you can surface the data in visuals on the report pages (Serve).

This can also scale into a more enterprise-level solution. First, dataflows are set up to extract and transform data from many sources (Load and Ingest), and you can back up and store the results in Azure Data Lake Storage Gen2 (Store). Second, the data can take advantage of automated ML (AutoML) and cognitive services, and you can build DAX expressions over it, combining the powerful DAX language with the power of AI (Process). Last, you can package these as reports, dashboards, or apps, or embed them into other applications (Serve).

Alternatively, Power BI doesn’t have to be all these steps. A traditional data platform architecture is described by Microsoft in the picture below. You can utilize other tools, such as Data Factory, to Load and Ingest data, use Databricks to Process/Transform the data, and let Power BI and Analysis Services models Serve the data to the end user.
This is a great example of Power BI fitting into a greater data solution. However, you should implement the deployment with the entire solution in mind. Power BI is not a tool for simply creating visuals. A good deployment is deeply rooted in the culture, and each step must consider the others in the pipeline, not sit in silos.

    Source: Microsoft

    Bonus: See this great diagram by Melissa Coates, showing Power BI end to end features.

    Azure Synapse

Microsoft is expanding this ecosystem with Azure Synapse. As they roll it out, they are designing data engineering as a single platform that combines this entire pipeline and its tools into a unified experience, with Power BI being a part of this platform.

    Source: Microsoft

    Synapse provides Consistent Security

When we think about user-level security, Azure Active Directory (AAD) is the gold standard for access and security in organizations. Synapse leverages this technology to remove friction between different Azure components. You can leverage AAD across the multiple services: Data Factory, data lakes, SQL and Spark compute, as well as Power BI.
The experience of governing data on a user-by-user basis improves with the Synapse experience.

    A Low Code Data Engineering Solution

There are many Azure components you can use to produce a well-engineered data pipeline, and Azure Synapse brings all these tools under the same portal experience. For example, you can use Azure Data Factory to write data into a data lake, then pick up the data and query the flat files with compute engines such as SQL or Spark. Azure Data Factory also has built-in features, such as mapping dataflows, that can simplify data lake creation and management.

    More Computing Options

No longer do we have to choose just SQL or Spark; rather, we have options. We can use provisioned SQL, which was previously Azure SQL Data Warehouse, and Synapse now also offers on-demand SQL and Spark compute engines. This is where we are really seeing the technology move to a separation of the storage layer from the compute layer: Azure Data Lake Storage Gen2 serves as storage, while SQL and Spark serve as compute.
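
As a rough sketch of that separation (the storage path and column name are placeholders), a Synapse Spark notebook can read files straight out of the lake while the Spark pool supplies the compute:

# Sketch: storage (the data lake) is separate from compute (Spark).
# In a Synapse notebook a SparkSession named "spark" is pre-created;
# getOrCreate() reuses it, or builds one if run elsewhere.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The abfss path is a placeholder for your own account and container.
path = "abfss://data@yourlake.dfs.core.windows.net/sales/*.parquet"

df = spark.read.parquet(path)
df.groupBy("region").count().show()  # "region" is a hypothetical column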

    One Place for all information

Whether it is Azure Data Factory, Spark, SQL, or Power BI, Synapse has become the single portal for integrating all these services. This simplifies the experience and the management of all your data pipelines.

    If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • Buy and Apply Power BI Premium P license

    Buy and Apply Power BI Premium P license

I am working on a project that uses Power BI Embedded to display reports to external users via an application. I’ve used the progression of A skus (the embedded license via Azure) to support the various reports. I love using the A sku for several reasons: it has a low point of entry with the A1, it is easy to scale up to the higher tiers as I need to, it has the ability to pause a capacity at any time, and I enjoy the pay-by-the-hour flexibility the license provides. However, I just got to the point where one of our capacities is about to exceed the 10GB of RAM I get on the A3. As a result, I started to compare the A4 sku to the P1 sku. They are the same in terms of cores and RAM (8 cores / 25GB), but the P1 has an option to be cheaper.

After researching how to buy and apply the Premium P license, I realized there wasn’t an end-to-end explanation of what to expect and how to apply the P sku to my specific region. This is hugely important for some of the Service features to work correctly. When committing to large sums of money, I find it’s always nice to have these answers up front, so I hope the following walkthrough helps those decision makers out there.

    Analyze the A sku

Before we jump into the P sku, let’s take a quick moment to see how an A sku is purchased in Azure. There is documentation out there that explains how to sign up for Premium, including the A sku in Azure; it can be found here ( https://docs.microsoft.com/en-us/power-bi/admin/service-admin-premium-purchase). However, I want to highlight the two areas that most interested me, which I couldn’t find answers to when trying to commit to buying the P sku. The first is the location of the capacity (region). The second is who gets assigned as a capacity administrator. When you purchase the A sku, those are front and center. As a result, the license purchase is an easy process because I select them prior to committing any money.

    Purchase the P sku

Unlike the A sku, you purchase the P sku in the Office 365 admin center. The glaring difference from the A sku experience is that you purchase the P1 license without any configuration… This can cause a bit of heartburn if you need to ensure that the capacity is applied to the right region upon purchase. For the moment, you can just assume things will come out smelling like roses and move on to the steps to purchase the P1 license. In the O365 Admin Center, under Billing, you will Select Purchase Services and then Search for Power BI. This pulls up the list of licenses you can choose from, and you are going to Select the Power BI Premium P license.

    Selecting the license presents you with the options for payment type.

Here is where we see the much cheaper price of $4,995.00, but it comes with a yearly commitment. (As a side note, I really wish we had the yearly commitment option with the A sku; with that option available, I wouldn’t even have to muck around with the P sku for my implementation.)

    After you complete the purchase process you can navigate to Purchase Services again and see that the Power BI Premium P license is now active.

    Assign and Configure on Setup

    Now what?

Well, all you savvy Power BI Admins, we head over to the Power BI Service of course!

Log in to your Service (app.powerbi.com), and because you’re all Global Administrators or Power BI Administrators, you have access to the Admin Portal. For you first-timers, that would be under

    Settings > Admin Portal

    (If you do not see the admin portal you will need to contact your IT or Security guys to grant you the Power BI Administrator Role.)

    Now normally when you go into the portal you would see a page that looks like this under Capacity settings.

But after you purchase the license in O365 and come back to the Power BI Service, you will see this the first time you log in.

Click on the Set up new capacity button and you get to the screen that I, and all of you, wanted to see from the start: the place where you add any additional capacity administrators and choose which region you want to use for your capacity.

    As the gray box outlines for you, the initial region is the home region of the Power BI tenant, but Clicking on the Region pops up all the other regions you can choose from.

    Make your selections. All that remains to be done is Clicking on the Set up button. Now your capacity is provisioned based on your configuration settings.

    Your new view when you log in to work with your capacity looks like this. You’ll be back often to monitor and alter any capacity configurations.

    Apply Capacity to Workspace

    I’ll close this out by showing you the final step of how you apply that new capacity to your workspaces.

    Jump back out into the home screen by Clicking Power BI in the upper left hand corner of the browser.

    Click on Workspaces and hover over the workspace that you want to add to the capacity.

    Click on the ellipses (3 dots) that appear to the far right and Select Workspace settings.

The Settings dialogue will appear on the right-hand side, and you will Click on Premium in the header. Give the application a moment, and you will be able to toggle the Dedicated capacity to On.

    In the dialogue, Select the newly provisioned capacity and Click on Save.

    You will now see a little diamond next to your workspace name.

    After all that, you now have a Premium P license capacity supporting the datasets and reports in that workspace.
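
As an aside, if you ever want to script that last assignment step instead of clicking through the UI, the Power BI REST API exposes an AssignToCapacity call. Here is a hedged sketch; the GUIDs are placeholders, and acquiring the AAD access token is left out:

# Sketch: assign a workspace to a capacity via the Power BI REST API.
# The GUIDs are placeholders, and "token" must be a valid AAD access
# token for a user with capacity-assignment permissions.
import requests

token = "<your-aad-access-token>"
workspace_id = "<workspace-guid>"
capacity_id = "<capacity-guid>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/AssignToCapacity",
    headers={"Authorization": f"Bearer {token}"},
    json={"capacityId": capacity_id},
    timeout=30,
)
resp.raise_for_status()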

    Wrap Up

I had too many unresolved questions during this process. I was looking for something like this blog to assure me I was headed in the right direction. Since I didn’t find it, I decided to write up my experience. I want to make sure others with those same questions can see what it looks like to go through the process end to end. Hopefully this perspective helps when making the leap into Power BI Premium P1 licensing.

    If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel, and follow us on Twitter where we will post all the announcements for new tutorials and content. Alternatively, you can catch us on LinkedIn (Seth) LinkedIn (Mike) where we will post all the announcements for new tutorials and content.

    As always, you’ll find the coolest PowerBI.tips SWAG in our store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • Milwaukee Brew City PUG – April 2020

    Milwaukee Brew City PUG – April 2020

The Milwaukee Brew City PUG for April kicks off with some quick updates and highlights of upcoming events. We spend a quick minute on why we’re so excited about the fantastic lineup of webinars we were able to do recently, which highlight the top 3rd-party tools to get the most out of your Power BI experience. We were excited to welcome a brand-new Microsoft MVP, Chris Wagner, to speak to us at this April PUG. He walks through How to Build a World Class Center of Excellence for Power BI. We also had some great conversation around a myriad of topics in the latter half of the meeting, so be sure to stick around to catch some of that at the end.

    Enjoy the Meeting!
