RE:
Agreed. The literal hours lost changing the format on each individual date field are a nightmare. It should not be so hard to set a file-level preference so that all date fields use the same DESIRED format by default.
RE:
I'm working with PII data in a Microsoft Fabric Lakehouse and need to implement on-the-fly decryption for Power BI reports. My current challenges: I have encrypted data stored in Fabric Lakehouse tables, and I need to decrypt that data dynamically when it is accessed by Power BI reports. I attempted to create a permanent view using a temporary decryption function, but encountered an error. UDFs would be very useful in this scenario.
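The pattern above can be sketched in plain Python. This is a minimal, illustrative stand-in: `encrypt_pii`/`decrypt_pii` and `KEY` are hypothetical names, and the XOR-plus-base64 "cipher" is only a placeholder for a real algorithm (a production setup would use a proper cipher with a key fetched from Azure Key Vault). The comments show where the function would be registered as a Spark SQL UDF and wrapped in a view, assuming a notebook session where `spark` is available:

```python
import base64

# Hypothetical key; in practice this would come from Azure Key Vault.
KEY = b"demo-key"

def _xor(data: bytes, key: bytes) -> bytes:
    # Toy symmetric transform used as a stand-in for a real cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_pii(plaintext: str) -> str:
    """Stand-in encryption: XOR then base64 (illustration only)."""
    return base64.b64encode(_xor(plaintext.encode(), KEY)).decode()

def decrypt_pii(ciphertext: str) -> str:
    """Inverse of encrypt_pii; this is what the Spark UDF would wrap."""
    return _xor(base64.b64decode(ciphertext), KEY).decode()

# In a Fabric notebook you could register the function as a UDF:
#   spark.udf.register("decrypt_pii", decrypt_pii)
# and then expose decrypted values through a view over the table:
#   CREATE VIEW customers_clear AS
#   SELECT id, decrypt_pii(ssn_encrypted) AS ssn FROM customers;
```

The round trip is easy to check locally: `decrypt_pii(encrypt_pii("123-45-6789"))` returns the original string.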
RE:
When is this coming?????????
RE:
I think this is a great idea. Today, workspace access is required in order to connect to a dataflow as a consumer. I wish we could share the dataflow directly with security groups and users, so they could consume the dataflow data in their own semantic models. It would increase reusability.
RE:
I have always lamented the inability to store documentation with a report. Now with Power BI I can create a 'developer notes' page and hide it: invisible to the user, visible to the developer. To that end, if I insert a Text Box into such a page, I want to be able to edit it with the same facility I get in Word or OneNote. Let me paste images, text, etc. Give me keyboard shortcuts to format, indent, bullet, insert a hyperlink, and so on.
RE:
When manually navigating to an individual run/artifact, either from the Monitor list or the "View run history" button directly on a data pipeline, the URL generated in my browser's address bar uses the actual name of the pipeline rather than the GUID-ish pipeline ID. When attempting to share an individual run/artifact with a coworker, or generating a hyperlink in a Teams or Outlook message activity within the pipeline itself, the GUID has to be substituted in for the name, or else the URL does not actually work. It's not a critical issue, but weird and mildly inconvenient.
RE:
Hello, if the activities added to a pipeline are not connected to each other, i.e. they all run in parallel, the editor stacks them all in a single vertical line. After rearranging the activities the way we want to see them and saving, when we reopen the pipeline we find the activities have been automatically "auto aligned" and are stacked in a vertical line again. This is a bug; I am writing here because the Microsoft Support team directed me to this forum. I should be able to place the activities anywhere I want on the pipeline canvas.
RE:
Hello, my idea is a small feature request. Within Spark job definitions it is possible to pass command line arguments to the Spark job. I would like to be able to call my Spark job definition from a pipeline activity and pass command line arguments to it. The arguments field in the Spark job definition activity should be able to contain dynamic content. This would make Spark jobs more flexible and open the possibility of driving them from metadata in a pipeline. Kind regards, Martijn
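To illustrate why dynamic arguments would be useful: the main file of a Spark job definition receives its command line arguments via `sys.argv`, so a pipeline could parameterize the job per run. A minimal sketch, assuming a hypothetical argument shape of source table, target table, and run date (`parse_args` and the argument names are my own invention, not an existing Fabric API):

```python
import sys

def parse_args(argv):
    """Parse positional command line arguments passed to the Spark job.

    Hypothetical expected shape:
        main.py <source_table> <target_table> <run_date>
    """
    if len(argv) < 4:
        raise SystemExit("usage: main.py <source_table> <target_table> <run_date>")
    return {"source": argv[1], "target": argv[2], "run_date": argv[3]}

# Inside the job this would be parse_args(sys.argv); a pipeline activity
# with dynamic content could then supply different values on each run.
example = parse_args(["main.py", "sales_raw", "sales_clean", "2024-06-01"])
print(f"Copying {example['source']} -> {example['target']} for {example['run_date']}")
```

With dynamic content allowed in the activity's arguments field, the same job definition could be reused for many tables, driven entirely by pipeline metadata.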