
    Vote 3

    camilo castelblanco on 5/10/2023 3:23:24 PM

    Power BI

    GPU acceleration

    Hello, everyone


    I don't know if it's just my impression, but I feel that memory consumption increases with each new update of Power BI Desktop. I must confess I don't have a particularly powerful machine, but the whole idea is to make analytics accessible to all kinds of people and companies, not just to large corporations with state-of-the-art computers.

    With that in mind, it would be excellent if Power BI Desktop could manage memory in a customizable way, in the style of the Adobe suite (Photoshop, Illustrator...), where you can determine how much memory should be used at any given time, especially for the GPU. I have had moments where main memory usage exceeds 90% while the GPU sits at only 4%; it would be great if, at times of high demand, the GPU came to the rescue and contributed part of its capacity to relieve the RAM and keep the machine from being saturated.


    So, specifically, my idea is:

    Distribute the processing load between main memory and the GPU, and allow custom allocation of main memory and GPU resources for Power BI processes.

    In particular, the Microsoft Mashup Evaluation Container process is extremely expensive.
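
    As a quick illustration of that last point, here is a minimal sketch (not a Power BI feature) that measures how much RAM the mashup evaluation containers are using during a refresh, using the third-party psutil package. The process-name prefix is an assumption; the exact executable name varies by Power BI Desktop version.

```python
# Minimal sketch: sum the resident memory of the Power Query mashup
# evaluation container processes while Power BI Desktop is refreshing.
# Requires the third-party 'psutil' package; the process-name prefix
# "Microsoft.Mashup.Container" is an assumption and may vary by version.
import psutil

total_mb = 0.0
for proc in psutil.process_iter(["name", "memory_info"]):
    name = proc.info["name"] or ""
    if name.startswith("Microsoft.Mashup.Container"):
        rss_mb = proc.info["memory_info"].rss / (1024 * 1024)
        total_mb += rss_mb
        print(f"{name} (pid {proc.pid}): {rss_mb:,.0f} MB resident")

print(f"Mashup containers total: {total_mb:,.0f} MB")
```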

    New
    Vote 9

    Power BI User on 8/15/2019 11:51:29 PM

    Power BI

    Add Hardware-Accelerated Rendering (GPU-Accelerated Analytics)

    With the advent of NVLink from NVIDIA, and with a few analytics groups starting to leverage GPU resources for database and analytics tasks, this would be useful to start reviewing/including in SQL Server as well as SSRS / Power BI (where applicable). WebGL is currently used in many tools that leverage client-side GPU resources in the browser; this request is more for the server level. There are a few reasons for this, and it makes a lot of sense in SOME cases:

    I. CPU and system RAM performance improvements are currently hitting a wall.
        a. The GPU, on the other hand, is easily 4x more powerful.
            i. The RAM on a GPU tends to be MUCH, MUCH faster than system RAM (albeit more expensive and smaller).
            ii. The CUDA cores in an NVIDIA GPU exist in the thousands, compared to the 32-64 cores an Intel Xeon CPU has. Granted, they are not as robust, but they are finely tuned for analytical data. So that's thousands of computational cores compared to roughly 64.

    II. Speaking of the cores, here are use cases where current testing is showing promise:
        a. Data structures that require aggregations or formulas. Anything geometric or calculation-focused does VERY well on GPU silicon.
            i. Current tests with some of the tools I mentioned above are showing performance increases in the 60-100x range: 40 million records live-queried and rendered in milliseconds.
        b. Any kind of spatial data, maps in particular, which you already touched on. GPUs are built for handling geocoding; by their very nature they plot and calculate distances and relationships very well, and any database using them gets to leverage that tech automatically.

    III. Further evidence that GPU cores are suited to this type of calculation, in my view, is the cryptocurrency activity of a few years ago. If you recall, GPUs were (and still are) being used to do calculations for crypto. That right there is a clear example that they are well suited to any type of summation/calculation/aggregation function.

    The main reason I mention this is that as we and other enterprises scale out our PBI/SSRS/analytics platforms, it would be cheaper to leverage GPU resources, if the tools could take advantage of the tech, instead of throwing more and more traditional CPU/RAM resources at the servers.
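
    As a concrete illustration of the kind of aggregation I mean, here is a minimal sketch comparing the same grouped sum on CPU (NumPy) and on GPU (CuPy). It assumes a CUDA-capable GPU and the optional cupy package, and it is not tied to any Power BI or SQL Server internals; it only illustrates the technique, not the 60-100x figures quoted above.

```python
# Minimal sketch: grouped aggregation on CPU (NumPy) vs GPU (CuPy).
# Assumes a CUDA-capable GPU and the optional 'cupy' package.
import time
import numpy as np
import cupy as cp

n_rows, n_groups = 40_000_000, 1_000           # roughly the scale mentioned above
keys = np.random.randint(0, n_groups, n_rows)  # group key per record
values = np.random.rand(n_rows)                # measure to aggregate

# CPU: sum of 'values' per group key
t0 = time.perf_counter()
cpu_totals = np.bincount(keys, weights=values, minlength=n_groups)
cpu_s = time.perf_counter() - t0

# GPU: same aggregation after transferring the data to device memory
keys_gpu, values_gpu = cp.asarray(keys), cp.asarray(values)
cp.cuda.Device(0).synchronize()
t0 = time.perf_counter()
gpu_totals = cp.bincount(keys_gpu, weights=values_gpu, minlength=n_groups)
cp.cuda.Device(0).synchronize()
gpu_s = time.perf_counter() - t0

print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s  speedup: {cpu_s / gpu_s:.1f}x")
print("results match:", np.allclose(cpu_totals, cp.asnumpy(gpu_totals)))
```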

    Needs Votes
    Vote 7

    Eero Sipinen on 2/19/2019 4:35:17 PM

    Power BI

    GPU acceleration for desktop visuals / calculations.

    Hi all! Anyone familiar with heavy spreadsheets knows that Excel (and Power BI) calculation and load times can bog down any project that hasn't been optimized. I mostly do heavy calculation in R, so I don't really have that many issues with overcomplicated Excel spreadsheets. Nonetheless, I find it odd that neither Excel nor Power BI (that I know of) supports GPU calculations (in addition to multiple CPU cores). I would love some way to speed up Power BI's calculations and visuals. (If the devs could get the Excel devs to do the same for Excel calculations, EVERYONE would be thrilled.) If this is too daunting a task, at least publish some hardware scaling specs so I can recommend hardware scaling to IT. (I haven't found any threads about hardware scaling.)
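
    One workaround today, while neither GPU nor multi-core calculation is exposed in the engine, is to pre-compute the heavy part outside the report and let Power BI import only the small result. The sketch below does that with pandas across all CPU cores; the file path and column names are illustrative assumptions, not anything Power BI requires.

```python
# Minimal sketch of a pre-computation workaround: run the expensive
# per-row formula across all CPU cores ahead of time, then let Power BI
# import only the aggregated output. Paths and column names are made up.
from concurrent.futures import ProcessPoolExecutor
import pandas as pd

SOURCE = "C:/data/transactions.csv"     # hypothetical large input
OUTPUT = "C:/data/customer_margin.csv"  # small result Power BI imports

def aggregate(chunk: pd.DataFrame) -> pd.DataFrame:
    # The expensive per-row formula, reduced to one row per customer.
    chunk["margin"] = chunk["revenue"] - chunk["cost"]
    return chunk.groupby("customer_id", as_index=False)["margin"].sum()

if __name__ == "__main__":
    chunks = pd.read_csv(SOURCE, chunksize=1_000_000)
    with ProcessPoolExecutor() as pool:  # one worker per CPU core by default
        partials = list(pool.map(aggregate, chunks))
    result = pd.concat(partials).groupby("customer_id", as_index=False).sum()
    result.to_csv(OUTPUT, index=False)   # then Get Data > Text/CSV in Power BI
```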

    Needs Votes
    Vote 8

    Stefan Endl on 9/28/2023 6:02:30 AM

    Synapse

    MS Fabric GPU / CUDA support

    For deep learning tasks it would be great to have an easy way to use CUDA devices in MS Fabric notebooks.
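
    A minimal sketch of the kind of notebook cell I have in mind, assuming PyTorch is installed in the Fabric notebook environment; whether torch.cuda.is_available() returns True depends on the capacity actually exposing a CUDA device, which is exactly what this idea is about.

```python
# Minimal sketch: pick the CUDA device when the Fabric notebook environment
# exposes one, and fall back to CPU otherwise. Assumes PyTorch is installed.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Training on:", device)
if device.type == "cuda":
    print("GPU:", torch.cuda.get_device_name(0))

# Tensor work then lands on the GPU whenever one is present.
x = torch.randn(1024, 1024, device=device)
y = x @ x.T  # runs on the CUDA device if available, otherwise on the CPU
```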

    Needs Votes