drigloff mefistoguru on 03 Feb 2018 03:08:00
AI for assessing the quality of a query and suggesting optimizations before the query is executed against the full data set.
Power Query is positioned as a tool accessible to non-professionals. In the hands of insufficiently trained users, however, this tool can lead to serious business mistakes.
While developing a query we make mistakes and oversights, as a result of which:
- entire queries may be left unused and superfluous;
- individual steps may be superfluous (sorting, type conversions, steps whose output is never used later, etc.);
- individual columns may be superfluous, carried through the entire query and only removed at the end;
- when tables are merged, the join may be set up incorrectly (missing part of the data or excessively duplicating it);
- in the list of steps it is not obvious what changes each step makes to the data.
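To make the "superfluous steps" point concrete, here is a minimal sketch of how dead steps could be detected automatically. It assumes a query modeled as a list of hypothetical `Step` records (these names are illustrative, not real Power Query internals), each naming the earlier steps whose output it consumes; any step the final result does not depend on, directly or transitively, is flagged.

```python
# Sketch: flag query steps that the final result never uses.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    depends_on: list = field(default_factory=list)

def find_dead_steps(steps, final_step):
    """Return names of steps the final step does not depend on."""
    by_name = {s.name: s for s in steps}
    alive = set()
    stack = [final_step]
    while stack:
        name = stack.pop()
        if name in alive:
            continue
        alive.add(name)
        stack.extend(by_name[name].depends_on)
    # Preserve original step order in the report.
    return [s.name for s in steps if s.name not in alive]

query = [
    Step("Source"),
    Step("SortedRows", ["Source"]),      # a sort whose result is never used
    Step("FilteredRows", ["Source"]),
    Step("Result", ["FilteredRows"]),
]
print(find_dead_steps(query, "Result"))  # -> ['SortedRows']
```

A real implementation would have to parse the M code of each step to extract these dependencies, but the reachability check itself stays this simple.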
We need a system that, given the current query, the source data, and the final result (in the preview), offers optimized code, or optimization options for individual blocks together with an explanation.
Also, to improve performance, we need a hint at each step showing how long that step would take on the full data set. This time should be shown in the step list: the step's own time and the cumulative time from the start through that step. With such a tool, we could compare the speed of query variants without launching a long run over the entire data set.
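The timing hint described above can be sketched in a few lines: each step is timed as it runs, and both the step's own time and the cumulative time from the start are recorded, exactly the two columns proposed for the step list. The step names and functions here are made up for illustration.

```python
# Sketch: per-step and cumulative timing for a pipeline of transformations.
import time

def profile_steps(data, steps):
    """Run (name, fn) steps in order; return [(name, step_s, cumulative_s)]."""
    report, cumulative = [], 0.0
    for name, fn in steps:
        start = time.perf_counter()
        data = fn(data)
        elapsed = time.perf_counter() - start
        cumulative += elapsed
        report.append((name, elapsed, cumulative))
    return report

steps = [
    ("FilteredRows", lambda rows: [r for r in rows if r % 2 == 0]),
    ("SortedRows",   lambda rows: sorted(rows, reverse=True)),
]
for name, step_s, total_s in profile_steps(list(range(100_000)), steps):
    print(f"{name}: {step_s * 1000:.1f} ms (cumulative {total_s * 1000:.1f} ms)")
```

Estimating these times for the *full* data set without running it, as the post asks, would additionally require extrapolating from a sample, but the reporting format would stay the same.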
To the right of the merge dialog, detailed information should be displayed about the unique values of the fields selected for the join: which ones are matched and which are not covered. It should be possible to select a value that has no match and see the corresponding rows of its table at the bottom of the screen.
To track changes, we need a panel (the new one at the top would be just the right place for it) that visually shows the data workflow. Tableau already does this in its Maestro project, as does EasyMorph.
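The merge diagnostics proposed here amount to comparing the key sets on both sides of the join. A minimal sketch, with made-up table and column names: it reports which key values match, which left-side values have no match, and which right-side values would be silently dropped by an inner join.

```python
# Sketch: coverage report for a planned join key.
def join_coverage(left_rows, right_rows, key):
    left_keys = {row[key] for row in left_rows}
    right_keys = {row[key] for row in right_rows}
    return {
        "matched": left_keys & right_keys,
        "left_only": left_keys - right_keys,    # left rows left unmatched
        "right_only": right_keys - left_keys,   # right rows an inner join drops
    }

orders    = [{"customer_id": 1}, {"customer_id": 2}, {"customer_id": 4}]
customers = [{"customer_id": 1}, {"customer_id": 2}, {"customer_id": 3}]
cov = join_coverage(orders, customers, "customer_id")
print(cov["matched"], cov["left_only"], cov["right_only"])
```

Selecting an item from `left_only` or `right_only` and filtering its source table on that value would give exactly the drill-down view the post asks for.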