RE:
If this is considered, then it should be included in a closing checklist, which is also a very common requirement.
RE:
It is indeed not user friendly to leave the currency code blank for local currency transactions. It is a very regular request to have the currency code displayed regardless of whether it is the local currency or a foreign one. Especially for organizations working in multiple currencies across different companies, leaving the currency code blank can be confusing for reporting/BI purposes.
RE:
A back-dated report is a common auditor request. The problem stems from the Invoice Register and Approval journal function that is commonly used by ISVs like ExFlow.
RE:
You can use transfer orders to process inventory transfers between warehouses. In India, if the shipping and receiving branches of the organization have different tax registration numbers, the India Goods and Services Tax (GST) should be calculated and posted for the transfer order. The tax base may be defined as the current cost price of the item being transferred or a special transfer price. The tax amount should be posted as GST payable for the transfer shipment and as GST recoverable upon the transfer receipt. An Interim transit account is used as an offset account for the posting and is nullified when the transfer order is fully received. The Stock transfer functionality that is available for India supports this process.
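To illustrate the posting flow with made-up figures (the transfer price, GST rate, account labels, and helper function below are assumptions for illustration, not D365 Finance code): for a tax base of 10,000 INR at 18% GST, 1,800 INR is posted as GST payable at shipment and 1,800 INR as GST recoverable at receipt, so the interim transit account nets to zero once the order is fully received.

# Illustrative sketch only: figures and account labels are hypothetical.
def stock_transfer_gst_postings(tax_base, gst_rate):
    gst_amount = tax_base * gst_rate
    # Transfer shipment: GST payable, offset against the interim transit account.
    shipment = [("GST payable", -gst_amount), ("Interim transit account", +gst_amount)]
    # Transfer receipt: GST recoverable, nullifying the interim transit account.
    receipt = [("GST recoverable", +gst_amount), ("Interim transit account", -gst_amount)]
    return shipment, receipt

shipment, receipt = stock_transfer_gst_postings(tax_base=10_000.0, gst_rate=0.18)
# gst_amount = 1,800.00; the interim transit account balance nets to zero after receipt.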
RE:
It's been a year! Any progress?
RE:
I second, third, and fourth all the complaints in this idea post. I have a whole infrastructure with column names that contain special characters such as spaces, alongside existing underscores that are distinct from those spaces, and we would have to rename all of our columns because Fabric has not enabled this simple feature that already works in Databricks.
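For reference, a minimal sketch of what already works on Databricks, assuming an active SparkSession in a notebook (the DataFrame, table name, and column names are made up for illustration): enabling Delta column mapping mode 'name' lets the table keep column names containing spaces.

# Hypothetical example: column names with spaces, preserved via Delta column mapping.
df = spark.createDataFrame([(1, "Contoso")], ["Customer Id", "Customer Name"])

df.write.format("delta") \
    .option("delta.columnMapping.mode", "name") \
    .mode("overwrite") \
    .saveAsTable("demo_special_columns")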
RE:
Please provide support for loading a password-protected Excel file into Power BI.
RE:
It has been 8 years since this topic was brought up. How hard of a fix would this really be?
RE:
Please either allow this through a SQL connection string:

import com.microsoft.spark.sqlanalytics
from com.microsoft.spark.sqlanalytics.Constants import Constants

df.write \
    .option(Constants.SERVER, "") \
    .mode("overwrite") \
    .synapsesql("")

or directly using the saveAsTable method:

df.write.format("delta") \
    .option('delta.columnMapping.mode', 'name') \
    .mode("overwrite") \
    .saveAsTable("TABLE_NAME", path="abfss://WORKSPACE_NAME@onelake.dfs.fabric.microsoft.com/WAREHOUSE_NAME/dbo/Tables/TABLE_NAME")
RE:
Even the internal Delta tables created using Fabric notebooks in the lakehouse do not appear in the SQL endpoint:

df.write.format("delta") \
    .option('delta.columnMapping.mode', 'name') \
    .saveAsTable("TABLE_NAME")

Can you please share an ETA for when we can expect this feature to be available? This poses severe limitations on integrating data from different sources.