Posts

Showing posts with the label Azure

Tired of Azure Functions? Switch to Azure Data Mapper for Logic Apps transforms

Azure Data Mapper is a very convincing way of transforming your data schemas, with little or no coding required, and it is easy to use:
a. Drag-and-drop facilities
b. A rich built-in library containing a good number of assisting functions
c. You can manually test the mapping and preview the data
d. Ability to debug
e. Ability to use the maps in API Management services and Logic Apps very easily
f. An overview pane to review whether your mappings are correct
A lot of emphasis has been laid on transforming myriad file types:
a. XML to XML
b. JSON to JSON
c. XML to JSON and vice versa
However, yes, there are limitations too. As of now:
Data Mapper currently works only in Visual Studio Code running on Windows operating systems.
Data Mapper is currently available only in Visual Studio Code, not the Azure portal, and only from within Standard logic app projects, not Consumption logic app projects.
Data Mapper currently doesn't support comma-separated values (.csv) files. ...

Cool hack to transform data in Azure Logic Apps, using simple expressions (and no code)

Amigos, here goes a very cool hack to handle an incoming payload in your Azure Logic Apps that can produce any type of complex transformation using just expressions and variables -- nothing else. I have covered how to transform using Data Mapper in a previous blog (https://community.dynamics.com/blogs/post/?postid=d7eda435-952b-ef11-840a-6045bddad7eb), but this one is even easier. Check it out:
Background: We have an incoming purchase order payload that, rather cumbersomely, looks like this:
{
  "OrderNum": "PO000111",
  "VendorCode": "v0001",
  "NoOfLines": 3,
  "Currency": "euro",
  "PaymentTerms": "14d",
  "OrderStatus": "Received",
  "Invoiced": "No",
  "Lines": [
    {
      "ItemId": "I0001",
      "Qty": 12,
      "Price": 1000,
      "LineNum": 1
    },
    {
      "ItemId": ...
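As a rough, hedged sketch of what such an expression-only transform can look like inside the workflow definition (the target field names ItemNumber, Quantity, LineAmount, PurchaseOrder and Vendor are illustrative choices, not taken from the post), a Select action can reshape each entry in Lines and a Compose action can assemble the final document:

```json
{
  "Select_order_lines": {
    "type": "Select",
    "inputs": {
      "from": "@triggerBody()?['Lines']",
      "select": {
        "ItemNumber": "@item()?['ItemId']",
        "Quantity": "@item()?['Qty']",
        "LineAmount": "@mul(item()?['Qty'], item()?['Price'])"
      }
    },
    "runAfter": {}
  },
  "Compose_target_order": {
    "type": "Compose",
    "inputs": {
      "PurchaseOrder": "@triggerBody()?['OrderNum']",
      "Vendor": "@triggerBody()?['VendorCode']",
      "Lines": "@body('Select_order_lines')"
    },
    "runAfter": {
      "Select_order_lines": [ "Succeeded" ]
    }
  }
}
```

Here mul() is used only to show that expressions can derive new values (a line amount from Qty and Price); Initialize variable / Set variable actions can be layered in the same way when the mapping needs intermediate state.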

Bring data to Microsoft Fabric OneLake from D365F&O (no Azure SQL needed)

Hi friends, in one of my recent blogs I chalked out an integration strategy pattern for setting up D365F&O with Microsoft Fabric, routing the data through a DMF batch job into Azure SQL. This current blog covers an even cooler hack: direct integration between D365F&O and Microsoft Fabric, harnessing Synapse Link itself -- no additional/intermediary steps are required. Here are the steps:
Step 1: Create a Data Lake Storage account in the Azure portal, with the following settings: Don't forget to mark the following setting from the Advanced tab: Click Review + create >> Create to complete the wizard and create the Storage account.
Step 2: Go to the Power Apps page (https://make.powerapps.com/environments) and check whether you are able to view Azure Synapse Link: If not, click More >> Discover all >> you should be able to see it under Data management: Ensure that you are on the correct environment:
Step 3: On...
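For readers who prefer infrastructure-as-code over the portal wizard, a minimal ARM template sketch for Step 1 might look like the following. I am assuming the "setting from the Advanced tab" the post refers to is Enable hierarchical namespace (isHnsEnabled), which Azure Synapse Link needs on a Data Lake Storage Gen2 account; the parameter name, SKU and API version are illustrative choices, not taken from the post.

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2023-01-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "kind": "StorageV2",
      "sku": { "name": "Standard_LRS" },
      "properties": {
        "isHnsEnabled": true,
        "minimumTlsVersion": "TLS1_2"
      }
    }
  ]
}
```

Deploying a template like this is roughly equivalent to completing the Review + create step of the wizard; Steps 2 and 3 (wiring up Azure Synapse Link from the Power Apps page) still happen in the maker portal.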