Posts

Showing posts with the label LogicApps

Cool hack to transform data in Azure Logic Apps, using simple expressions (and no code)

Amigos, here goes a very cool hack for handling an incoming payload in your Azure Logic Apps that can perform any type of complex transformation, using just expressions and variables -- nothing else. I have covered transforming with the Data Mapper in a previous blog (https://community.dynamics.com/blogs/post/?postid=d7eda435-952b-ef11-840a-6045bddad7eb), but this one is even easier. Check it out.

Background: We have an incoming purchase order payload that, rather cumbersomely, looks like this:

{
  "OrderNum": "PO000111",
  "VendorCode": "v0001",
  "NoOfLines": 3,
  "Currency": "euro",
  "PaymentTerms": "14d",
  "OrderStatus": "Received",
  "Invoiced": "No",
  "Lines": [
    {
      "ItemId": "I0001",
      "Qty": 12,
      "Price": 1000,
      "LineNum": 1
    },
    {
      "ItemId":...
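To make the idea concrete outside the designer, here is a minimal Python sketch of the kind of reshaping the post describes: flattening header fields onto each order line, roughly what a chain of Logic Apps expressions over an array variable inside a For-each loop would do. The target shape, the `transform_order` helper, and the second line's sample values are my own illustration, not from the post.

```python
# Illustrative sample payload; the second line's values are made up
# for the example (the original snippet is truncated at that point).
order = {
    "OrderNum": "PO000111",
    "VendorCode": "v0001",
    "NoOfLines": 3,
    "Currency": "euro",
    "Lines": [
        {"ItemId": "I0001", "Qty": 12, "Price": 1000, "LineNum": 1},
        {"ItemId": "I0002", "Qty": 5, "Price": 250, "LineNum": 2},
    ],
}

def transform_order(payload):
    """Flatten header fields onto each line -- the same effect as
    appending composed objects to an array variable in a Logic App."""
    return [
        {
            "OrderNum": payload["OrderNum"],
            "VendorCode": payload["VendorCode"],
            "Currency": payload["Currency"],
            "ItemId": line["ItemId"],
            "Amount": line["Qty"] * line["Price"],  # line total
        }
        for line in payload["Lines"]
    ]

flat = transform_order(order)
print(flat[0]["Amount"])  # 12 * 1000 = 12000
```

The same pattern extends to any header/line reshuffle: each output property is one expression over the current loop item or the header.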

Logic Apps vs Azure Functions: a short review

One of the toughest questions in any integration project is which of the two you are going to pick, especially when both could give you equal possibilities. It's like choosing your favorite Avenger when you can see them battling each other ⚔ 🏹 πŸ‘ΊπŸ‘²πŸ‘¨

You can get a lot of suggestions across the net; here goes my list of observations on when to choose what:

πŸ“Œ  Azure Logic Apps (ALA) and Azure Functions (AFs) are both trigger-based. However, when you see that your ALA needs a lot of connectors, which are definitely not free (beyond the alluring offer of the first 3 months' subscription for free), I think you should choose Azure Functions. Let me give you one example: I had a requirement where I needed to read a PDF as soon as it arrived in a blob storage, do some manipulations to the file, and then move it to a shared folder drive once done. Here, if we choose an ALA, then we need to use a lot of the readily available connectors so as to proc...

When your Azure blob trigger-based workflow is not triggering

What could be more irritating than seeing your Azure blob trigger not getting fired, even though files are getting dumped into your Azure blob container incessantly? What could be the reason? Well, the answer is very simple:

Maximum trackable number of items inside a virtual folder for the trigger = 30,000

Which means that if your Azure blob container doesn't have a purging facility associated with it, it will eventually reach such a threshold that your workflow will simply fail to detect any blob when it gets added.

Solution:
a. Either design your blob storage to stay limited to around 1,000-2,000 files -- then it will work really, really well.
b. Or else enable auto-archiving of your Azure blob storage at a certain frequency, as documented in the following MS link: https://learn.microsoft.com/en-us/azure/storage/blobs/archive-blob?tabs=azure-portal
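As a back-of-the-envelope check, the 30,000-item ceiling translates directly into a runway you can estimate from your ingest rate. A tiny sketch (the helper name and the example rates are my own, not from any Azure API):

```python
TRIGGER_TRACKING_LIMIT = 30_000  # max trackable items per virtual folder

def days_until_trigger_stops(current_blobs, blobs_per_day):
    """Rough runway before the blob trigger goes blind, assuming a
    constant ingest rate and no purging or archiving in place."""
    remaining = TRIGGER_TRACKING_LIMIT - current_blobs
    return max(remaining, 0) // blobs_per_day

# e.g. 12,000 blobs already in the container, 500 new blobs per day
print(days_until_trigger_stops(12_000, 500))  # 36
```

If that number is uncomfortably small, that's your cue to set up the archiving policy from the link above before the workflow silently stops firing.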

Integration of D365 using Azure API and Logic Apps: part 2

I hope you all are doing great and might have gone through my earlier blog introducing Azure APIM: https://www.blogger.com/blog/post/edit/2520883737187850604/4308148657431504181

This post walks through live examples of how such a system can be effectively designed and made to work with D365.

Situation: problem statement
We have a 3rd-party logistics system that creates purchase orders in DAX, validates and creates GRNs in AX, and subsequently posts them.

Solution design
a. A batch job creates an Azure storage file in a shared Azure folder.
b. Custom AX services pick up the file and create purchase orders in D365.
c. The third party receives the purchase order in their own system and tries to replicate the same in AX. For that:
   i. ...

Table as a Service: integration design with D365 F&O -- a walkthrough

What is TAAS? Table as a Service is a very convenient way of doing high-frequency, high-volume data exchange and operations, without the need to worry about the supporting architecture, speed, and efficiency. I can have my data stored and arranged per any need, without sticking to a fixed metadata definition beforehand -- as compared to a conventional DBMS. Your table exists as an API, whereby you can call the API and perform PUT, POST, and DELETE actions on it.

Table as a Service as storage
Azure Table storage exists as patterned NoSQL data in Azure, resulting in a schema-less design. Table storage generally comprises the following components:
Accounts: a subscription account that connects all the storage offerings (blobs/files, queues, containers, tables)
Tables: a table can comprise several entities
Entities: an entity is like a row that comprises several properties
Properties: key-value pairs

Look at the following example, that comes from ...
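The account → table → entity → property layering above can be sketched as plain data. Here is a minimal in-memory Python model (the `Table` class is my own illustration, though the mandatory PartitionKey/RowKey pair is genuinely how Azure Table entities are keyed):

```python
# Illustrative in-memory model of Azure Table storage's layering.
# Real entities are addressed by a (PartitionKey, RowKey) pair; every
# other property is a loose key-value bag -- hence the schema-less design.
class Table:
    def __init__(self, name):
        self.name = name
        self._entities = {}

    def upsert(self, entity):
        """Insert or replace an entity, keyed like a PUT/POST upsert."""
        key = (entity["PartitionKey"], entity["RowKey"])
        self._entities[key] = entity

    def get(self, partition_key, row_key):
        """Point lookup by the composite key."""
        return self._entities[(partition_key, row_key)]

orders = Table("PurchaseOrders")
orders.upsert({
    "PartitionKey": "v0001",   # e.g. partition by vendor
    "RowKey": "PO000111",      # unique within the partition
    "Currency": "euro",        # arbitrary extra properties, no fixed schema
    "NoOfLines": 3,
})
print(orders.get("v0001", "PO000111")["Currency"])  # euro
```

Note how two entities in the same table could carry entirely different property sets -- that is the "no fixed metadata beforehand" point from the paragraph above.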