Notifications from Workflows

Dear community,

I would like to create automatic notification / action for specific purposes. Mainly the following:

  1. Notification once a workflow completed / failed (e.g. per mail or Webex Teams)

  2. Send the result of a workflow as CSV per mail

How can I do so?
Are there scripts that I can re-use for this purpose?

Especially for notification 1: can it be set up to notify after workflow completion for a whole project, or do I need to integrate it into each workflow?

Thanks!

Part 1 is definitely possible. In our current project, we implemented this using Outlook webhooks, which can be triggered from a Python processor and then send nicely formatted messages to a mail address or Outlook group. Example Python code for this would look like:

import requests
...

# Optional: route the request through a corporate proxy
proxyDict = {'https': '<PROXY-URL>'}
webhook_url = "https://<WEBHOOK-URL>"

# MessageCard payload sent to the Outlook webhook
json_payload = {
    "@type": "MessageCard",
    "@context": "http://schema.org/extensions",
    "themeColor": "0076D7",
    "summary": "Write a short summary here",
    "sections": [
        {
            "activityTitle": "Title of this message",
            "facts": [
                {
                    "name": "Example Text",
                    "value": "failed/success"
                }
            ]
        }
    ]
}

# Post the card to the webhook endpoint
requests.post(url=webhook_url, proxies=proxyDict, json=json_payload)

The Python processor with the webhook call can sit directly in each workflow you want to monitor. Alternatively (and answering your last question), you can also put it into a “monitoring workflow” that collects the status of a list of workflows via the ONE DATA API.
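A minimal sketch of such a monitoring step could look like the following. Note that the endpoint paths, field names and token handling here are assumptions for illustration; the actual ONE DATA API calls depend on your instance and API version.

import requests

# NOTE: all API details below are illustrative assumptions, not the actual ONE DATA API
API_BASE = "https://<ONE-DATA-HOST>/api"   # hypothetical base URL
API_TOKEN = "<API-TOKEN>"                  # hypothetical auth token
WORKFLOW_IDS = ["<WORKFLOW-ID-1>", "<WORKFLOW-ID-2>"]
webhook_url = "https://<WEBHOOK-URL>"

headers = {"Authorization": "Bearer " + API_TOKEN}

facts = []
for wf_id in WORKFLOW_IDS:
    # hypothetical endpoint returning the latest job of a workflow
    resp = requests.get(API_BASE + "/workflows/" + wf_id + "/jobs/latest", headers=headers)
    job = resp.json()
    facts.append({"name": job.get("workflowName", wf_id),
                  "value": job.get("state", "unknown")})

# reuse the MessageCard structure from above for one summary notification
json_payload = {
    "@type": "MessageCard",
    "@context": "http://schema.org/extensions",
    "themeColor": "0076D7",
    "summary": "Workflow status summary",
    "sections": [{"activityTitle": "Workflow monitoring", "facts": facts}],
}
requests.post(url=webhook_url, json=json_payload)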

Thanks for the answer!

Would you say this is a feasible approach to use in workflows that are not constantly used?
In our case, an analyst would set up a workflow, run it once, and use the results; then he or she would start a new task and therefore build a new workflow. This means he or she would have to include the script in all workflows. Would you say this is still useful, or does it rather create overhead?

The outcome we would expect to get is only “workflow finished” with status “successful” or “failed”. If we have a monitoring workflow that collects this information from the API, it would have to be run by a scheduler every few seconds, but the difficulty would be to stop the scheduler once the workflow has actually finished and the notification has been sent. Is there a way to do so?

The current assumption would be that the integration of notifications is very useful for workflows that run on a regular schedule, but rather not for “ad-hoc notifications”.

Whether it is too much overhead really depends on how the user works with ONE DATA in general. Setting up a webhook for yourself and configuring the Python processor should not take more than 10 minutes. The user can then simply copy this processor to any other workflow (not just in the current project) which he/she wants to monitor. I think this would be worth the effort if the user regularly builds and runs workflows that take more than half an hour to run, but I guess this is personal preference.

Concerning the scheduler: to my knowledge there is no feature to stop a scheduler based on events (like: workflow xy finished), but maybe there are tricks possible using the API?

One more remark: to my knowledge, the approach I suggested will only produce a notification if the workflow finishes successfully. No notification will be sent if the workflow fails before the Python processor runs.

Hi Viola,

I think Matthias’ proposal to go via Outlook webhooks is well suited to your use case.

You could set up an API call that collects information on all jobs created during the last x minutes.
In the Python script, you configure a list like 'workflow name: … state: …' that is passed on in the notification.

The scheduler can then run on a regular basis (e.g. every 30 minutes) and only send a notification if there was any activity since the last run, roughly as sketched below.
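A rough sketch of that scheduled check might look like this. Again, the jobs endpoint, query parameter and field names are assumptions; only the "collect recent jobs, notify only on activity" logic is the point.

import requests
from datetime import datetime, timedelta, timezone

# NOTE: endpoint and field names are assumptions, not the actual ONE DATA API
API_BASE = "https://<ONE-DATA-HOST>/api"
API_TOKEN = "<API-TOKEN>"
webhook_url = "https://<WEBHOOK-URL>"
LOOKBACK_MINUTES = 30  # should match the scheduler interval

since = datetime.now(timezone.utc) - timedelta(minutes=LOOKBACK_MINUTES)

# hypothetical endpoint listing jobs created after a given timestamp
resp = requests.get(API_BASE + "/jobs",
                    headers={"Authorization": "Bearer " + API_TOKEN},
                    params={"createdAfter": since.isoformat()})
jobs = resp.json()

# build the 'workflow name: ... state: ...' list for the notification
facts = [{"name": job["workflowName"], "value": job["state"]} for job in jobs]

# only send a notification if there was any activity since the last run
if facts:
    json_payload = {
        "@type": "MessageCard",
        "@context": "http://schema.org/extensions",
        "themeColor": "0076D7",
        "summary": "Workflow activity in the last %d minutes" % LOOKBACK_MINUTES,
        "sections": [{"activityTitle": "Recent workflow runs", "facts": facts}],
    }
    requests.post(url=webhook_url, json=json_payload)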

This is still a quite generic approach. Does it make sense to send the status of all workflows to the same group of recipients, or do we have to split it up so that each user only gets the workflows they built/ran? The latter could be realized with a master data table that maps users to their own Outlook group and then maps this information to the workflow owner/initiator (this mapping would have to be done in the Python script as well; see the sketch below).
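The owner-specific routing could then be a simple lookup in the Python script. The sketch below assumes the user-to-group mapping has already been loaded from the master data table into a dictionary; all names and values are illustrative.

import requests

# illustrative mapping, e.g. loaded from a master data table in ONE DATA
OWNER_TO_WEBHOOK = {
    "alice": "https://<WEBHOOK-URL-ALICE>",
    "bob": "https://<WEBHOOK-URL-BOB>",
}

def notify_owner(owner, workflow_name, state):
    """Send the status of a single workflow to its owner's Outlook group."""
    webhook_url = OWNER_TO_WEBHOOK.get(owner)
    if webhook_url is None:
        return  # no group configured for this owner
    json_payload = {
        "@type": "MessageCard",
        "@context": "http://schema.org/extensions",
        "themeColor": "0076D7",
        "summary": "Workflow status for " + owner,
        "sections": [{
            "activityTitle": workflow_name,
            "facts": [{"name": "State", "value": state}],
        }],
    }
    requests.post(url=webhook_url, json=json_payload)

# example call with values that would come from the jobs API / workflow metadata
notify_owner("alice", "<WORKFLOW-NAME>", "SUCCESS")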

What you should consider, in my opinion, are two points:

  1. Is it really beneficial to ‘spam’ the user with workflow notifications?
  2. Can your instance spare the extra resources for a (more or less) high-frequency scheduler, or will it cause performance issues?

Kind regards
Magy

Dear all,
a follow up question to the notifications:

How do we enable ONE DATA to work with the webhooks? Is there any configuration to be considered? Or does this work out of the box for all instances?

Thanks!

Hi Vio,
I would say that for the solution described above, using Python and Outlook webhooks, you need three things:

  • Python enabled in ONE DATA
  • ONE DATA has to have access to the internet
  • An Outlook webhook created: I guess you would need a dedicated Outlook group for that

Best Christoph
