How to Automate Your Content Pipeline with AI
Consider this a blog post to feed the RSS for my AI Use Cases Podcast. A post like this can be generated automatically through a workflow in Make or Gumloop.
The idea is to generate the content automatically through a pipeline I set up. When I find an article I like anywhere, I add its URL to a database through some kind of input. I could literally create an automation on my iPhone that sends the link to my email; from the email, I could extract the URL, save it to a Google Sheet row, and that new row would trigger the automation in Make or Gumloop.
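If I ever wanted to wire that intake step myself rather than relying on Make or Gumloop's built-in email trigger, a minimal sketch might look like this (the spreadsheet name, credentials file, and URL regex are assumptions, not part of the actual pipeline):

```python
import re
from datetime import datetime, timezone

import gspread  # Google Sheets client, authorized via a service account


def url_from_email(body: str) -> str | None:
    """Pull the first http(s) link out of a forwarded email body."""
    match = re.search(r"https?://\S+", body)
    return match.group(0) if match else None


def save_to_sheet(url: str) -> None:
    """Append the article URL as a new row; the new row is what triggers the Make/Gumloop flow."""
    gc = gspread.service_account(filename="service_account.json")  # assumed credentials file
    sheet = gc.open("Article Intake").sheet1                       # assumed spreadsheet name
    sheet.append_row([url, datetime.now(timezone.utc).isoformat(), "new"])


body = "Check this out: https://example.com/some-article"
link = url_from_email(body)
if link:
    save_to_sheet(link)
```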
I could also have a Gumloop UI app that just lets me copy and paste the link and send it on to the same spreadsheet.
Once the spreadsheet row is updated, another automation would be triggered to pick it up and process it. Here are some of the things it could do for me:
- First, we scrape the article to make sure we have the full content (see the scraping sketch after this list).
- If the article is behind a paywall, we can try to use 12ft.io to scrape it.
- We would then extract the authors of each article and add them to our CRM database after enriching them with another Flow.
- We would also extract everyone interviewed in each article and add them as well.
- We would then post each article to Ghost as a new blog post (see the Ghost publishing sketch after this list).
- From the Ghost blog's RSS feed, we could feed Podcast AI.
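Here is a rough sketch of the scraping step, assuming requests and BeautifulSoup; the 12ft.io proxy URL format and the paywall heuristic are assumptions, and reliably spotting interviewees would probably need an extra LLM extraction step rather than meta tags:

```python
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0 (content-pipeline-bot)"}


def fetch_html(url: str) -> str:
    """Fetch the article HTML, falling back to 12ft.io if the page looks paywalled."""
    resp = requests.get(url, headers=HEADERS, timeout=30)
    # Crude paywall heuristic: a blocked status code or a suspiciously short body.
    if resp.status_code in (401, 402, 403) or len(resp.text) < 2000:
        resp = requests.get(f"https://12ft.io/{url}", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.text


def parse_article(url: str) -> dict:
    """Pull out the title, author names, and body text from common tags."""
    soup = BeautifulSoup(fetch_html(url), "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else url
    # Authors are often exposed via <meta name="author"> tags.
    authors = [m.get("content", "") for m in soup.find_all("meta", attrs={"name": "author"})]
    body = "\n\n".join(p.get_text(" ", strip=True) for p in soup.find_all("p"))
    return {"url": url, "title": title, "authors": [a for a in authors if a], "body": body}


article = parse_article("https://example.com/some-article")
```

The authors list from this step is what would get pushed into the CRM enrichment Flow before anything gets published.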
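And a sketch of the Ghost step, assuming the Ghost Admin API with an Admin API key and the PyJWT library; the site URL, key placeholder, and draft status are assumptions:

```python
import time

import jwt  # PyJWT, used to sign the short-lived Admin API token
import requests

GHOST_URL = "https://example.ghost.io"  # assumption: your Ghost site URL
ADMIN_KEY = "<key-id>:<key-secret>"     # from Ghost Admin -> Integrations


def ghost_token(admin_key: str) -> str:
    """Build the short-lived JWT the Ghost Admin API expects."""
    key_id, secret = admin_key.split(":")
    iat = int(time.time())
    return jwt.encode(
        {"iat": iat, "exp": iat + 5 * 60, "aud": "/admin/"},
        bytes.fromhex(secret),
        algorithm="HS256",
        headers={"kid": key_id},
    )


def publish_to_ghost(title: str, html: str, status: str = "draft") -> dict:
    """Create a post from the scraped HTML; source=html asks Ghost to convert the markup."""
    resp = requests.post(
        f"{GHOST_URL}/ghost/api/admin/posts/?source=html",
        json={"posts": [{"title": title, "html": html, "status": status}]},
        headers={"Authorization": f"Ghost {ghost_token(ADMIN_KEY)}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["posts"][0]
```

Posting as a draft keeps a human review step in the loop before anything reaches the RSS feed.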
Let's stop here and see how far we can get with this automation. Alternatively, we can post the RSS-reader fields of each article to an Airtable or Notion DB. From there we can flag articles as ready to post, which could kick off a flow that generates an RSS feed from the DB (sketched below). We should explore this so we don't necessarily have to lock into Ghost, but we could still use the Ghost platform to automatically generate our own internal blog posts to feed our newsletter.
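A minimal sketch of that DB-to-feed idea, assuming the Airtable REST API; the token, base ID, table name, field names, and "Ready" status value are all assumptions:

```python
from datetime import datetime, timezone
from email.utils import format_datetime
from xml.sax.saxutils import escape

import requests

AIRTABLE_TOKEN = "<personal-access-token>"  # assumption
BASE_ID = "appXXXXXXXXXXXXXX"               # assumption
TABLE = "Articles"                          # assumption


def ready_records() -> list[dict]:
    """Fetch rows flagged 'Ready' from the Airtable base (field names are assumptions)."""
    resp = requests.get(
        f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}",
        headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
        params={"filterByFormula": "{Status} = 'Ready'"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]


def build_rss(records: list[dict]) -> str:
    """Render a bare-bones RSS 2.0 feed from the DB rows."""
    items = []
    for rec in records:
        f = rec["fields"]
        items.append(
            "<item>"
            f"<title>{escape(f.get('Title', ''))}</title>"
            f"<link>{escape(f.get('URL', ''))}</link>"
            f"<description>{escape(f.get('Summary', ''))}</description>"
            f"<pubDate>{format_datetime(datetime.now(timezone.utc))}</pubDate>"
            "</item>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<rss version="2.0"><channel>'
        "<title>Internal article feed</title>"
        "<link>https://example.com/feed</link>"
        "<description>Articles flagged ready to post</description>"
        + "".join(items)
        + "</channel></rss>"
    )


print(build_rss(ready_records()))
```

The resulting XML just needs to be written somewhere publicly reachable (a static host, object storage, etc.) so Podcast AI or a newsletter tool can pick it up.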
Otherwise we could use a Notion DB to feed our Beehiiv newsletter, but I'm not sure if that integration exists yet. Maybe Feather?