<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Airy Blog]]></title><description><![CDATA[AI Agents & Copilots. ‍Powered by real-time data.]]></description><link>https://blog.airy.co/</link><image><url>https://blog.airy.co/favicon.png</url><title>Airy Blog</title><link>https://blog.airy.co/</link></image><generator>Ghost 4.2</generator><lastBuildDate>Wed, 08 Apr 2026 07:43:23 GMT</lastBuildDate><atom:link href="https://blog.airy.co/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Airy is joining Confluent]]></title><description><![CDATA[<p><br><em>San Francisco / Berlin &#x2014; October 29, 2025.</em></p><p>We&#x2019;re excited to announce that <strong>Airy&apos;s leadership team is joining Confluent</strong>, the world&#x2019;s leading data streaming platform founded by the creators of Apache Kafka.</p><p>From day one, our mission at Airy has been simple but ambitious: to</p>]]></description><link>https://blog.airy.co/airy-is-joining-confluent/</link><guid isPermaLink="false">690240b6217001046822e8bf</guid><dc:creator><![CDATA[Steffen Hoellinger]]></dc:creator><pubDate>Wed, 29 Oct 2025 16:31:26 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2025/10/Screenshot-2025-10-29-at-12.37.09.png" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2025/10/Screenshot-2025-10-29-at-12.37.09.png" alt="Airy is joining Confluent"><p><br><em>San Francisco / Berlin &#x2014; October 29, 2025.</em></p><p>We&#x2019;re excited to announce that <strong>Airy&apos;s leadership team is joining Confluent</strong>, the world&#x2019;s leading data streaming platform founded by the creators of Apache Kafka.</p><p>From day one, our mission at Airy has been simple but ambitious: to provide 
the <strong>perfect context for AI models at inference</strong> by processing data continuously.</p><p>We&#x2019;ve built <strong>Schema Intelligence</strong> and <strong>AI Copilots</strong> to translate natural language into Flink SQL &#x2014; enabling teams and AI Agents to understand, reason about, and act on data in motion.</p><p>Joining Confluent marks the next chapter of that journey. Together, we&#x2019;ll help accelerate the convergence of <strong>stream processing and AI</strong>, enabling organizations not only to process data continuously but also to make <strong>AI Agents context-aware and event-driven</strong> &#x2014; understanding schemas and ontologies, considering data lineage and governance, and making stream processing in Flink much more accessible through natural language.</p><p>The future of data and AI is <strong>real-time, intelligent, and context-aware</strong> &#x2014; and we can&#x2019;t wait to help shape it within Confluent.</p><p><strong><strong><a href="https://www.confluent.io/blog/2025-q4-confluent-cloud-launch/">Confluent&apos;s Statement on upcoming AI Features: Data in Motion for AI in Action</a></strong></strong><br><br></p>]]></content:encoded></item><item><title><![CDATA[Airy Co-pilot: Democratizing stream processing capabilities for the enterprise]]></title><description><![CDATA[Airy Co-pilot offers a practical approach to data management, enabling real-time streaming and natural language interaction. 
It aims to simplify data analysis and visualization, helping teams make informed decisions in today's data-driven landscape.]]></description><link>https://blog.airy.co/airy-co-pilot-democratize-streaming/</link><guid isPermaLink="false">653be0e7217001046822e847</guid><dc:creator><![CDATA[Aitor Algorta]]></dc:creator><pubDate>Fri, 27 Oct 2023 16:23:07 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2023/10/Screenshot-2023-09-24-at-00.13.10-1.png" medium="image"/><content:encoded><![CDATA[<h3 id="the-stream-becomes-the-source-of-truth">The Stream becomes the Source of Truth</h3><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2023/10/Screenshot-2023-10-27-at-18.15.06.png" class="kg-image" alt="Airy Co-pilot: Democratizing stream processing capabilities for the enterprise" loading="lazy" width="1352" height="474" srcset="https://blog.airy.co/content/images/size/w600/2023/10/Screenshot-2023-10-27-at-18.15.06.png 600w, https://blog.airy.co/content/images/size/w1000/2023/10/Screenshot-2023-10-27-at-18.15.06.png 1000w, https://blog.airy.co/content/images/2023/10/Screenshot-2023-10-27-at-18.15.06.png 1352w" sizes="(min-width: 720px) 720px"><figcaption>The LLM knows about the streaming infrastructure &amp; can translate the prompt to stream processing tasks and enable the (disposable) deployment of apps running Kafka Streams or Apache Flink&#xAE; jobs through Airy.</figcaption></figure><img src="https://blog.airy.co/content/images/2023/10/Screenshot-2023-09-24-at-00.13.10-1.png" alt="Airy Co-pilot: Democratizing stream processing capabilities for the enterprise"><p>In an era where businesses are inundated with a diverse array of data and increasingly reliant on AI, the challenge of effectively managing, analyzing, and extracting insights from structured, semi-structured, and unstructured data has never been greater.</p><p>This is where Airy comes in as an open-source app framework to empower 
developers and data engineering teams to build customizable co-pilot applications for their relevant organizations on their own data with Airy Co-pilot. Thereby, they enable business users and technical stakeholders alike to interact with streaming data in the most simple way: via natural language.</p><h3 id="technical-requirements-and-llm-integration">Technical Requirements and LLM Integration</h3><p>The development of Airy Co-pilot is grounded in several essential technical components:</p><ol><li><strong>Developing a Frontend Application</strong>: Our objective is to craft a user-friendly frontend application under &quot;frontend/co-pilot&quot; that supports real-time data streaming, offering a robust and intuitive interface.</li><li><strong>Vector Database and CI Integration</strong>: We&apos;re focused on ensuring a smooth interaction between the frontend, LLM and vector database endpoints, complemented by a reliable continuous integration pipeline. </li><li><strong>LLM Integration for Enhanced Data Interaction</strong>: Integrating LLM technology is a strategic move to boost the application&#x2019;s capabilities in data analysis and streaming, providing a more dynamic and responsive user experience.</li></ol><h2 id="the-solution-airy-co-pilot">The Solution: Airy Co-pilot</h2><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/10/-MOV-to-GIF.gif" class="kg-image" alt="Airy Co-pilot: Democratizing stream processing capabilities for the enterprise" loading="lazy" width="2556" height="1320"></figure><h3 id="key-features">Key Features</h3><ul><li><strong>Conversational Data Interaction</strong>: The application features a conversational interface, allowing users to engage with data in a more natural and intuitive manner.</li><li><strong>Actionable Insights and Real-Time Data Access</strong>: Providing immediate access to data and insights, this feature is designed to support prompt and informed 
decision-making.</li><li><strong>Data Visualization</strong>: With data visualization capabilities, Airy Co-pilot transforms complex data sets into clear and comprehensible visuals, aiding in the interpretation and presentation of data insights.</li></ul><h3 id="benefits-for-companies">Benefits for Companies</h3><ul><li><strong>Simplified Data Access and Streaming</strong>: Airy Co-pilot simplifies data access and streaming, notably with its translation capabilities from natural language into FlinkSQL and Kafka Streams powered applications, making it a valuable asset for businesses.</li><li><strong>Efficient Decision-Making</strong>: The application is crafted to streamline access to insights, facilitating efficient and effective decision-making processes.</li><li><strong>Increased Productivity</strong>: By easing data streaming and translation tasks, Airy Co-pilot is geared towards boosting productivity and saving valuable time and resources.</li><li><strong>Improved Data Literacy</strong>: The application aims to make data analysis more approachable and user-friendly, contributing to enhanced data literacy especially among business users, removing the need to rely on data engineers for simple stream processing tasks.</li></ul><h3 id="conclusion">Conclusion</h3><p>Airy Co-pilot is a thoughtful response to the complexities of data management and streaming. It&apos;s designed to offer businesses a reliable way to navigate their data efficiently. Our commitment is to its ongoing development, focusing on enabling teams to build streaming applications that genuinely enhance their interaction with data.</p>]]></content:encoded></item><item><title><![CDATA[Kafka as a Single Source of Truth]]></title><description><![CDATA[<p>Apache Kafka is the most popular distributed event streaming platform mostly used for transforming and moving the data between different systems. 
We can find plenty of use cases for data pipelines, mission-critical event driven systems, data integrations and transformations, but not many platforms use Kafka as a single database. That</p>]]></description><link>https://blog.airy.co/kafka-as-a-single-source-of-truth/</link><guid isPermaLink="false">64020ea9217001046822e4ce</guid><dc:creator><![CDATA[Ljupco Vangelski]]></dc:creator><pubDate>Thu, 27 Apr 2023 08:14:53 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2023/05/streams-3.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2023/05/streams-3.jpg" alt="Kafka as a Single Source of Truth"><p>Apache Kafka is the most popular distributed event streaming platform mostly used for transforming and moving the data between different systems. We can find plenty of use cases for data pipelines, mission-critical event driven systems, data integrations and transformations, but not many platforms use Kafka as a single database. That is why we decided to share our decisions why we choose Kafka to be the single source of truth for Airy, as well as the experiences that we had along our journey.</p><!--kg-card-begin: markdown--><h2 id="airy-requirements-concepts">Airy requirements &amp; concepts</h2>
<p>To start, it is fair that we first explain our platform requirements and development principles. Four years ago we were fortunate to start working on a completely new streaming platform, determined to transform conversational experiences. As we wanted to connect conversational, business and machine learning systems in real-time, we had a few requirements in place for our platform:</p>
<ul>
<li>It should be an event-driven system</li>
<li>There needs to be a microservice architecture &amp; service independence</li>
<li>The data should be in order and easily re-processable</li>
<li>There should be a single data storage for all the event data</li>
<li>We should be able to add business logic in the stream and in real-time</li>
<li>The platform should be able to scale</li>
</ul>
<p>Naturally, the choice for the core of the system came down to Apache Kafka, as it fits all of these requirements. For adding business logic in the stream we chose to rely on the <a href="https://docs.confluent.io/platform/current/streams/index.html">Kafka Streams library</a> for building our components. As the infrastructure where everything should run, we opted for Kubernetes, as it is a perfect environment for deploying microservices that can scale both horizontally and vertically, with lots of other enterprise-grade features that are independent of any cloud provider.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/technologies.jpg" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="562" height="124"></figure><hr><!--kg-card-begin: markdown--><h2 id="single-source-of-truth">Single source of truth</h2>
<p>As we needed data from different places, from the beginning we were determined to have a <strong>single source of truth</strong> and avoid having data scattered across systems where we could not read and process it in a standardized way. But what is all the data that we need to store in a platform like Airy? First is all the <strong>conversational data</strong>, meaning all the messages and events ingested from the various conversational platforms such as Facebook, Google, Instagram, Twilio, SMS and so on. In order to design automations and enrich the conversations we also need to fetch data from various <strong>business systems</strong>, for example Zendesk, Salesforce, purchase data or other customer databases.<br>
The same datastore should also hold <strong>configuration settings</strong> about the connectors and preferences for the users using the platform. It should be able to also store <strong>structured contact data</strong> about the contacts that are being ingested and aggregated over different channels.<br>
Last but not least the same data store should hold all the <strong>logs and events</strong> that are generated by the applications, including metrics, monitoring data and telemetry.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/single-source-on-top.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="1913" height="1304" srcset="https://blog.airy.co/content/images/size/w600/2023/03/single-source-on-top.png 600w, https://blog.airy.co/content/images/size/w1000/2023/03/single-source-on-top.png 1000w, https://blog.airy.co/content/images/size/w1600/2023/03/single-source-on-top.png 1600w, https://blog.airy.co/content/images/2023/03/single-source-on-top.png 1913w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><p>One would legitimately ask: <strong>But is Kafka the appropriate data store for all that?</strong> We figured out: yes! We should have all the data on top of which we want to build real-time business logic in Kafka. This way we are flexible with the variety of features and applications we can add on top, while using a standard way to access and process the data. In a way we are able to transparently <strong>treat all the data the same</strong>, regardless of where it comes from and what it should trigger.</p>
<p>Even though there is an option to combine Kafka with other relational and non-relational databases, we would then have the same information in multiple places, it would be hard to determine which copy is the source of truth, and unifying the development process would become harder. And, to be honest, Kafka Streams helped our decision, because it is a great library for running stateless and stateful operations on top of the data in Kafka, in real time.</p>
<p>So, to wrap it up, all data lives in Kafka. Everything goes into the stream and that is where every app reads from to build its own <strong>version of reality</strong>, a very important principle for achieving complete service independence. The data can also be seen as a strongly typed (via <a href="https://avro.apache.org/">Avro</a>) data pipeline. Let&apos;s see now how this pipeline powers all the apps and services.</p>
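To make the "version of reality" idea concrete, here is a minimal sketch in plain Python (not our actual implementation: a list stands in for a Kafka topic and the event fields are made up). Two independent apps read the same log and each derives only the state it cares about.

```python
# A shared event log, standing in for a Kafka topic.
log = [
    {"conversation": "c1", "event": "message", "text": "hi"},
    {"conversation": "c1", "event": "status", "state": "open"},
    {"conversation": "c2", "event": "message", "text": "hello"},
    {"conversation": "c1", "event": "status", "state": "closed"},
]

def message_counts(events):
    # One app's version of reality: how many messages per conversation.
    counts = {}
    for e in events:
        if e["event"] == "message":
            counts[e["conversation"]] = counts.get(e["conversation"], 0) + 1
    return counts

def latest_status(events):
    # Another app's version of reality: the last known status per conversation.
    status = {}
    for e in events:
        if e["event"] == "status":
            status[e["conversation"]] = e["state"]
    return status
```

Neither app depends on the other; both can be reset and rebuilt from the stream at any time, which is exactly what service independence buys us.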
<!--kg-card-end: markdown--><hr><!--kg-card-begin: markdown--><h2 id="microservice-design-architecture">Microservice design &amp; architecture</h2>
<p>Airy is built out of Kafka Streams apps called components. They can be of three types:</p>
<ul>
<li><strong>Transformers</strong> - Consume data from topics, perform stateful and stateless transformations and write to other topics.</li>
<li><strong>Services</strong> - Consume data from topics, perform transformations, hold the state in internal topics and expose endpoints.</li>
<li><strong>Connectors</strong> - Ingest data from external systems into Kafka and send data from Kafka to external systems.</li>
</ul>
<p>We chose to build our own <code>Airy connectors</code> rather than build on top of <code>kafka-connect</code> because we needed:</p>
<ul>
<li>Sending and receiving data within the same connector (instead of separate <code>sink</code> and <code>source</code> connectors).</li>
<li>Adding business logic inside the connector, not simply moving the data from one place to another.</li>
</ul>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/Components_apps-1.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="1491" height="653" srcset="https://blog.airy.co/content/images/size/w600/2023/03/Components_apps-1.png 600w, https://blog.airy.co/content/images/size/w1000/2023/03/Components_apps-1.png 1000w, https://blog.airy.co/content/images/2023/03/Components_apps-1.png 1491w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><p>For example, as illustrated in the image above, a message is injected into Kafka through a microservice in a connector. Afterwards, another microservice in the same connector reads this data, transforms it and writes it to another topic. This event is a trigger for other components to do something with this data. Here the true power of Kafka comes into play, as we can have components with higher priority to which we dedicate more resources, alongside components with a lower priority that read the same data on <code>best-effort</code> workloads.</p>
<p>Every microservice has its own <code>topology</code>, which is a representation of the relations between the data that is read and the operations performed on that data. As mentioned above, these operations can be stateful, such as <code>joins</code> and <code>aggregations</code>, and their results are stored in internal topics. When the schema of the resulting topics needs to change, or we need to change the topology, these Kafka Streams apps need to be reset and the data in these internal topics is deleted. The microservice then starts to re-process the data again, from the <code>beginning</code>, from <code>latest</code> or from a specific <code>offset</code>.</p>
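The reset-and-reprocess cycle can be sketched in a few lines of plain Python (the list-as-topic and the `count_by_key` step are illustrative stand-ins, not the Kafka Streams API): state is rebuilt simply by replaying records from a chosen offset through the processing step.

```python
def replay(log, apply, state, start_offset=0):
    # Feed every record from start_offset onward through the
    # topology's processing step, rebuilding the state.
    for offset in range(start_offset, len(log)):
        state = apply(state, log[offset])
    return state

def count_by_key(state, record):
    # A toy stateful operation: count records per key.
    key, _value = record
    state[key] = state.get(key, 0) + 1
    return state

log = [("a", 1), ("b", 2), ("a", 3)]
from_beginning = replay(log, count_by_key, {})     # full reprocessing
from_offset = replay(log, count_by_key, {}, 2)     # resume at offset 2
```

The point of the sketch: because the stream holds everything, the derived state is disposable and can always be recomputed, at the cost of reprocessing time.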
<p>One important aspect of our HTTP endpoints of services is that they are built on top of a very useful feature of Kafka Streams called <a href="https://kafka.apache.org/documentation/streams/developer-guide/interactive-queries.html">interactive queries</a>. As the stateful operation on multiple topics is performed and the state is stored into internal topics, interactive queries allow this structured data to be exposed to the outside world.</p>
<!--kg-card-end: markdown--><hr><!--kg-card-begin: markdown--><h2 id="learnings">Learnings</h2>
<p>Of course everything looks great on the design whiteboard, but unfortunately real-time and scalable event-driven platforms don&apos;t run on whiteboards.</p>
<p>Here are some of the obstacles we hit along our journey and the lessons we learned.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h3 id="learning-1-endpoint-performance">Learning 1: Endpoint performance</h3>
<p>The services behind the endpoints need to be fast. As explained before, this data is loaded from the internal topics, which can take time for larger data sets. Fortunately, interactive queries support local and remote state stores, which means a local state is kept on the workload in the form of a cached version of the data created by the particular topology. To increase performance, which is particularly important when the microservices start, it is useful to run these workloads as StatefulSets in Kubernetes, provisioning a persistent volume that holds the cached version of the data from the internal topics. As Kafka Streams uses RocksDB as its default storage for local state, making this storage persistent helps improve endpoint performance.</p>
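The effect of the persistent volume can be shown with a small sketch (plain Python; `restore`, the changelog list and the checkpoint values are hypothetical stand-ins for RocksDB and its changelog topic): with a checkpointed snapshot, startup only replays the records written after the checkpoint instead of the whole topic.

```python
def restore(changelog, checkpoint_state=None, checkpoint_offset=0):
    # Rebuild the local state store: start from the checkpointed
    # snapshot (empty on a cold start) and replay only the changelog
    # records written after the checkpoint offset.
    state = dict(checkpoint_state or {})
    replayed = 0
    for key, value in changelog[checkpoint_offset:]:
        state[key] = value
        replayed += 1
    return state, replayed

changelog = [("u1", "a"), ("u2", "b"), ("u1", "c"), ("u3", "d")]

# Cold start (fresh volume): everything is replayed.
cold_state, cold_replayed = restore(changelog)
# Warm start (persistent volume with a checkpoint at offset 3):
# only the suffix is replayed, yet the final state is identical.
warm_state, warm_replayed = restore(changelog, {"u1": "c", "u2": "b"}, 3)
```

Both paths converge on the same state; the persistent volume just makes the warm path dramatically shorter, which is what speeds up endpoint availability after a restart.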
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/KOIN3z.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="381" height="261"></figure><!--kg-card-begin: markdown--><h3 id="learning-2-changing-the-schema">Learning 2: Changing the schema</h3>
<p>The schema holds the structure of the data and, when working with Kafka, it can be stored in a separate service called the <code>Schema Registry</code>. Changing the schema of the data is inevitable over time. But as data with the old schema is already in the internal topics of the Kafka Streams apps and in the RocksDB cache, this can be a painful process. If we want to change the schema of a microservice, the usual procedure is:</p>
<ul>
<li>Scale down the StatefulSets</li>
<li>Reset the kafka-streaming-app</li>
<li>Delete the persistent storage</li>
<li>Scale up the workloads</li>
<li>Wait for the reprocessing</li>
</ul>
<p>But going through this procedure means downtime, and depending on the amount of data and the time it takes to reprocess it, the downtime can be quite significant. To mitigate it, we can leverage the Service or LoadBalancer feature of Kubernetes and deploy a new app that introduces the new schema while the first app is still running.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/statefulsets-change-schema.drawio.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="561" height="261"></figure><!--kg-card-begin: markdown--><p>The new microservice however needs to have a different name for the consumer group in Kafka because that is how Kafka distinguishes different consumers.</p>
<p>Even though there is now no pressure to reprocess the data for <code>App v11</code>, as <code>App v10</code> is still running, there might be a case where we need to fix a bug in the existing app. In that case, it is important that the second app consumes all the needed topics and becomes ready to serve clients faster. We can achieve this by using workloads with more CPU and memory only during the period while the data is reprocessed and the Kafka Streams app gets into a <code>RUNNING</code> state. One way to do this is to create very powerful compute nodes in the cloud (we call them <code>boost-k8s-nodes</code>) on the fly and join them to the Kubernetes cluster. Then we set more resources and label the <code>App v11</code> workloads so that they get scheduled on the new nodes. Once the reprocessing of the messages is done, we can scale down the resources for the new app and deploy it on regular nodes. It is important not to forget to destroy the <code>boost-k8s-nodes</code>, because they can lead to large costs.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h3 id="learning-3-unbalanced-partitions">Learning 3: Unbalanced partitions</h3>
<p>One more thing that we learned is that it is important to choose the correct keys for the data in Kafka. Since one Kafka broker is responsible for a particular topic-partition, having some keys that receive lots of data will lead to unbalanced partitions and unbalanced load distribution.<br>
For example, if we key messages with <code>organization_id</code> and some organizations receive the majority of the data, this can cause performance problems on the broker side. As data for the same key is written to the same topic-partition, some brokers will need to work very hard when writing and reading the data (for these organizations) while others sit idle. The microservices that consume a topic will also be blocked, as they consume the topic from different partitions, and the response time of the Kafka brokers under load will impact the overall responsiveness of the system.</p>
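The skew is easy to see in a small simulation (Python; `crc32` stands in for Kafka's murmur2-based default partitioner, and the key distribution is made up): when one key dominates the traffic, one partition takes almost all the load.

```python
import zlib
from collections import Counter

def partition_for(key, num_partitions):
    # Stand-in for Kafka's default partitioner: a deterministic hash
    # of the key, so the same key always lands on the same partition.
    return zlib.crc32(key.encode()) % num_partitions

# One organization produces 90% of the traffic.
keys = ["org-big"] * 90 + ["org-%d" % i for i in range(10)]

load = Counter(partition_for(key, 6) for key in keys)
hottest = max(load.values())   # the partition owning org-big's data
```

With 6 partitions, the broker leading `org-big`'s partition handles at least 90 of the 100 records while the rest share the remainder, which is exactly the imbalance described above.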
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/unbalanced.partitions.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="631" height="490" srcset="https://blog.airy.co/content/images/size/w600/2023/03/unbalanced.partitions.png 600w, https://blog.airy.co/content/images/2023/03/unbalanced.partitions.png 631w"></figure><!--kg-card-begin: markdown--><h3 id="learning-4-webhook-events">Learning 4: Webhook events</h3>
<p>Our platform uses a webhook component to send events to external systems. As we cannot assume that these systems will be idempotent, these events must be sent only once. However, as the webhook events are read from topics inside Kafka, when there is an outage or during broker upgrades, information about the current offset of these topics can be lost. Even though the webhook component consumes from the <code>latest</code> messages in a topic, if there is an inconsistency in the offsets it can happen that a message is sent multiple times.<br>
This problem can be particularly hard to debug and investigate, as we would like to offload to Kafka the responsibility of keeping the correct offset for every topic. To mitigate this problem and avoid inconveniences such as triggering a CRM system with the same event multiple times, we can add some application logic to the webhook deployment and double-check on the webhook component&apos;s side that a certain event should be sent.</p>
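A minimal sketch of such a double-check (plain Python; `WebhookSender` and the injected `deliver` callable are hypothetical, and a production version would have to persist the seen ids across restarts): the component remembers which event ids it has already delivered and drops duplicates.

```python
class WebhookSender:
    # Application-level idempotency guard for webhook delivery.

    def __init__(self, deliver):
        self.deliver = deliver   # stand-in for the actual HTTP call
        self.sent_ids = set()    # must survive restarts in production

    def send(self, event):
        if event["id"] in self.sent_ids:
            return False         # duplicate after an offset reset: skip
        self.deliver(event)
        self.sent_ids.add(event["id"])
        return True

delivered = []
sender = WebhookSender(delivered.append)
sender.send({"id": "e1"})
sender.send({"id": "e1"})   # re-delivery attempt is dropped
```

Even if Kafka re-delivers the record, the external system sees the event exactly once.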
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/webhooks.drawio.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="845" height="361" srcset="https://blog.airy.co/content/images/size/w600/2023/03/webhooks.drawio.png 600w, https://blog.airy.co/content/images/2023/03/webhooks.drawio.png 845w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><h3 id="learning-5-starting-up-all-at-once">Learning 5: Starting up all at once</h3>
<p>Kafka Streams apps require lots of CPU at startup. At first they go into a <code>REBALANCING</code> state to check the consistency of the data, either in the RocksDB cache or in the internal topics. Once the data is there and the partition assignments for consuming new data are completed, these apps go into a <code>RUNNING</code> state and the CPU usage drops. At the same time there is a spike in CPU usage on the Kafka side, as the brokers need to respond to all of the requests from the apps getting ready to start. If we have hard limits on the total resources that the workloads can consume, then when everything starts at once (for example during upgrades, or when a pool of Kubernetes nodes dies) the workloads may not get the CPU they need to reach a <code>RUNNING</code> state.</p>
<p>One way we managed to mitigate this behavior is by using <code>init containers</code> in the Kubernetes workloads to check whether an app can start at all (for example, whether Kafka is ready). We can configure our CD pipeline not to upgrade all the components at the same time, but this does not help when some Kubernetes nodes get replaced and Kubernetes schedules all their workloads onto another node. In this case, creating (and testing) an <code>auto-scaling</code> configuration for the particular group of Kubernetes resources is helpful, as the workloads can get more resources when they need them.</p>
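The init-container check boils down to a bounded retry loop; a sketch in Python (the `probe` callable is an injected stand-in for an actual Kafka health check, and the names are illustrative):

```python
import time

def wait_until_ready(probe, attempts=5, delay=0.01):
    # Block startup until the dependency reports ready, giving up
    # after a bounded number of attempts so the pod can be rescheduled.
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

# Simulated dependency that becomes ready on the third check.
responses = iter([False, False, True])
ready = wait_until_ready(lambda: next(responses))
```

Gating startup this way stops a fleet of apps from hammering brokers that are not yet able to serve them.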
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h3 id="learning-6-backup">Learning 6: Backup</h3>
<p>All the data is in one place, right? So in that sense, <code>backup</code> is easy: as long as we have a backup of Kafka, we are good. Unfortunately it is not so simple.</p>
<p>Let&#x2019;s assume the most straightforward scenario, where the data in Kafka sits on persistent volumes (for example EBS volumes in AWS). We need to make periodic snapshots of these volumes, and that is the easy part. <code>Restore</code> is not so easy. If we need to restore the data to a certain point in time we need to: 1) Stop all the brokers, 2) Restore all of the snapshots from the same backup cycle, 3) Start all the brokers at once, 4) Wait for them to rebalance and get into a running state.</p>
<p>This will restore all the data in Kafka to a certain point in time. But what if we want to restore only part of the data or only one topic? Then this solution doesn&#x2019;t work anymore. If this is needed it would be best to make use of <code>kafka-connect</code> and stream all the topics independently to external storage. One example is the <code>s3 sink connector</code> that will write all the data in real-time to an S3 bucket. Note that not all connectors support the same ordering of the data, so if this is important you may want to choose a connector that can guarantee that.</p>
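Selective restore from such a per-topic archive can be sketched as follows (Python; the dict-as-S3-bucket and the `produce` callable are illustrative stand-ins for the sink connector's output and a Kafka producer):

```python
# An S3-sink-style archive: one ordered sequence of records per topic.
archive = {
    "contacts": [("k1", "alice"), ("k2", "bob")],
    "messages": [("m1", "hi"), ("m2", "hello")],
}

def restore_topic(archive, topic, produce):
    # Re-produce every archived record of a single topic, in order,
    # without touching broker volumes or any other topic.
    for key, value in archive.get(topic, []):
        produce(topic, key, value)

rebuilt = []
restore_topic(archive, "contacts", lambda t, k, v: rebuilt.append((k, v)))
```

Because each topic is archived independently, one topic can be rebuilt while the rest of the cluster keeps running, which the snapshot approach cannot offer.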
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/backup-data.drawio.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="857" height="421" srcset="https://blog.airy.co/content/images/size/w600/2023/03/backup-data.drawio.png 600w, https://blog.airy.co/content/images/2023/03/backup-data.drawio.png 857w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><h3 id="learning-7-tools">Learning 7: Tools</h3>
<p>Tooling is very important for maintaining every platform and Airy is not an exception. But as all of the data is in Kafka we need tools that can interact with Kafka and help us with:</p>
<ul>
<li>Checking the state of the brokers</li>
<li>Looking into the data in the topics</li>
<li>Searching for a particular key/value in a topic</li>
<li>Looking into the state of the consumer groups and checking the lag</li>
<li>Checking the status of Kafka Connect</li>
<li>Viewing the schema for a particular topic</li>
<li>Seeing the rate at which data is written to / read from Kafka</li>
</ul>
<p>As a complete turn-key solution, <a href="https://www.confluent.io/confluent-cloud/">Confluent Cloud</a> is very useful for getting clear insight into the data in Kafka. Clusters can be created and destroyed easily with Terraform, as can topics, access lists and service accounts. However, this only works if your Kafka clusters run in Confluent.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/M0YN33.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="1391" height="899" srcset="https://blog.airy.co/content/images/size/w600/2023/03/M0YN33.png 600w, https://blog.airy.co/content/images/size/w1000/2023/03/M0YN33.png 1000w, https://blog.airy.co/content/images/2023/03/M0YN33.png 1391w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><p>If you choose to run and manage your own Kafka, then the first tool everyone should have is <a href="https://akhq.io/">AKHQ</a>. It can be deployed easily with Helm in Kubernetes, needs access to the Kafka cluster, and offers great insight into the state of the data and the platform.</p>
<!--kg-card-end: markdown--><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2023/03/akhqScreenshot-from-2023-03-06-16-56-12.png" class="kg-image" alt="Kafka as a Single Source of Truth" loading="lazy" width="1695" height="758" srcset="https://blog.airy.co/content/images/size/w600/2023/03/akhqScreenshot-from-2023-03-06-16-56-12.png 600w, https://blog.airy.co/content/images/size/w1000/2023/03/akhqScreenshot-from-2023-03-06-16-56-12.png 1000w, https://blog.airy.co/content/images/size/w1600/2023/03/akhqScreenshot-from-2023-03-06-16-56-12.png 1600w, https://blog.airy.co/content/images/2023/03/akhqScreenshot-from-2023-03-06-16-56-12.png 1695w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><p>Apart from these great UIs, <a href="https://github.com/edenhill/kcat">kcat</a> is a great CLI for interacting with Kafka.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h3 id="the-positive-leranings">The Positive Learnings</h3>
<p>Even though it was a bumpy ride and the learning curve is a bit steep, using Kafka as a single source of truth proved to be a great choice for our platform. We achieved a unified development process with great flexibility, speed and agility. All the data is in one place, so it is easy to govern, segment and use.</p>
<p>When the data is in Kafka, you cannot simply go in and change or delete a record as you would in a database. Kafka is a streaming platform, and this forces you to solve data issues programmatically, without shortcuts such as editing the database to fix a data problem. For example, when a NullPointerException occurs, a new version of the microservice needs to be deployed that fixes the issue.</p>
<p>Another positive learning is that Kafka Streams solves many distributed-processing problems for you. Scaling consumers and producers is very convenient, as Kafka distributes the workload automatically. When a consumer or producer dies, all its state is stored in Kafka, so it can simply restart and continue where it left off. Interactive Queries are also a great way to expose structured data in Kafka via HTTP.</p>
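<p>Because committed offsets live in Kafka itself, you can inspect exactly where a consumer group will resume after a restart. A sketch using the stock Kafka tooling (the group name is a hypothetical example):</p>

```shell
# Show, per partition, the committed offset, the log-end offset and the lag;
# a restarted consumer picks up from CURRENT-OFFSET automatically
kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group airy-message-processor
```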
<p>Kubernetes has proven very convenient for running Kafka under low and moderate load, although for higher loads we would recommend running a dedicated instance for every Kafka broker.</p>
<p>So making the stream <strong>the single source of truth</strong> is something that we can recommend. Most importantly all data can be accessed and processed in the same way and in real time, which is very important for building a robust event-driven platform.</p>
<!--kg-card-end: markdown--><hr><!--kg-card-begin: markdown--><h2 id="data-retention">Data Retention</h2>
<p>When using Kafka as a central database, the data retention question becomes important, as we usually don&apos;t want to delete events from our <code>single source of truth</code>. But keeping all the data can lead to high disk usage and large volumes, so we need to decide which data is important to keep and which is not.</p>
<p>Fortunately Kafka has a great feature for this called <code>log compaction</code>. Instead of deleting data outright, we can decide to keep at least the latest <code>value</code> for each unique <code>key</code> that gets written to a topic.</p>
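<p>The semantics are easy to picture: compaction retains the most recent value per key, like a continuously updated table. A toy illustration in plain shell (this only models the idea, it is not how Kafka stores data):</p>

```shell
#!/bin/sh
# A keyed stream: later values for the same key supersede earlier ones.
# After "compaction" only the latest value per key remains.
printf '%s\n' \
  'message-1=Hello' \
  'message-2=Hi there' \
  'message-1=Hello (edited)' |
awk -F'=' '{
  if (!($1 in latest)) order[++n] = $1   # remember first-seen key order
  latest[$1] = $2                        # later values overwrite earlier ones
} END {
  for (i = 1; i <= n; i++) print order[i] "=" latest[order[i]]
}'
# prints:
#   message-1=Hello (edited)
#   message-2=Hi there
```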
<p>Here is a snippet of our scripts for creating the topics:</p>
<pre><code>kafka-topics.sh --create --if-not-exists &quot;${CONNECTION_OPTS[@]}&quot; --replication-factor &quot;${REPLICAS}&quot; --partitions &quot;${PARTITIONS}&quot; --topic &quot;${AIRY_CORE_NAMESPACE}application.communication.contacts&quot; --config cleanup.policy=compact  min.compaction.lag.ms=86400000 segment.bytes=10485760

kafka-topics.sh --create --if-not-exists &quot;${CONNECTION_OPTS[@]}&quot; --replication-factor &quot;${REPLICAS}&quot; --partitions &quot;${PARTITIONS}&quot; --topic &quot;${AIRY_CORE_NAMESPACE}application.communication.messages&quot; --config cleanup.policy=compact min.compaction.lag.ms=86400000 segment.bytes=10485760

kafka-topics.sh --create --if-not-exists &quot;${CONNECTION_OPTS[@]}&quot; --replication-factor &quot;${REPLICAS}&quot; --partitions &quot;${PARTITIONS}&quot; --topic &quot;${AIRY_CORE_NAMESPACE}application.communication.metadata&quot; --config cleanup.policy=compact min.compaction.lag.ms=86400000 segment.bytes=10485760
</code></pre>
<p>When it comes to compaction, it is important that all the conditions are met for compaction to kick in. The data in each topic-partition is stored in segments, which are individual files written to disk. The last, active segment is the one Kafka writes to in real time, so compaction only processes the messages in the previous, closed segments. With that in mind, it is important to configure the segment size according to the volume of messages the topic receives: if there is only one large segment, none of the messages will be compacted and all of them will be kept on disk.</p>
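<p>To get a feel for the numbers: with the <code>segment.bytes=10485760</code> (10 MiB) used above, a back-of-the-envelope calculation shows roughly how many records must arrive before a segment rolls and its records become eligible for compaction. The average record size here is an assumption for illustration:</p>

```shell
#!/bin/sh
SEGMENT_BYTES=10485760   # segment.bytes from the topic configuration above
AVG_RECORD_BYTES=1024    # assumed average record size; adjust for your data

# Roughly this many records fill one segment; only once the segment is
# closed (and min.compaction.lag.ms has passed) can its records be compacted.
echo $(( SEGMENT_BYTES / AVG_RECORD_BYTES ))   # prints 10240
```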
<p>It is important to note that compacting messages removes historical data, so we need to be sure we can live with that. For example, we have messages keyed by a <code>messageId</code>. If we compact that topic, we are guaranteed to keep the content of the latest version of each message, but the historical information about its previous versions will be lost.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h2 id="references">References</h2>
<p>[1] Airy Core Open source repository - <a href="https://github.com/airyhq/airy/">https://github.com/airyhq/airy/</a><br>
[2] Airy design principles - <a href="https://airy.co/docs/core/concepts/design-principles">https://airy.co/docs/core/concepts/design-principles</a><br>
[3] Apache Kafka - <a href="https://kafka.apache.org/">https://kafka.apache.org/</a><br>
[4] Kafka Streams - <a href="https://kafka.apache.org/documentation/streams/">https://kafka.apache.org/documentation/streams/</a><br>
[5] Confluent Cloud - <a href="https://www.confluent.io/confluent-cloud/">https://www.confluent.io/confluent-cloud/</a><br>
[6] AKHQ - <a href="https://akhq.io/">https://akhq.io/</a><br>
[7] Backup of EBS volumes on AWS - <a href="https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSSnapshots.html">https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EBSSnapshots.html</a></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Integrating all Communication and Customer Data with Airy and Confluent]]></title><description><![CDATA[Learn how to build world-class customer experiences by integrating your Communication and Customer Data with open-source Airy and Confluent.]]></description><link>https://blog.airy.co/streaming-data-airy-confluent/</link><guid isPermaLink="false">62d5c82fd1b810052e0ff7be</guid><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Christoph Pröschel]]></dc:creator><pubDate>Thu, 21 Jul 2022 17:03:37 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2022/07/blue-abstract-wave-abstract-vector-background-wave_206325-406.jpg" medium="image"/><content:encoded><![CDATA[<h3 id="a-quick-introduction-to-airy-and-confluent"><br>A quick introduction to Airy and Confluent</h3><img src="https://blog.airy.co/content/images/2022/07/blue-abstract-wave-abstract-vector-background-wave_206325-406.jpg" alt="Integrating all Communication and Customer Data with Airy and Confluent"><p>Airy is an open-source data streaming platform offering an integrated solution to automate &amp; personalize customer communication. 
Developers can build scalable, world-class customer experiences by integrating data from different sources in real-time.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2022/07/Airy_Stream_Explainer.png" class="kg-image" alt="Integrating all Communication and Customer Data with Airy and Confluent" loading="lazy" width="1149" height="467" srcset="https://blog.airy.co/content/images/size/w600/2022/07/Airy_Stream_Explainer.png 600w, https://blog.airy.co/content/images/size/w1000/2022/07/Airy_Stream_Explainer.png 1000w, https://blog.airy.co/content/images/2022/07/Airy_Stream_Explainer.png 1149w" sizes="(min-width: 720px) 720px"></figure><p><br>Confluent is a full-scale data streaming platform that enables developers to easily access, store and manage data as continuous, real-time streams. Built by the original creators of Apache Kafka, Confluent expands the benefits of Kafka with enterprise-grade features while removing the burden of Kafka management and monitoring.</p><h3 id="airy-and-apache-kafka">Airy and Apache Kafka</h3><p>Airy&apos;s open-source data streaming platform is powered by Apache Kafka, the open-source event streaming standard. <br><br>By default, you get a small Kafka deployment when <a href="https://airy.co/docs/core/getting-started/installation/introduction">installing Airy</a> on your Kubernetes cluster. While this is a great way to get started and play around, we recommend our enterprise offerings and a hosted Kafka service such as <a href="https://www.confluent.io/confluent-cloud/">Confluent Cloud</a><strong> </strong>for production workloads.</p><p>Confluent is the primary maintainer of Kafka and, as such, provides a very mature cloud offering. 
In this guide, I will show you how to set up your Confluent Kafka cluster and connect it to Airy.</p><p><strong>Prerequisites: </strong></p><ul><li>A Confluent Cloud account - <a href="https://confluent.cloud/signup">Signup here</a></li></ul><p><strong>Follow the next steps to set up Airy with Confluent:</strong></p><ol><li><a href="#step-1-create-a-kafka-cluster">Create a Kafka cluster on Confluent </a></li><li><a href="#step-2-obtain-connection-configuration">Obtain connection configuration</a></li><li><a href="#step-3-configure-airy-to-use-confluent">Configure Airy to use Confluent</a></li><li><a href="#step-4-check-that-everything-works">Check that everything works</a> &#x1F389;</li></ol><h3 id="step-1-create-a-kafka-cluster">Step 1: Create a Kafka cluster</h3><p>Assuming that you already have a Confluent account (<a href="https://confluent.cloud/signup">signup here</a>), we move on to creating the Kafka cluster by following <a href="https://docs.confluent.io/cloud/current/clusters/create-cluster.html">these instructions</a>. 
At the time of writing, Confluent provides a free trial that will give you plenty of time and room to explore this setup.</p><p>Once you are done, your environments page will show the default view with one cluster running:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2022/07/image.png" class="kg-image" alt="Integrating all Communication and Customer Data with Airy and Confluent" loading="lazy" width="994" height="782" srcset="https://blog.airy.co/content/images/size/w600/2022/07/image.png 600w, https://blog.airy.co/content/images/2022/07/image.png 994w" sizes="(min-width: 720px) 720px"><figcaption>Upon successfully creating a cluster it should show up in your Confluent dashboard like so</figcaption></figure><h3 id="step-2-obtain-connection-configuration">Step 2: Obtain connection configuration</h3><p>To connect Airy to a Confluent Kafka cluster, we need to obtain two parameters: the address of the cluster and the authentication string. Airy supports Confluent&apos;s authentication method, which is SASL/PLAIN. 
You can read more about Confluent&apos;s authentication process <a href="https://docs.confluent.io/platform/current/kafka/authentication_sasl/index.html#jaas-configurations">here</a>.</p><p>To obtain these values, click on your cluster in the Confluent dashboard and then select the Java connection option:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2022/07/image-1.png" class="kg-image" alt="Integrating all Communication and Customer Data with Airy and Confluent" loading="lazy" width="2000" height="672" srcset="https://blog.airy.co/content/images/size/w600/2022/07/image-1.png 600w, https://blog.airy.co/content/images/size/w1000/2022/07/image-1.png 1000w, https://blog.airy.co/content/images/size/w1600/2022/07/image-1.png 1600w, https://blog.airy.co/content/images/size/w2400/2022/07/image-1.png 2400w" sizes="(min-width: 720px) 720px"><figcaption>When inspecting your empty cluster you will be prompted with the option to connect clients. Select Java to proceed.</figcaption></figure><p>This will present you with the screen below. 
Click &quot;Create Kafka cluster API key&quot; to generate the parameters we need.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2022/07/image-2.png" class="kg-image" alt="Integrating all Communication and Customer Data with Airy and Confluent" loading="lazy" width="2000" height="1233" srcset="https://blog.airy.co/content/images/size/w600/2022/07/image-2.png 600w, https://blog.airy.co/content/images/size/w1000/2022/07/image-2.png 1000w, https://blog.airy.co/content/images/size/w1600/2022/07/image-2.png 1600w, https://blog.airy.co/content/images/2022/07/image-2.png 2096w" sizes="(min-width: 720px) 720px"><figcaption>On this view we can go ahead and create our API keys and then copy the address of the cluster (&quot;bootstrap.servers&quot;) and the sasl config (&quot;sasl.jaas.config&quot;)</figcaption></figure><p>Copy and save the address from <em>bootstrap.servers</em> and the SASL configuration line from <em>sasl.jaas.config</em> (starting at &quot;org.apache&quot; up to and including the semicolon), which are lines 2 and 4 of the snippet.</p><hr><h3 id="step-3-configure-airy-to-use-confluent">Step 3: Configure Airy to use Confluent</h3><p>Now that we have these values, we can connect your Confluent Kafka cluster to Airy. If you are installing a new Airy instance, follow path A, and if you already have a running Airy instance, follow path B.</p><h4 id="a-confluent-on-a-new-airy-instance-via-helm">A. Confluent on a new Airy Instance via Helm</h4><p>Follow this <a href="https://airy.co/docs/core/getting-started/installation/helm">guide</a> to set up your Helm installation. By default, Airy ships with its own Kafka cluster. To use your Confluent Kafka cluster from the get-go, you need to include this configuration in your <code>airy.yaml</code> file <strong>before running the helm install command</strong>.</p><pre><code class="language-yaml">config:
  kafka:
    brokers: &quot;bootstrap.server address from Step 2&quot;
    authJaas: &quot;sasl.jaas.config line from Step 2&quot;
    minimumReplicas: 3
    zookeepers: &quot;&quot;
    schemaRegistryUrl: http://schema-registry:8081</code></pre><p>Now you can run these commands to install your cluster:</p><pre><code class="language-bash">$ helm repo update
$ helm install airy airy/airy --timeout 15m --values ./airy.yaml
</code></pre><h4 id="b-confluent-on-an-existing-airy-instance-via-the-cli">B. Confluent on an existing Airy Instance via the CLI</h4><p><strong>Note:</strong> This guide does not cover migrating existing data to a different Kafka cluster. </p><p>To connect a Confluent Kafka cluster to a running Airy Instance, include this configuration in your <code>airy.yaml</code> file.</p><pre><code class="language-yaml">kafka:
  brokers: &quot;bootstrap.server address from Step 2&quot;
  authJaas: &quot;sasl.jaas.config line from Step 2&quot;
  minimumReplicas: 3
  schemaRegistryUrl: http://schema-registry:8081
</code></pre><p>Now run <code>airy config apply</code> in your workspace directory to propagate these configs to our streaming apps. The difference from the Helm approach is that this does not automatically create the required Kafka topics for us, so our apps will temporarily be in a crashing state until we fix this by re-running the initial topic provisioning job:</p><pre><code class="language-bash">$ kubectl get job &quot;provisioning-topics&quot; -o json | jq &apos;del(.spec.selector)&apos; | jq &apos;del(.spec.template.metadata.labels)&apos; | kubectl replace --force -f -
</code></pre><p>For more details about connecting to a remote Kafka cluster and connecting your own schema registry, you can check out our documentation <a href="https://docs.airy.co/guides/remote-kafka-cluster">here</a>. </p><h3 id="step-4-check-that-everything-works">Step 4: Check that everything works</h3><p>After completing either 3A or 3B, we have to wait for a few minutes for our streaming applications to start back up. Once this is done and all Kubernetes pods are stable, you should see Airy apps in the &quot;Data Integration &gt; Clients&quot; view of your Confluent dashboard:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2022/07/image-4.png" class="kg-image" alt="Integrating all Communication and Customer Data with Airy and Confluent" loading="lazy" width="1786" height="456" srcset="https://blog.airy.co/content/images/size/w600/2022/07/image-4.png 600w, https://blog.airy.co/content/images/size/w1000/2022/07/image-4.png 1000w, https://blog.airy.co/content/images/size/w1600/2022/07/image-4.png 1600w, https://blog.airy.co/content/images/2022/07/image-4.png 1786w" sizes="(min-width: 720px) 720px"><figcaption>The clients view of your Confluent cluster should now show Airy apps</figcaption></figure><p>Finally, you should open up your Airy <a href="https://airy.co/docs/core/ui/control-center/introduction">control center</a> or <a href="https://airy.co/docs/core/ui/inbox/introduction">inbox</a> to ensure everything is working as expected.</p><h3 id="where-to-go-from-here">Where to go from here</h3><p>If you are struggling to complete this guide or want to share your journey, please <a href="https://join.slack.com/t/airy-developers/shared_invite/zt-ijzwq90x-QT1D75BtaxK3JWehsB_FEg">join</a> our developer slack! 
We&apos;d love to meet you &#x1F973;</p><p>And if you&apos;re interested in implementing this solution in production and at scale, we&apos;d be happy to <a href="https://airy.co/get-a-demo">help</a>!</p><!--kg-card-begin: html--><button class="cta-btn" onclick="window.location.href=&apos;https://github.com/airyhq/airy&apos;;"> Support us by giving us a Star &#x2B50; on Github </button>

<style>
	.cta-btn {
		background: #4BB4FD;
        color: white;
		font-size: 20px;
		border-radius: 8px;
		padding: 12px 28px;
    }
</style>
<!--kg-card-end: html-->]]></content:encoded></item><item><title><![CDATA[Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform]]></title><description><![CDATA[Create an open-sourced, fully-featured, production-ready conversational platform in the cloud, using open-source tools running on Kubernetes.]]></description><link>https://blog.airy.co/enterprise-grade-conversational-ai-platform/</link><guid isPermaLink="false">619b9b4ad1b810052e0fee66</guid><dc:creator><![CDATA[Ljupco Vangelski]]></dc:creator><pubDate>Mon, 29 Nov 2021 23:45:44 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/11/pexels-pixabay-270360.jpeg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/11/pexels-pixabay-270360.jpeg" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"><p></p><p>With conversational use cases on the rise and Conversational AI becoming more and more relevant, &#xA0;the need for an open-source Conversational AI platform is clear.<br><br>But what is part of such a conversational platform? How do you set it up, host it and run it? How do you connect it to your Conversational AI?<br><br>This post will outline how to create a working conversational platform in the cloud, using open source tools running on Kubernetes.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/11/intro-block-1.png" class="kg-image" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy" width="1424" height="405" srcset="https://blog.airy.co/content/images/size/w600/2021/11/intro-block-1.png 600w, https://blog.airy.co/content/images/size/w1000/2021/11/intro-block-1.png 1000w, https://blog.airy.co/content/images/2021/11/intro-block-1.png 1424w" sizes="(min-width: 720px) 720px"></figure><!--kg-card-begin: markdown--><p>The building blocks of our conversational stack will be:</p>
<ul>
<li><strong>Kubernetes</strong> - The system which will hold all of the created resources.</li>
<li><strong>Apache Kafka</strong> - Message streaming &amp; queuing framework, as a base for Airy.</li>
<li><strong>Airy</strong> - Our open source conversational platform, responsible for transporting messages, streaming them from and to the different conversational channels.</li>
<li><strong>Rasa</strong> - Open source conversational AI, responsible for analyzing the message content and creating an automated response message.</li>
<li><strong>A conversational channel: Airy&apos;s open source Live Chat plugin</strong> - An example conversational channel which is connected to the Airy platform and a live chat plugin that can be embedded into any webpage to chat with website visitors.</li>
</ul>
<p>By the end of the tutorial you will have a Kubernetes cluster with two namespaces, one each for Airy and Rasa. You will also have a chat plugin as a source for Airy, where people can write you messages and then get customized automated replies. The chat plugin can be tested on its own, but can also be added directly to your website. In the <code>What is next?</code> section, you will be presented with other options to further expand your conversational platform.</p>
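<p>Keeping Airy and Rasa in separate namespaces makes it easy to deploy, monitor and upgrade them independently. If you want to prepare the namespaces up front, a sketch (the names are our choice for illustration; the installation steps later in this guide may also create or use different ones):</p>

```shell
kubectl create namespace airy
kubectl create namespace rasa
kubectl get namespaces
```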
<p>Let&apos;s first go over some questions on the architectural choices of your conversational platform:</p>
<p><strong>Why do we need a streaming platform?</strong><br>
Fetching messages and conversations from a conversational channel, or source as we call it, can be accomplished in various ways. However, if we want to be able to connect multiple sources, scale the platform in the future as the volume of messages increases, and process the messages in real-time and in a specific order - a streaming platform such as Kafka is the best choice.<br>
This way, we leverage the capabilities of a modern streaming platform to simplify the design and the message processing of our microservices. Instead of them talking to each other, they will communicate exclusively through Kafka.</p>
<p><strong>Why Kubernetes?</strong><br>
All the apps and components we use as building blocks of our conversational stack can be started without Kubernetes. However, using a modern orchestration system, such as Kubernetes, that takes care of scheduling, monitoring and load balancing the services, has huge advantages. Kubernetes eases the challenges of running in production, continuous upgrades, deployments in different availability zones for high-availability, and scaling. An additional important advantage is that it can easily be migrated between different cloud providers.</p>
<p><strong>Airy &amp; Rasa: Why two separate systems and how do they work together?</strong><br>
Each system excels at its primary function: Rasa as the open source conversational AI and Airy as the open source conversational platform. Airy can handle multiple different sources and heavy message loads, while Rasa can be optimized and trained to provide the best possible automated reply, regardless of where the message comes from.</p>
<p><strong>Let&apos;s get started!</strong></p>
<!--kg-card-end: markdown--><hr><!--kg-card-begin: markdown--><h1 id="1-create-a-kubernetes-cluster">1. Create a Kubernetes cluster</h1>
<p>To begin, you should choose your cloud provider. <code>Airy</code> and <code>Rasa</code> both run on Kubernetes, so in terms of functionality it doesn&apos;t make any difference which provider you choose. The only thing you need is to set up a Kubernetes cluster with your provider and gain access to it. We recommend choosing a fully managed Kubernetes cluster offered by one of the major cloud providers, but of course you are also welcome to manage it on your own or use an existing Kubernetes cluster.</p>
<p>Before you proceed, make sure that you have the <a href="https://helm.sh/docs/intro/install/">Helm</a> and <a href="https://kubernetes.io/docs/tasks/tools/">Kubectl</a> binaries installed on your local machine.</p>
<p>If you already have a deployed Kubernetes cluster, you can <a href="#deploy-airy">proceed to the next section</a>.</p>
<p>We have prepared brief instructions on how to create a new Kubernetes cluster in different cloud environments:</p>
<ul>
<li><a href="#create-kubernetes-gcp">Google Cloud Platform</a></li>
<li><a href="#create-kubernetes-azure">Microsoft Azure</a></li>
<li><a href="#create-kubernetes-do">DigitalOcean</a></li>
<li><a href="#create-kubernetes-aws">Amazon Web Services</a></li>
</ul>
<!--kg-card-end: markdown--><!--kg-card-begin: html--><a id="create-kubernetes-gcp"></a><!--kg-card-end: html--><!--kg-card-begin: markdown--><h2 id="11-create-a-kubernetes-cluster-in-gcp">1.1. Create a Kubernetes cluster in GCP</h2>
<p>To create a Kubernetes cluster in Google Cloud, you can use either the <a href="https://console.cloud.google.com">Google Cloud Dashboard</a> or the <a href="https://cloud.google.com/sdk/docs/install">gcloud</a> command line tool, which is part of the Google SDK.</p>
<p>If you prefer the Google Cloud Dashboard, click on <code>Kubernetes engine</code> -&gt; <code>Clusters</code> -&gt; <code>Create</code>.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/gcp-success-1.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>If you prefer the command line option, you should first install the <a href="https://cloud.google.com/sdk">Google Cloud SDK</a> and set up your Google Cloud account for a GCP project. Then run the following command to create a Kubernetes cluster:</p>
<pre><code class="language-sh">gcloud container clusters create airy-rasa-conversational-platform --num-nodes=2 --machine-type=e2-standard-4
</code></pre>
<p>After a few minutes, the details of the created Kubernetes cluster will be printed on the command line:</p>
<p><img src="https://blog.airy.co/content/images/2021/11/success-helm-2.jpg" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>The command will also update your <code>kubeconfig</code> file.</p>
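<p>If you created the cluster through the Google Cloud Dashboard instead, or are working from another machine, you can fetch the credentials explicitly (assuming the cluster name used above):</p>

```shell
# Write credentials for the cluster into your kubeconfig, then verify access
gcloud container clusters get-credentials airy-rasa-conversational-platform
kubectl get nodes
```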
<p>For more information, refer to the <a href="https://cloud.google.com/kubernetes-engine/docs/quickstart">official Google Guide</a>.</p>
<p>Then you can <a href="#deploy-airy">proceed to the next section</a>.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: html--><a id="create-kubernetes-azure"></a><!--kg-card-end: html--><!--kg-card-begin: markdown--><h2 id="12-create-a-kubernetes-cluster-in-azure">1.2. Create a Kubernetes cluster in Azure</h2>
<p>For creating a Kubernetes cluster on Microsoft Azure, you can use the <a href="https://portal.azure.com">Microsoft Azure Portal</a>, the <a href="https://docs.microsoft.com/en-us/powershell/azure/get-started-azureps">Azure PowerShell utility</a> or the <a href="https://docs.microsoft.com/en-us/cli/azure/install-azure-cli">Azure CLI</a>.</p>
<p>The simplest way to create the cluster is using the Microsoft Azure Portal. Navigate to the <code>Kubernetes services</code> dashboard and click on <code>Create</code> -&gt; <code>Create a Kubernetes cluster</code>.</p>
<p>On the following screen make sure that you:</p>
<ul>
<li>Select the default resource group or create a new one.</li>
<li>Fill in the name of the cluster (e.g. airy-rasa-conversational-platform).</li>
<li>Select the number of nodes.</li>
</ul>
<p><img src="https://blog.airy.co/content/images/2021/11/azure-success-1.jpg" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>After the cluster is created, you can use the <code>az</code> Azure CLI to set up access to the cluster:</p>
<pre><code class="language-sh">az login
az aks list
az aks get-credentials --resource-group DefaultResourceGroup-EUS --name airy-rasa-conversational-platform
</code></pre>
<p>The last command will update your <code>kubeconfig</code> file with the proper credentials.</p>
<p>For more information refer to the <a href="https://docs.microsoft.com/en-us/azure/aks/kubernetes-walkthrough">official Microsoft Guide</a>.</p>
<p>Then you can <a href="#deploy-airy">proceed to the next section</a>.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: html--><a id="create-kubernetes-do"></a><!--kg-card-end: html--><!--kg-card-begin: markdown--><h2 id="13-create-a-kubernetes-cluster-in-digitalocean">1.3. Create a Kubernetes cluster in DigitalOcean</h2>
<p>A Kubernetes cluster can be created directly on the <a href="https://cloud.digitalocean.com/kubernetes/clusters">DigitalOcean dashboard</a> by clicking <code>Create</code> -&gt; <code>Kubernetes</code>. You can leave all the options at their defaults, except for the <code>Node plan</code>, as the default nodes might be too small for running the <code>Airy Core</code> platform.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/do-success-1.jpg" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>After you create the cluster you need to go through a short guided cluster setup.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/do-success-2.jpg" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>After you complete the setup you can <code>Download Config File</code> to save the <code>kubeconfig</code> file to your machine (./kube.conf). With the <code>kubeconfig</code> file you can now access the Kubernetes cluster.</p>
<pre><code class="language-sh">kubectl --kubeconfig ./kube.conf get pods
</code></pre>
<p>For more information refer to the <a href="https://docs.digitalocean.com/products/kubernetes/quickstart/">official DigitalOcean Guide</a></p>
<p>Then you can <a href="#deploy-airy">proceed to the next section</a>.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: html--><a id="create-kubernetes-aws"></a><!--kg-card-end: html--><!--kg-card-begin: markdown--><h2 id="14-create-a-kubernetes-cluster-in-aws">1.4. Create a Kubernetes cluster in AWS</h2>
<p>A Kubernetes cluster can be created on AWS using the <a href="https://console.aws.amazon.com/">AWS Console</a> or the <a href="https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html">AWS CLI</a>.</p>
<p>To create a cluster, you first need an AWS IAM Role and a VPC. Export your profile and your region:</p>
<pre><code class="language-sh">export AWS_PROFILE=my-aws-profile
export AWS_REGION=my-aws-region
</code></pre>
<p>Create a new AWS IAM Role and attach the appropriate policies:</p>
<pre><code class="language-sh">export POLICY=&apos;{&quot;Version&quot;: &quot;2012-10-17&quot;,&quot;Statement&quot;: [{&quot;Effect&quot;: &quot;Allow&quot;,&quot;Principal&quot;: {&quot;Service&quot;: &quot;eks.amazonaws.com&quot;},&quot;Action&quot;: &quot;sts:AssumeRole&quot;}]}&apos;
</code></pre>
<pre><code class="language-sh">aws iam create-role --role-name airy-rasa --assume-role-policy-document &quot;$POLICY&quot;
</code></pre>
<pre><code class="language-sh">aws iam attach-role-policy --role-name airy-rasa --policy-arn \
  &quot;arn:aws:iam::aws:policy/AmazonEKSClusterPolicy&quot;
</code></pre>
<pre><code class="language-sh">aws iam attach-role-policy --role-name airy-rasa --policy-arn \
  &quot;arn:aws:iam::aws:policy/AmazonEKSWorkerNodePolicy&quot;
</code></pre>
<pre><code class="language-sh">aws iam attach-role-policy --role-name airy-rasa --policy-arn \
  &quot;arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryReadOnly&quot;
</code></pre>
<pre><code class="language-sh">aws iam attach-role-policy --role-name airy-rasa --policy-arn \
  &quot;arn:aws:iam::aws:policy/AmazonEKS_CNI_Policy&quot;
</code></pre>
<pre><code class="language-sh">ROLE_ARN=$(aws iam get-role --role-name airy-rasa --query &apos;Role.Arn&apos; --output text)
</code></pre>
<p>Get the default VPC and the public subnets:</p>
<pre><code class="language-sh">VPC_ID=$(aws ec2 describe-vpcs --filters Name=is-default,Values=true --query &apos;Vpcs[0].VpcId&apos; --output text)
</code></pre>
<pre><code class="language-sh">SUBNETS=$(aws ec2 describe-subnets --filters Name=vpc-id,Values=${VPC_ID} \
  --query &apos;Subnets[?MapPublicIpOnLaunch==`true`].SubnetId&apos; --output text | sed &apos;s/\t/,/g&apos;)
</code></pre>
<p>You can modify the list of subnets according to your needs, but you must have at least two subnets with the property <code>MapPublicIpOnLaunch</code> set to true.</p>
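<p>Since cluster creation fails without at least two subnets, a quick sanity check on the list we just built doesn't hurt. The subnet IDs below are placeholders; use the <code>SUBNETS</code> value from above:</p>

```shell
#!/bin/sh
SUBNETS="subnet-0aaa1111,subnet-0bbb2222"   # example value for illustration
COUNT=$(printf '%s\n' "$SUBNETS" | awk -F',' '{print NF}')
echo "$COUNT"   # prints 2 for this example
[ "$COUNT" -ge 2 ] || echo "Need at least two public subnets for EKS"
```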
<p>Then create the Kubernetes cluster with the following command:</p>
<pre><code class="language-sh">aws eks create-cluster --name airy-rasa-conversational-platform --role-arn ${ROLE_ARN} --resources-vpc-config subnetIds=${SUBNETS}
</code></pre>
<p>To update your <code>kubeconfig</code> file run:</p>
<pre><code class="language-sh">aws eks update-kubeconfig --name airy-rasa-conversational-platform --alias airy-rasa-conversational-platform
</code></pre>
<p>For more information refer to the <a href="https://docs.aws.amazon.com/eks/latest/userguide/getting-started-console.html">official AWS Guide</a>.</p>
<p>Then you can <a href="#deploy-airy">proceed to the next section</a>.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: html--><a id="deploy-airy"></a><!--kg-card-end: html--><!--kg-card-begin: markdown--><h1 id="2-deploy-airy">2. Deploy Airy</h1>
<p>Airy is an open-source, fully-featured, production-ready conversational platform. It ships with all the components you need for conversational use cases: connectors for different conversational channels and sources, APIs to access your data, and UIs ranging from dashboards to an inbox.</p>
<p>To deploy <code>Airy</code> to the created Kubernetes cluster, we will use the <a href="https://airy.co/docs/core/getting-started/installation/helm">Helm installation method</a>.</p>
<p>Deploy the latest version of Airy. You can also pin a specific version.</p>
<pre><code class="language-sh">VERSION=$(curl -L -s https://airy-core-binaries.s3.amazonaws.com/stable.txt)
</code></pre>
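<p>To pin a specific release instead, set <code>VERSION</code> by hand; a quick format check guards against typos. The version number shown here is only an example:</p>

```shell
VERSION="0.35.0"   # hypothetical example release; substitute the version you want
echo "$VERSION" | grep -Eq '^[0-9]+\.[0-9]+\.[0-9]+$' && echo "version format ok"
```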
<p>If you are not using the default kubeconfig file (<code>~/.kube/config</code>), make sure you export the correct one before proceeding (in this case <code>./kube.conf</code>):</p>
<pre><code class="language-sh">export KUBECONFIG=./kube.conf
</code></pre>
<p>Install the Airy helm chart:</p>
<pre><code class="language-sh">helm install airy https://helm.airy.co/charts/airy-${VERSION}.tgz --timeout 10m
</code></pre>
<p>Once the helm chart installs, you can confirm that all the pods are running correctly with the <code>kubectl get pods</code> command.<img src="https://blog.airy.co/content/images/2021/11/success-get-pods.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
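<p>To script that check, you can count the pods whose status is not <code>Running</code>. This sketch runs against captured sample output (the pod names are illustrative); in practice you would pipe <code>kubectl get pods --no-headers</code> into it:</p>

```shell
# Sample output in the shape of `kubectl get pods --no-headers`
PODS='airy-controller-7d9f8 1/1 Running 0 3m
api-communication-5b6c4 1/1 Running 0 3m'
# Column 3 is the pod status; count every row that is not "Running"
NOT_RUNNING=$(printf '%s\n' "$PODS" | awk '$3 != "Running"' | wc -l)
[ "$NOT_RUNNING" -eq 0 ] && echo "all pods are running"
```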
<p>By default, <code>Airy</code> only creates an HTTP listener. When running Airy in a cloud environment for anything beyond the testing done in this guide, we highly recommend setting up HTTPS for your instance. Refer to the <a href="#next-steps">next steps section</a> for more information.</p>
<p>Get the IP address of your load balancer:</p>
<pre><code class="language-sh">kubectl -n kube-system get service ingress-nginx-controller -o jsonpath=&apos;{.status.loadBalancer.ingress[0].*}{&quot;\n&quot;}&apos;
</code></pre>
<p><img src="https://blog.airy.co/content/images/2021/11/success-lb-1.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>After you have created your <code>Airy</code> instance, you can access the web UI through: <code>http://{the-loadbalancer-ip-address}</code>.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/success-ui.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p><a href="https://join.slack.com/t/airy-developers/shared_invite/zt-ijzwq90x-QT1D75BtaxK3JWehsB_FEg"><img src="https://blog.airy.co/content/images/2021/11/slack.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></a></p>
<h1 id="3-set-up-the-chat-plugin">3. Set up the Chat plugin</h1>
<p>Next, we will create and test one conversational source.</p>
<p>To create a Chat plugin source, navigate to <code>Channels</code> and click the <code>+</code> button. We will call the source <code>Letstalk</code>.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/chatplugin-1.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h1 id="4-deploy-rasa">4. Deploy Rasa</h1>
<p>Rasa is an open source machine learning framework for automated text and voice-based conversations.</p>
<p>We will install <code>Rasa open source</code> in a separate namespace, so that there is a clear distinction between the resources created by the two platforms.</p>
<p>To run <code>Rasa</code> alongside <code>Airy</code> we need a specific configuration and an <a href="https://github.com/airyhq/rasa-x-demo/blob/main/channels/airy.py">Airy python connector</a> which we will load in <code>Rasa</code> to receive the messages from <code>Airy</code> and send them back to the conversational platform.</p>
<p>Before we run the <code>Rasa</code> deployment, we will store its configuration in ConfigMaps so that we can change it on the fly later without building a custom <code>Rasa</code> image.</p>
<p>Create a working directory and the <code>rasa</code> namespace:</p>
<pre><code class="language-sh">mkdir rasa
cat &lt;&lt;EOF &gt; rasa/rasa.yaml
apiVersion: v1
kind: Namespace
metadata:
  name: rasa
EOF
kubectl apply -f rasa/rasa.yaml
</code></pre>
<p>Now we will check out the demo repo to load the <code>Airy module</code> and create the proper configuration:</p>
<pre><code class="language-sh">git clone https://github.com/airyhq/rasa-x-demo.git rasa/rasa
</code></pre>
<pre><code class="language-sh">kubectl -n rasa create configmap actions --from-file=rasa/rasa/actions/
</code></pre>
<pre><code class="language-sh">kubectl -n rasa create configmap channels --from-file=rasa/rasa/channels/
</code></pre>
<pre><code class="language-sh">kubectl -n rasa create configmap data --from-file=rasa/rasa/data/
</code></pre>
<pre><code class="language-sh">kubectl -n rasa create configmap config --from-file=rasa/rasa/domain.yml --from-file=rasa/rasa/config.yml
</code></pre>
<p>We need to tell <code>Rasa</code> where to send its replies, so we will create a ConfigMap containing the <code>Airy</code> endpoint for receiving messages (the replies going from Rasa back to the end customer).</p>
<pre><code class="language-sh">cat &lt;&lt;EOF &gt; rasa/credentials.yml
channels.airy.AiryInput:
  api_host: &quot;http://ingress-nginx-controller.kube-system&quot;
  system_token: &quot;demo&quot;
EOF
</code></pre>
<pre><code class="language-sh">kubectl -n rasa create configmap airy-config --from-file=rasa/credentials.yml
</code></pre>
<p>Next, we will append the StatefulSet and the Service to the <code>rasa/rasa.yaml</code> file.</p>
<pre><code class="language-sh">cat &lt;&lt;EOF &gt;&gt; rasa/rasa.yaml
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
  labels:
    app: rasa
  name: rasa
  namespace: rasa
spec:
  serviceName: rasa
  replicas: 1
  selector:
    matchLabels:
      app: rasa
  template:
    metadata:
      labels:
        app: rasa
    spec:
      initContainers:
      - name: rasa-training
        image: rasa/rasa:2.6.3-full
        args:
          - train
        securityContext:
          runAsUser: 0
        volumeMounts:
        - mountPath: /app/data
          name: data
        - mountPath: /app/domain.yml
          name: config
          subPath: domain.yml
        - mountPath: /app/config.yml
          name: config
          subPath: config.yml
        - mountPath: /app/models
          name: models
      containers:
      - image: rasa/rasa:2.6.3-full
        args:
          - run
          - --enable-api
          - -vv
        securityContext:
          runAsUser: 0
        imagePullPolicy: Always
        name: rasa
        volumeMounts:
        - mountPath: /app/actions
          name: actions
        - mountPath: /app/channels
          name: channels
        - mountPath: /app/data
          name: data
        - mountPath: /app/domain.yml
          name: config
          subPath: domain.yml
        - mountPath: /app/config.yml
          name: config
          subPath: config.yml
        - mountPath: /app/credentials.yml
          name: airy-config
          subPath: credentials.yml
        - mountPath: /app/models
          name: models
      volumes:
      - configMap:
          name: actions
        name: actions
      - configMap:
          name: channels
        name: channels
      - configMap:
          name: config
        name: config
      - configMap:
          name: data
        name: data
      - configMap:
          name: airy-config
        name: airy-config
  volumeClaimTemplates:
  - metadata:
      name: models
    spec:
      accessModes: [ &quot;ReadWriteOnce&quot; ]
      resources:
        requests:
          storage: 5Gi
---
apiVersion: v1
kind: Service
metadata:
  name: rasa
  namespace: rasa
spec:
  ports:
  - name: web
    port: 80
    protocol: TCP
    targetPort: 5005
  selector:
    app: rasa
  type: ClusterIP
EOF
</code></pre>
<p>Create those resources in your Kubernetes cluster:</p>
<pre><code class="language-sh">kubectl apply -f rasa/rasa.yaml
</code></pre>
<p>We are using a generic <code>rasa</code> Docker image; however, we have pre-defined data that we use for training. The training of the <code>Rasa</code> automatic-replies bot is done in an <code>initContainer</code>. We are training the bot to handle a simple scenario in which the user would:</p>
<ul>
<li>Send a greeting (Hi / Hello)</li>
<li>Inquire about the working hours (What times are you open?)</li>
<li>Send a goodbye (Bye / Goodbye)</li>
</ul>
<p>To configure multiple scenarios and rules, you can edit the <code>data</code> ConfigMap and the <code>domain.yml</code> file in the <code>config</code> ConfigMap.</p>
<pre><code class="language-sh">kubectl -n rasa edit configmap data
</code></pre>
<pre><code class="language-sh">kubectl -n rasa edit configmap config
</code></pre>
<p>After changing the configuration, the <code>rasa</code> pod needs to be restarted, which triggers a new training run.</p>
<pre><code class="language-sh">kubectl -n rasa delete pod -l app=rasa
</code></pre>
<p>We are using a <code>StatefulSet</code> workload for <code>Rasa</code> so that the trained models persist across restarts. To see the logs from the training container, run:</p>
<pre><code class="language-sh">kubectl -n rasa logs rasa-0 -c rasa-training
</code></pre>
<p>To follow the logs of the main <code>Rasa</code> process, run:</p>
<pre><code class="language-sh">kubectl -n rasa logs rasa-0 -f
</code></pre>
<p>For a better understanding of how <code>Rasa</code> works, refer to the <a href="https://rasa.com/docs/rasa/playground">official documentation</a>.</p>
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h1 id="5-integrate-airy-and-rasa">5. Integrate Airy and Rasa</h1>
<p>At this point, both <code>Airy</code> and <code>Rasa</code> should be installed and running. The <code>Rasa</code> platform knows what messages to expect through the <code>airy</code> channel and where to send its responses. Next, we need to tell the <code>Airy</code> platform where to relay messages as they come in.</p>
<p>We will use the <code>integration/webhook</code> component of <code>Airy</code> to send the messages to <code>Rasa</code>.</p>
<p>Let&apos;s create a ConfigMap to start the <code>integration/webhook</code> component:</p>
<pre><code class="language-sh">kubectl create configmap integration-webhook --from-literal=name=webhook --from-literal=maxBackoff=10
</code></pre>
<p>The <code>airy-controller</code> watches for ConfigMaps named <code>componentType-componentName</code> and starts the corresponding component. In this case, the webhook pods will start in the Kubernetes cluster. You can confirm that the pods are running with the following command:</p>
<pre><code class="language-sh">kubectl get pods -l &apos;app in (webhook-publisher,webhook-consumer)&apos;
</code></pre>
<p>Now we instruct the webhook component where to send the messages when they come in:</p>
<pre><code class="language-sh">curl -X POST -H &apos;Content-Type: application/json&apos; http://{the-loadbalancer-ip-address}/webhooks.subscribe --data &apos;{&quot;url&quot;:&quot;http://rasa.rasa/webhooks/airy/webhook&quot;}&apos;
</code></pre>
<p>Note that <code>Airy</code> and <code>Rasa</code> talk through the internal Kubernetes services, as they are running in the same Kubernetes cluster.</p>
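<p>The <code>rasa.rasa</code> host in the webhook URL is simply Kubernetes in-cluster service DNS, <code>&lt;service&gt;.&lt;namespace&gt;</code>, so the target can be assembled like this:</p>

```shell
SERVICE="rasa"
NAMESPACE="rasa"
# In-cluster DNS name of the Rasa Service created earlier
echo "http://${SERVICE}.${NAMESPACE}/webhooks/airy/webhook"
# -> http://rasa.rasa/webhooks/airy/webhook
```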
<!--kg-card-end: markdown--><!--kg-card-begin: markdown--><h1 id="6-test-everything">6. Test everything</h1>
<p>To test that everything is working properly, we will simulate writing messages to the chat plugin source as if they come from actual end customers.</p>
<p>The URL for the chat plugin client demo can be found at:<br>
<code>http://{the-loadbalancer-ip-address}/chatplugin/ui/example?channel_id={chatplugin_channel_id}</code></p>
<p>To get the <code>chatplugin_channel_id</code> of the channel you created, go to <code>Channels</code> -&gt; <code>Click on the channel</code> -&gt; <code>Edit</code> -&gt; <code>Install &amp; Customize</code> and copy the <code>w[n].channelId</code> property, as shown in the following screenshot.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/channel-id.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
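<p>Putting the two pieces together, the demo URL can be assembled like this (both values below are placeholders for your own instance):</p>

```shell
LB_IP="203.0.113.10"    # placeholder load-balancer address
CHANNEL_ID="1a2b3c4d"   # placeholder value copied from w[n].channelId
echo "http://${LB_IP}/chatplugin/ui/example?channel_id=${CHANNEL_ID}"
```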
<p>Then write to the channel and you should get automated replies.</p>
<p>For example:</p>
<pre><code>Hello
What times are you open?
Bye
</code></pre>
<p>In the following screencast we have opened two screens side by side: the first (on the left) shows the Airy Inbox, where messages from all channels arrive, and the second (on the right) shows the chat plugin client demo, where we can write to the chat plugin as an end customer.</p>
<p><img src="https://blog.airy.co/content/images/2021/11/sc-2.gif" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform" loading="lazy"></p>
<p>There we can see the automated replies generated by <code>Rasa</code>.</p>
<!--kg-card-end: markdown--><hr><!--kg-card-begin: html--><a id="next-steps"></a><!--kg-card-end: html--><!--kg-card-begin: markdown--><p><strong>What is next?</strong></p>
<p>Congratulations! You are a conversational developer now.</p>
<ul>
<li>
<p><strong>Join the community</strong> - Chat with other conversational engineers and stay up to date on the latest developments - join our <a href="https://join.slack.com/t/airy-developers/shared_invite/zt-ijzwq90x-QT1D75BtaxK3JWehsB_FEg">Slack community</a>.</p>
</li>
<li>
<p><strong>Secure your installation</strong> - The demo doesn&apos;t offer any encryption, so before you connect any actual sources you should secure your <code>Airy Core</code> installation - <a href="https://airy.co/docs/core/getting-started/installation/security">https://airy.co/docs/core/getting-started/installation/security</a></p>
</li>
<li>
<p><strong>Connect multiple channels</strong> - <code>Airy</code> supports a variety of sources such as Facebook, Google Business Messenger, WhatsApp and SMS. Refer to the Sources documentation page to see how to connect them - <a href="https://airy.co/docs/core/sources/introduction">https://airy.co/docs/core/sources/introduction</a></p>
</li>
<li>
<p><strong>Expand the capabilities of Rasa</strong> - <code>Rasa</code> is a very powerful tool and in <a href="https://blog.airy.co/how-to-level-up-customer-support-with-airy-rasa-x/">one of our previous blog posts</a> we explain how to get the most out of your conversational AI and customer support by having your AI suggest likely responses and then training it on the selection that customer support makes.</p>
</li>
</ul>
<!--kg-card-end: markdown--><hr><!--kg-card-begin: markdown--><p><strong>Official websites and documentation:</strong></p>
<!--kg-card-end: markdown--><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://airy.co/docs/core"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Welcome to Airy! | Airy Documentation</div><div class="kg-bookmark-description">Airy Core is an open-source, fully-featured, production-ready</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://airy.co/docs/core/img/favicon.ico" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"><span class="kg-bookmark-author">Airy DocumentationMenuMenu</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://airy.co/docs/core/img/getting-started/introduction-light.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"></div></a><figcaption>Airy Core</figcaption></figure><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://rasa.com/docs/rasa/"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Introduction to Rasa Open Source</div><div class="kg-bookmark-description">Learn more about open-source natural language processing library Rasa for conversation handling, intent classification and entity extraction in on premise chatbots.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://rasa.com/docs/rasa/5a888fd3027b8711621e241ee63e41b9.png?v=0.8.3" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"><span class="kg-bookmark-author">Rasa Open Source Documentation</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://d33wubrfki0l68.cloudfront.net/a8775ddc53a20c7d3a4fffa38aec1c1855afbacb/294e1/docs/rasa/img/logo-rasa-oss.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"></div></a><figcaption>Rasa Open Source</figcaption></figure><figure class="kg-card 
kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://kubernetes.io"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Production-Grade Container Orchestration</div><div class="kg-bookmark-description">Production-Grade Container Orchestration</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://kubernetes.io/favicons/apple-touch-icon-180x180.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"><span class="kg-bookmark-author">Kubernetes</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://kubernetes.io/images/kubernetes-horizontal-color.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"></div></a><figcaption>Kubernetes</figcaption></figure><figure class="kg-card kg-bookmark-card kg-card-hascaption"><a class="kg-bookmark-container" href="https://kafka.apache.org"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Apache Kafka</div><div class="kg-bookmark-description">Apache Kafka: A Distributed Streaming Platform.</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://kafka.apache.org/images/apache_feather.gif" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"><span class="kg-bookmark-author">Apache Kafka</span></div></div><div class="kg-bookmark-thumbnail"><img src="http://apache-kafka.org/images/apache-kafka.png" alt="Deploying open-source Airy and Rasa as an enterprise-grade Conversational AI platform"></div></a><figcaption>Kafka</figcaption></figure>]]></content:encoded></item><item><title><![CDATA[Tutorial: Live Chat Plugin Setup]]></title><description><![CDATA[This tutorial shows you, step-by-step, how to successfully set up an Airy Live Chat Plugin on your own instance and customize it to your liking.]]></description><link>https://blog.airy.co/tutorial-chat-plugin-setup/</link><guid 
isPermaLink="false">60bdefa86f273b7b1bb7f468</guid><category><![CDATA[how-to]]></category><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Liz Hutter]]></dc:creator><pubDate>Wed, 13 Oct 2021 13:36:20 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/10/john-schnobrich-FlPc9_VocJ4-unsplash.jpg" medium="image"/><content:encoded><![CDATA[<h2 id="introduction">Introduction</h2><img src="https://blog.airy.co/content/images/2021/10/john-schnobrich-FlPc9_VocJ4-unsplash.jpg" alt="Tutorial: Live Chat Plugin Setup"><p>Having a Live Chat plugin on your website has become essential. Connect with your website visitors, communicate with them in real time, or use a bot to automate FAQs.</p><p>Airy&#x2019;s Live Chat Plugin comes out of the box fully functioning, and thanks to its open-source nature and React Render Props, you can customize everything about it.</p><p>Out of the box Airy&#x2019;s Live Chat Plugin supports:</p><ul><li>Full customization of look, feel and features</li><li>All message types, including emojis</li><li>Rich Messaging with templates, cards and carousels</li></ul><p></p><h2 id="how-to-create-your-personal-airy-live-chat">How to Create your Personal Airy Live Chat</h2><ol><li>Set up your <a href="https://airy.co/docs/core/getting-started/installation/introduction">Airy instance</a></li><li>Log into your Airy Core UI via your Airy instance and you will be guided to your Inbox</li><li>Click the &#x2018;Channels&#x2019; button on the left sidebar menu</li><li>Click the &apos;+&apos; symbol to the right of the Airy Live Chat option</li></ol><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/LyyOVtHupSKI3ql0U0opEccFbkEtwrM5YGP5aDSzwxv_6drGr_u8Wq8l4rZCGNf2cNs6e78sWI8hXHzp8LD5ywHV2b0KPjeEsLFjpvcCPqELivshDCimdPTUHtqeRV-sTjzws5lt" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy"></figure><p>5. 
&#xA0; Click the &#x2018;Connect Airy Live Chat&#x2019; button</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/AJXc-7CsmRZNISKKglpeXi8M0N5-hRXJnpLDaRa7O_4WeYXfTc5f3C8LPZTd8qUNjNuIWeZh1DPvM7LnjD99aVFUsOR173mwawTKfeeTD-D6E5QVP5wL34ynHk0VVcSLcY7bYMK2" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy"></figure><p>6. &#xA0; You will be prompted to enter a Display Name and an optional Image URL - these will be used for internal purposes only, meaning that you and your team will be the only ones able to see these optional features</p><ul><li>The display name will be used as the <a href="https://airy.co/docs/core/getting-started/glossary/#conversation">conversation</a>&apos;s name and the image URL will be used as the icon within the <a href="https://airy.co/docs/core/ui/inbox">Inbox UI</a></li><li>If you do not select an image, one will be assigned to you</li></ul><p>7. &#xA0; Click the &#x2018;Save&#x2019; button to save the changes you have made</p><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/YYqYZl7Atga5916QKcGjahNMdt9FPu5auB9sWOu5cRAvwtLaeFmKuL-ezgkZr4rO1nJ_gkvRKoYMh3GgqA0izhkXm9oRsZ2b_L9kMryT7AqVjLGebnkUeZLUJ0LzyRMEA-yzsjlZ" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy"></figure><p></p><h2 id="customizing-your-airy-live-chat">Customizing your Airy Live Chat</h2><p>For most use cases, the <a href="https://airy.co/docs/core/sources/chatplugin/customization#basic-customization">basic customization</a> will be sufficient for you and your team because it supports the essential options to customize the experience. These customizations can all be made on a single screen.
However, if you need full control of every aspect of the plugin, we also offer the possibility of <a href="https://airy.co/docs/core/sources/chatplugin/customization#advanced-customization">advanced customization</a> with Render Props.<br></p><p>Airy&#x2019;s Chat Plugin allows you to customize based on the needs of you, your team and your contacts. These customizations include, but are not limited to:</p><ul><li>Text displayed on your chat plugin</li><li>Color scheme of your chat plugin</li><li>Icon displayed on the button to open your Airy Chat Plugin on your website</li><li>Welcome Message displayed when contacts open your chat plugin</li><li>Height and width of your chat plugin</li><li>Closing options for your Airy Chat Plugin (basic, medium, full)</li><li>Option to disable your chat plugin for mobile browsers</li></ul><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://blog.airy.co/content/images/2021/06/green.gif" width="388" height="708" loading="lazy" alt="Tutorial: Live Chat Plugin Setup"></div><div class="kg-gallery-image"><img src="https://blog.airy.co/content/images/2021/06/tedi.gif" width="388" height="708" loading="lazy" alt="Tutorial: Live Chat Plugin Setup"></div><div class="kg-gallery-image"><img src="https://blog.airy.co/content/images/2021/06/pink-1.gif" width="388" height="708" loading="lazy" alt="Tutorial: Live Chat Plugin Setup"></div></div></div></figure><p>After you successfully connect Airy Live Chat, you can customize it by clicking on your connected Airy Live Chat.</p><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/3st6SfGPuqH4lnFE0zVzW8m0878FsuUNf_T7EJYfznefXHNyc6ui68kbJGnTbHDUCNHaOr23obO_hRZwpRCr-SvJe-ftJE5sjilDNY5gYO57vnvbxPeIYQ9nxp7IjRjI6acd50mB" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy"></figure><p>Once you click the &#x2018;Edit&#x2019; link to the right of 
your connected Airy Live Chat channel, you can navigate to click the &#x2018;Install &amp; Customize&#x2019; option.</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/iKdB_RkZ03x5u5GpOemTl1csc5JrqWHsJXkYnJnVR22yrFfaSqkL9hBMsma5ADb2wum0EWsKX7yi3AAd3LniYOIGeGIoeb1bITDWSUxTnWwQp713HJzyo6eHiUqSbVnos_elOpwo" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy"></figure><p>Here, you are able to customize your Live Chat plugin. You are able to customize several features, such as:</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/10/Screenshot-2021-10-13-at-15.23.05.png" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy" width="632" height="865" srcset="https://blog.airy.co/content/images/size/w600/2021/10/Screenshot-2021-10-13-at-15.23.05.png 600w, https://blog.airy.co/content/images/2021/10/Screenshot-2021-10-13-at-15.23.05.png 632w"></figure><p>You can see an example of how the code will look once you customize these features in the image below.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/10/Screenshot-2021-10-11-at-15.57.23.png" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy" width="2000" height="1630" srcset="https://blog.airy.co/content/images/size/w600/2021/10/Screenshot-2021-10-11-at-15.57.23.png 600w, https://blog.airy.co/content/images/size/w1000/2021/10/Screenshot-2021-10-11-at-15.57.23.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/10/Screenshot-2021-10-11-at-15.57.23.png 1600w, https://blog.airy.co/content/images/2021/10/Screenshot-2021-10-11-at-15.57.23.png 2199w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/10/Screenshot-2021-10-11-at-15.57.18.png" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy" width="2000" height="1250" 
srcset="https://blog.airy.co/content/images/size/w600/2021/10/Screenshot-2021-10-11-at-15.57.18.png 600w, https://blog.airy.co/content/images/size/w1000/2021/10/Screenshot-2021-10-11-at-15.57.18.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/10/Screenshot-2021-10-11-at-15.57.18.png 1600w, https://blog.airy.co/content/images/size/w2400/2021/10/Screenshot-2021-10-11-at-15.57.18.png 2400w" sizes="(min-width: 720px) 720px"></figure><h2 id="optional-welcome-template">Optional: Welcome Template</h2><p>If you would prefer to begin all conversations with the option of a template with buttons, you must include this code within the &lt;script&gt;&lt;/script&gt; portion of the code snippet.</p><p>An example of the text including a hardcoded Start/Welcome template will look something like this:</p><pre><code>&lt;script&gt;(function(w, d, s, n) {
       w[n] = w[n] || {};
       w[n].channelId = &quot;your instance&#x2019;s Airy Live Chat unique channelID&quot;;
       w[n].host = &quot;https://OURCOMPANY.airy.co&quot;;
       w[n].config = {
         headerTextColor: &quot;#474545&quot;,
         primaryColor: &quot;#91B668&quot;,
         accentColor: &quot;#91B668&quot;,
         backgroundColor: &quot;#E9E8E8&quot;, 
         showMode: false,
         headerText: &quot;COMPANY Chat&quot;,
         bubbleIcon: &quot;https://www.OURCOMPANY.com/site/img/logo.png&quot;,
         welcomeMessage: {
           fallback: &quot;Welcome to OUR COMPANY! How can we help you today?&quot;,
           richCard: {
             standaloneCard: {
               cardContent: {
                 media: {
                   height: &quot;MEDIUM&quot;,
                   contentInfo: {
                     altText:
                       &quot;Welcome to OUR COMPANY! How can we help you today?&quot;,
                     fileUrl: &quot;https://airy-platform-media.s3.amazonaws.com/CHANNEL-ID&quot;,
                   },
                 },
                 title:
                   &quot;Welcome to OUR COMPANY! How can we help you today?&quot;,
                 description: &quot;I have a question about:&quot;,
                 suggestions: [
                   {
                     reply: {
                       text: &quot;Before Visiting Our Store&quot;,
                       postbackData: &quot;COMPANY-beforevisit&quot;,
                     },
                   },
                   {
                     reply: {
                       text: &quot;Live Shows &amp; Livestreams&quot;,
                       postbackData: &quot;COMPANY-shows-livestreams&quot;,
                     },
                   },
                   {
                     reply: {
                       text: &quot;Customer Service&quot;,
                       postbackData: &quot;COMPANY-customerservice&quot;,
                     },
                   },
                   {
                     reply: {
                       text: &quot;News, Offers &amp; Discounts&quot;,
                       postbackData: &quot;COMPANY-news&quot;,
                     },
                   },
                 ],
               },
             },
           },
         },
       };
       var f = d.getElementsByTagName(s)[0],
       j = d.createElement(s);
       j.async = true;
       j.src = w[n].host + &quot;/chatplugin/ui/s.js&quot;;
       f.parentNode.insertBefore(j, f);
     })(window, document, &quot;script&quot;, &quot;airy&quot;);&lt;/script&gt;</code></pre><p>If you choose to begin with this template, it can look something like this:</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/hi_-CNHQm1cI7Wln7ks_LgwwQwbB5q-Eiv76TQzwVBhts1ip9p-bOjwq6i0OLwG8N7FdWP1vQD3fM0nj-2nyqiSf4YjlFFZtw5KNS5l-1U18X7GJDWENj1d_17avztqoWLNn9tK4" class="kg-image" alt="Tutorial: Live Chat Plugin Setup" loading="lazy"></figure><h2 id="where-to-put-the-code">Where to Put the Code</h2><p>You will copy all code within &lt;script&gt;&lt;/script&gt; and paste it in the head of your website.</p>]]></content:encoded></item><item><title><![CDATA[Airy 0.28.0: Helm Charts reorg, easier Security Setup, and more]]></title><description><![CDATA[This release lays the foundation for future improvements to your Airy instance, as well as facilitating an easier way to secure your Airy Core installation. It also adds some flare to the user experience within the UI by enabling emoji reactions to messages from Facebook users.]]></description><link>https://blog.airy.co/airy-release-0-28-0-helm-charts-security-setup-https/</link><guid isPermaLink="false">6114ef2262c216045e0a95e0</guid><category><![CDATA[Product Updates]]></category><dc:creator><![CDATA[Liz Hutter]]></dc:creator><pubDate>Thu, 12 Aug 2021 10:35:00 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/08/photo-1618060932014-4deda4932554.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/08/photo-1618060932014-4deda4932554.jpg" alt="Airy 0.28.0: Helm Charts reorg, easier Security Setup, and more"><p>This release lays the foundation for future improvements to your Airy instance, as well as facilitating an easier way to secure your Airy Core installation. 
It also adds some flair to the user experience within the UI by enabling emoji reactions to messages from Facebook users.</p><h3 id="reorganizing-helm-charts">Reorganizing Helm Charts</h3><p>So far during the setup of Airy Core, we had our own helm container image, where we also bundled the helm charts for the installation. In the latest release, we changed this approach: we now use a generic alpine/helm image and publish the individual helm charts in an S3 bucket.</p><p>This approach will help us with several infrastructure features, which we will continue to work on for the next few releases:</p><ul><li>Creating a generic helm installation tutorial for people who have their own Kafka cluster, have their own Kubernetes cluster and/or would like to use their own Kubernetes ingress controller</li><li>Making it possible for people to create multiple Airy Core installations in a single Kubernetes cluster</li><li>Creating the <code>airy upgrade</code> command, which will take care of upgrading an existing installation of Airy Core</li></ul><h3 id="https-inside-kubernetes">HTTPS inside Kubernetes</h3><p>When deploying Airy Core on AWS or in the cloud in general, using HTTPS is a must. Previously, we provided a way in our docs to add your own certificate to the AWS Certificates Manager and patch the service and ingress controller in your Airy Core in order for the LoadBalancer to use the certificate.</p><p>In the latest release, however, we have included an optional ingress controller which has Let&apos;s Encrypt functionality bundled. 
With this option, valid HTTPS certificates are created and renewed automatically.</p><p>There are two prerequisites for enabling Let&apos;s Encrypt in your Airy Core instance:</p><ul><li>The fully qualified domain name (FQDN) or the public hostname on which Airy Core will be accessible</li><li>An e-mail address, required by Let&apos;s Encrypt in order to generate the certificate</li></ul><p>When a new Airy Core instance is created, you can add these parameters to the <code>airy.yaml</code> file and then follow the Airy Core <a href="https://airy.co/docs/core/getting-started/installation/aws#https-using-lets-encrypt">documentation</a> on how to complete the installation and the setup.</p><h3 id="adding-facebook-emoji-reaction-metadata">Adding Facebook emoji reaction metadata</h3><p>This release introduced an exciting feature for Airy users who have Facebook Business Pages connected to their instance: adding Facebook emoji reaction metadata. With this improvement, you are informed when Facebook users react to messages with emojis, for example a heart or a thumbs up. This feature can be used to optimize your connection with customers and get real-time feedback.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/08/Screenshot-2021-08-11-at-16.32.02.png" class="kg-image" alt="Airy 0.28.0: Helm Charts reorg, easier Security Setup, and more" loading="lazy" width="928" height="228" srcset="https://blog.airy.co/content/images/size/w600/2021/08/Screenshot-2021-08-11-at-16.32.02.png 600w, https://blog.airy.co/content/images/2021/08/Screenshot-2021-08-11-at-16.32.02.png 928w" sizes="(min-width: 720px) 720px"></figure><h3 id="enjoy-and-stay-tuned-for-the-highlights-from-release-0290"><br>Enjoy and stay tuned for the highlights from Release 0.29.0!</h3><p></p>]]></content:encoded></item><item><title><![CDATA[How to build your conversational dashboard]]></title><description><![CDATA[<p>Your conversational experiences are up and running. 
The livechat on your website is buzzing with chats, your chatbot answers requests in your app and your customer service agents have switched from answering the phone to answering chats.<br><br>The data is flowing. But how do you visualize it?<br><br><a href="https://blog.airy.co/a-guide-to-conversational-metrics/">First you need to choose</a></p>]]></description><link>https://blog.airy.co/how-to-build-your-conversational-dashboard/</link><guid isPermaLink="false">60fa7d6562c216045e0a957b</guid><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Skander Garroum]]></dc:creator><pubDate>Fri, 23 Jul 2021 08:38:24 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/07/metrics_GA-1.jpeg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/07/metrics_GA-1.jpeg" alt="How to build your conversational dashboard"><p>Your conversational experiences are up and running. The livechat on your website is buzzing with chats, your chatbot answers requests in your app and your customer service agents have switched from answering the phone to answering chats.<br><br>The data is flowing. 
But how do you visualize it?<br><br><a href="https://blog.airy.co/a-guide-to-conversational-metrics/">First you need to choose your Standard and Value Metrics</a> and get your dashboard up and running. With Airy&#x2019;s Conversational Dashboard, for example, it will look something like this:</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/07/Screenshot-2021-07-23-at-10.29.05.png" class="kg-image" alt="How to build your conversational dashboard" loading="lazy" width="2000" height="1057" srcset="https://blog.airy.co/content/images/size/w600/2021/07/Screenshot-2021-07-23-at-10.29.05.png 600w, https://blog.airy.co/content/images/size/w1000/2021/07/Screenshot-2021-07-23-at-10.29.05.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/07/Screenshot-2021-07-23-at-10.29.05.png 1600w, https://blog.airy.co/content/images/2021/07/Screenshot-2021-07-23-at-10.29.05.png 2328w" sizes="(min-width: 720px) 720px"></figure><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/07/Screenshot-2021-07-23-at-10.29.24.png" class="kg-image" alt="How to build your conversational dashboard" loading="lazy" width="2000" height="609" srcset="https://blog.airy.co/content/images/size/w600/2021/07/Screenshot-2021-07-23-at-10.29.24.png 600w, https://blog.airy.co/content/images/size/w1000/2021/07/Screenshot-2021-07-23-at-10.29.24.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/07/Screenshot-2021-07-23-at-10.29.24.png 1600w, https://blog.airy.co/content/images/2021/07/Screenshot-2021-07-23-at-10.29.24.png 2284w" sizes="(min-width: 720px) 720px"></figure><p>As you can see, we chose our standard metrics: Number of Conversations, Active Conversations by Day and Source, Channel Count and the most active hours.<br><br>Like ours, your dashboard should be extendable, so you can add metrics on the fly and add value metrics once you have identified them.</p><h2 id="stack-technical-setup">Stack &amp; technical 
setup</h2><p>A dashboard is only as good as the data and the systems powering it. You will learn and adapt your dashboard over time, so it needs to be extendable and flexible.<br>For a powerful dashboard, you will need to set up a good data pipeline and storage system. <br><br>Which pipeline and storage you choose depends on which channels you have running, which metrics you want to track, and which data you want to store.<br><br>Having learned from storing and utilizing millions of conversations, our choice for conversational data is a data lake. <br><br>At Airy we built our data pipeline on AWS, storing conversations, messages and metadata in S3, using Parquet for compression and AWS Glue for schema management. </p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/07/Airy_Data_Lake_Graphic_Metadata--1-.png" class="kg-image" alt="How to build your conversational dashboard" loading="lazy" width="1454" height="600" srcset="https://blog.airy.co/content/images/size/w600/2021/07/Airy_Data_Lake_Graphic_Metadata--1-.png 600w, https://blog.airy.co/content/images/size/w1000/2021/07/Airy_Data_Lake_Graphic_Metadata--1-.png 1000w, https://blog.airy.co/content/images/2021/07/Airy_Data_Lake_Graphic_Metadata--1-.png 1454w" sizes="(min-width: 720px) 720px"></figure><p>For the conversational dashboard above we are using Metabase, which for us is the fastest and easiest way to visualize the analytics. <br><br>As another open-source solution, it also helps keep your data secure and your data control and ownership clear. </p><h2 id="benefits-of-using-airy%E2%80%99s-conversational-dashboard">Benefits of using Airy&#x2019;s Conversational Dashboard</h2><p><strong>- Easy to host</strong><br>Our enterprise edition is easy to host. 
You can run it in your own cloud, whether it is AWS, Google Cloud or Azure.<br><br><strong>- Extendable and flexible</strong><br>As we have seen, there are many possible metrics you want to track, and your tracking and dashboard will change over time. Airy&#x2019;s Conversational Dashboard is set up exactly for this challenge.<br><br><strong>- Easy to set up</strong><br>Airy&#x2019;s analytics are easy to set up. Either run through our guide or schedule a call with our solutions engineers to get you up and running.<br></p>]]></content:encoded></item><item><title><![CDATA[A guide to conversational metrics]]></title><description><![CDATA[<p>Conversational interfaces are everywhere. Your website visitors use your live chat to ask questions and inform themselves before they buy, your app users use the support chat to get help and your conversational commerce experiences guide potential customers to purchase the right product, one message at a time.<br><br>But how</p>]]></description><link>https://blog.airy.co/a-guide-to-conversational-metrics/</link><guid isPermaLink="false">60fa792262c216045e0a9544</guid><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Skander Garroum]]></dc:creator><pubDate>Fri, 23 Jul 2021 08:20:49 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/07/metricsstock1.jpeg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/07/metricsstock1.jpeg" alt="A guide to conversational metrics"><p>Conversational interfaces are everywhere. Your website visitors use your live chat to ask questions and inform themselves before they buy, your app users use the support chat to get help and your conversational commerce experiences guide potential customers to purchase the right product, one message at a time.<br><br>But how do you measure this new conversational world? 
<br>Every time a new channel is added to your channel mix and technical stack, the question of metrics comes up: what do you measure, how do you measure it and how do you visualize it? <br><br>Of course, you can start with simple channel analytics - which channels are connected and how many conversations and messages go through every day. But this is too high level for the many questions to ask and metrics to answer them: does in-app chat support perform better than WhatsApp support? Which is used more, by total number of conversations but also by average conversations per user? How do the channels differ in usage - for example, what are the most active hours per channel? <br><br>All these questions lead to real decisions: how many support agents do you need to hire or retrain for each channel, when do they need to be most active, how does the channel compare to other channels you have running - does it make sense to invest more in conversational channel A or conversational channel B?<br><br>And we haven&#x2019;t even started looking at operating metrics: NLP analytics like most common intents, phrase clustering or analysing conversation paths with flowcharts. Last but not least, don&#x2019;t forget about business metrics like Merchant Response Rates, response times or CSATs. They define how well your conversational experience is actually adopted on the business side.<br></p><h2 id="conversational-metrics-101">Conversational Metrics 101</h2><p><br>There are many conversational metrics to look at and a whole new world of analytics to track, store, and explore. And everybody with analytics experience knows: with great choice comes great responsibility. <br><br>The choice is an important one; thankfully, the answer is straightforward:<br>Conversational interfaces are just that: a new type of interface, a useful and universal one. Perhaps the future of universal interfaces. 
Perhaps soon all your interfaces will be conversational, at least partially, staffed by Conversational AI and human agents, talking to your users like friends and family.<br><br>For this vision to arrive, conversational interfaces need to prove their value first.<br>So for your conversational tracking you should focus on two kinds of metrics: <br>- Standard Conversational Metrics: to keep track of this new channel and improve it<br>- Value Metrics: to prove the ROI of this channel, focusing on the value it brings<br></p><h2 id="an-overview-of-conversational-metrics">An overview of conversational metrics</h2><p><br>But first, let&apos;s look at the standard conversational metrics. For an overview of Conversational Metrics and their definitions, check this handy list of possible metrics:<br><a href="https://airy.co/docs/enterprise/analytics/introduction">https://airy.co/docs/enterprise/analytics/introduction</a> <br></p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/07/Screenshot-2021-07-23-at-10.15.00.png" class="kg-image" alt="A guide to conversational metrics" loading="lazy" width="828" height="477" srcset="https://blog.airy.co/content/images/size/w600/2021/07/Screenshot-2021-07-23-at-10.15.00.png 600w, https://blog.airy.co/content/images/2021/07/Screenshot-2021-07-23-at-10.15.00.png 828w" sizes="(min-width: 720px) 720px"></figure><p><br>We divided them into 5 major categories:<br><br><strong>Setup &amp; Channel Analytics</strong><br>Everything you want to know about the conversational channels you have<br><br><strong>Conversational Analytics</strong><br>Metrics for Messages, Conversations and everything in between<br><br><strong>NLP Analytics</strong><br>Intents, Top In &amp; Out Messages and more<br><br><strong>Demographics</strong><br>Understand where your users come from and which languages they speak<br><br><strong>Business Analytics</strong><br>Business Metrics from MRR to CSAT</p><p>Out of these, you choose the standard metrics 
you want to track. In a second we will show you which ones we chose for our standard analytics setup.</p><h2 id="value-metrics"><br>Value Metrics</h2><p>Value metrics are exactly what their name suggests: what value comes from this channel? They go one step further than the metrics above and try to capture the ROI of the conversational experience.<br><br>The trick here is that conversational experiences are often multi-intent: Your website visitors use your live chat to ask pre-purchase questions about a product, and are helped by your sales agents.<br>A while later they come back after purchasing something to ask customer service questions, answered by an automated FAQ.<br><br>So is the livechat a sales channel or a customer service channel? Both in this case. But if it is both, then you need to track the value metrics for both: the revenue generated through the answering of sales-related questions, and the cost savings through the automated FAQs.<br><br>For value metrics, then, keep it simple: track how much revenue is generated if the conversational experience is sales-related. 
<br>If the experience is customer service related, track how much cost is saved, for example by &#x201C;fully automated conversations&#x201D; or &#x201C;questions deflected&#x201D;.<br></p><h2 id="next-steps-storing-and-visualizing-data">Next steps: Storing and visualizing data</h2><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/07/metrics_GA.jpeg" class="kg-image" alt="A guide to conversational metrics" loading="lazy" width="2000" height="1382" srcset="https://blog.airy.co/content/images/size/w600/2021/07/metrics_GA.jpeg 600w, https://blog.airy.co/content/images/size/w1000/2021/07/metrics_GA.jpeg 1000w, https://blog.airy.co/content/images/size/w1600/2021/07/metrics_GA.jpeg 1600w, https://blog.airy.co/content/images/size/w2400/2021/07/metrics_GA.jpeg 2400w" sizes="(min-width: 720px) 720px"></figure><p>Now that you know what you want to track and have chosen your standard metrics &amp; your value metrics, it is time to make working with them a reality. <br>First you need to make sure that you have a data pipeline up and running, for example a conversational Data Lake. Read our guide on conversational data lakes to learn more about how we solved this particular problem.<br><br>Once you have the data and it is readily available, you need an easy way to visualize it: a conversational dashboard. 
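The two value metrics above boil down to one small aggregation over conversation records. The sketch below is purely illustrative: the field names (kind, revenue, automated) and the assumed cost per human-handled conversation are invented for the example and are not part of any Airy schema.

```python
# Hypothetical conversation records; the field names are invented for
# this example and are not an actual Airy schema.
conversations = [
    {"kind": "sales", "revenue": 120.0, "automated": False},
    {"kind": "support", "revenue": 0.0, "automated": True},
    {"kind": "support", "revenue": 0.0, "automated": False},
    {"kind": "sales", "revenue": 80.0, "automated": True},
]

# Assumed average cost of one human-handled conversation.
COST_PER_HUMAN_CONVERSATION = 4.0

def value_metrics(records):
    """Revenue from sales-related conversations plus savings from automation."""
    revenue = sum(r["revenue"] for r in records if r["kind"] == "sales")
    automated = sum(1 for r in records if r["automated"])
    return {
        "revenue": revenue,
        "cost_savings": automated * COST_PER_HUMAN_CONVERSATION,
    }

print(value_metrics(conversations))
# → {'revenue': 200.0, 'cost_savings': 8.0}
```

The same aggregation maps directly to a query against a conversational data lake or dashboard tool.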
<br><br>Continue your path to conversational analytics mastery by learning how to build your conversational dashboard.<br></p>]]></content:encoded></item><item><title><![CDATA[How to level up your customer support and AI using Airy and Rasa X]]></title><description><![CDATA[How to get the most out of your conversational AI and customer support by having your AI suggest likely responses and then training it on the selection that customer support makes.]]></description><link>https://blog.airy.co/how-to-level-up-customer-support-with-airy-rasa-x/</link><guid isPermaLink="false">60c348e262c216045e0a90c8</guid><category><![CDATA[nlu]]></category><category><![CDATA[chatbots]]></category><category><![CDATA[how-to]]></category><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Christoph Pröschel]]></dc:creator><pubDate>Thu, 24 Jun 2021 15:39:47 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/06/laptopwithcode.jpeg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/06/laptopwithcode.jpeg" alt="How to level up your customer support and AI using Airy and Rasa X"><p>Today we will learn how your chatbot can help your customer support agents and how they can feed your bot with new training data; all of this while providing a best-of-class experience to your users.</p><h3 id="background">Background</h3><p>Natural language chatbots in 2021 still follow a simple model: When a user writes a message a machine learning model reduces the message to a small set of preconfigured user intentions (short: intents), which are used to predict the next action. 
This is great, because machines still struggle with the fuzziness of human language, but work really well with deterministic input.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/06/image-8.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="1444" height="788" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-8.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-8.png 1000w, https://blog.airy.co/content/images/2021/06/image-8.png 1444w" sizes="(min-width: 720px) 720px"></figure><p>But what to do in case of multiple intents? It&apos;s not uncommon that, well-aware of how much context matters, users open a conversation with a wall of text about how they got into their predicament and what they should do next. This stops any intent-based bot dead in its tracks. Therefore it&apos;s become a common pattern to forward requests like these to human support agents. But those agents usually have to spend lots of time catching up on conversations and writing responses. And when they do, it&apos;s hard to incorporate the solutions reached in these human-to-human conversations back into the response model.</p><p>So imagine that, instead of having to type an answer, the bot could present you with a list of best guesses that you only have to choose from. That way you save agents time and can re-train your bot on the response selection. 
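Conceptually, the suggestion step just ranks the classifier's guesses and maps the top candidates to reply templates. Here is a dependency-free sketch of that idea; the intent names, reply texts and confidence threshold are invented for illustration and are not the demo repository's actual code:

```python
# Canned reply templates per intent; names and texts are invented for
# this illustration and do not come from the demo repository.
REPLIES = {
    "ask_refund": "I can help with your refund. Could you share your order number?",
    "report_bug": "Sorry to hear that! Which device and app version are you on?",
    "ask_pricing": "You can find our current plans on our pricing page.",
}

def suggest_replies(intent_ranking, max_suggestions=3, min_confidence=0.2):
    """Return reply suggestions for the agent, best guesses first."""
    ranked = sorted(intent_ranking, key=lambda r: r["confidence"], reverse=True)
    suggestions = []
    for r in ranked:
        if r["confidence"] >= min_confidence and r["name"] in REPLIES:
            suggestions.append(REPLIES[r["name"]])
        if len(suggestions) == max_suggestions:
            break
    return suggestions

# A wall-of-text message typically produces an ambiguous ranking like this:
ranking = [
    {"name": "ask_refund", "confidence": 0.41},
    {"name": "report_bug", "confidence": 0.38},
    {"name": "greet", "confidence": 0.05},
]
print(suggest_replies(ranking))  # 'greet' is dropped: low confidence, no template
```

In the setup described in this post, logic of this kind lives in the custom action that is deployed in Step 4.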
In this post we will explore how to do this with Rasa X and the Airy conversational platform, in the following seven steps:</p><ol><li><a href="#step-1-install-airy">Install Airy</a></li><li><a href="#step-2-install-rasa-x">Install Rasa X </a></li><li><a href="#step-3-configure-rasa">Configure Rasa</a></li><li><a href="#step-4-deploy-the-action-server">Deploy the action server</a></li><li><a href="#step-5-deploy-the-airy-connector">Deploy the Airy connector</a></li><li><a href="#step-6-point-airy-at-rasa">Point Airy at Rasa</a></li><li><a href="#step-7-doing-a-test-run">Doing a test run</a></li></ol><hr><h3 id="step-1-install-airy">Step 1: Install Airy</h3><p>To get started we need a running Airy and Rasa X installation. The fastest way to get set up with Airy is to <a href="https://airy.co/docs/core/cli/introduction">install the CLI</a> and then run</p><pre><code class="language-bash">airy create --provider minikube my-app</code></pre><p>to install a local development environment of Airy. 
This requires &gt;8 GB of available RAM, so if you want you can also jump ahead and get set up for production immediately by following <a href="https://airy.co/docs/core/getting-started/installation/introduction#install-airy-core">our guides</a>.</p><p>The output of this command will look something like this:</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/06/image-3.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="1616" height="924" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-3.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-3.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/06/image-3.png 1600w, https://blog.airy.co/content/images/2021/06/image-3.png 1616w" sizes="(min-width: 720px) 720px"></figure><p>You can now open the inbox UI, where we will be responding to user messages, at <code>http://airy.core/ui/</code></p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/image-9.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="1918" height="1001" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-9.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-9.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/06/image-9.png 1600w, https://blog.airy.co/content/images/2021/06/image-9.png 1918w" sizes="(min-width: 720px) 720px"><figcaption>The Airy inbox allows you to manage conversations from all your messaging channels and sources.</figcaption></figure><h3 id="step-2-install-rasa-x">Step 2: Install Rasa X</h3><p>To get set up with Rasa X we will follow their <a href="https://rasa.com/docs/rasa-x/installation-and-setup/install/quick-install-script">Server Quick Install Guide</a>. 
You can use any cloud provider or already-running server you like, but for this example we created an Ubuntu 20.04 VM instance on Google Cloud. Be sure to check that your system fulfils the <a href="https://rasa.com/docs/rasa-x/installation-and-setup/install/quick-install-script#embedded-cluster-requirements">minimum requirements</a> before installing.</p><p>Next we execute the following command on our running VM:</p><pre><code class="language-bash">curl -s get-rasa-x.rasa.com | sudo bash
</code></pre><p>This will install an embedded Kubernetes cluster with Rasa X running as a Helm chart. Take note of the login information!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-22-at-17.49.19.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="1584" height="1184" srcset="https://blog.airy.co/content/images/size/w600/2021/06/Screenshot-2021-06-22-at-17.49.19.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/Screenshot-2021-06-22-at-17.49.19.png 1000w, https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-22-at-17.49.19.png 1584w" sizes="(min-width: 720px) 720px"><figcaption>Output from a successful Rasa X installation</figcaption></figure><h3 id="step-3-configure-rasa">Step 3: Configure Rasa</h3><p>If you want to follow along and tweak things you can check out the Rasa code here: <a href="https://github.com/airyhq/rasa-x-demo">https://github.com/airyhq/rasa-x-demo</a> </p><p>Whenever the bot is faced with a message it does not understand, we want Rasa to call a special action <code>suggest_replies</code> that passes a set of messages to Airy so that they can be displayed in the inbox.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/image-4.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="1574" height="856" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-4.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-4.png 1000w, https://blog.airy.co/content/images/2021/06/image-4.png 1574w" sizes="(min-width: 720px) 720px"><figcaption>Airy forwards messages via webhook to Rasa which predicts the <code>suggest_replies</code> action. 
This action then compiles and forwards the suggestions to Airy, where customer support can make use of them. Finally, the response selection ends up in Rasa X where we can use it to re-train our model.</figcaption></figure><p>The recommended way to configure Rasa X is to <a href="https://rasa.com/docs/rasa-x/installation-and-setup/deploy/#integrated-version-control">connect it to a git repository</a>. This way the following changes to your configuration files will be synced whenever you push to your version control server. You can create a new repository either by running <code>rasa init</code> or by forking the demo repository. </p><p>First we need to teach Rasa to handle messages that have ambiguous intents. By adding the <code><a href="https://rasa.com/docs/rasa/components#fallbackclassifier">FallbackClassifier</a></code> to the NLU pipeline, we can make Rasa predict an <code>nlu_fallback</code> intent whenever that is the case. We also add the <code>RulePolicy</code> to catch any messages with a low prediction confidence and a static rule that predicts <code>suggest_replies</code> whenever the intent fallback occurs.</p><figure class="kg-card kg-code-card"><pre><code class="language-yaml">pipeline:
  # ...
  - name: FallbackClassifier
    threshold: 0.7

policies:
  - name: RulePolicy
    core_fallback_threshold: 0.7
    core_fallback_action_name: &quot;action_suggest_replies&quot;
    enable_fallback_prediction: True
</code></pre><figcaption>config.yaml</figcaption></figure><figure class="kg-card kg-code-card"><pre><code class="language-yaml">- rule: Suggest replies to agent as a fallback
  steps:
  - intent: nlu_fallback
  - action: action_suggest_replies
  - action: action_listen
</code></pre><figcaption>rules.yaml</figcaption></figure><h3 id="step-4-deploy-the-action-server">Step 4: Deploy the action server</h3><p>At this point the bot will fail whenever the <code>action_suggest_replies</code> action is triggered, because we still haven&apos;t hosted that action with our custom action server. To do so, we can configure our running Rasa X instance to use the pre-built image with our action code by setting:</p><pre><code class="language-bash">export ACTION_SERVER_IMAGE=&quot;ghcr.io/airyhq/rasa-x-demo/rasa&quot;
export ACTION_SERVER_TAG=&quot;latest&quot;</code></pre><p>Now when we run:</p><pre><code class="language-bash">curl -s get-rasa-x.rasa.com | sudo -E bash</code></pre><p>the <code>-E</code> option will pass our variables to the Rasa X init script, which will configure the existing cluster for us.</p><h3 id="step-5-deploy-the-airy-connector">Step 5: Deploy the Airy connector</h3><p>In order for the <code>rasa-production</code> deployment to receive messages from Airy we need to deploy the same image tag we used for the action server to <code>rasa-production</code> and configure the credentials file.</p><p>Since the <code>get-rasa-x</code> script does not allow us to configure the image we manually edit the deployment by running:</p><pre><code class="language-bash">kubectl edit deployment rasa-production</code></pre><p>and replacing the image tag with <code>ghcr.io/airyhq/rasa-x-demo/rasa:latest</code> .</p><p>Next we need to ensure that Rasa can call your Airy API to send suggested replies. Since the default <code>airy.core</code> domain is only locally accessible we will exchange that domain for the default tunnel that ships with Airy. To do so run the following command on the machine that is running Airy:</p><pre><code>echo &quot;https://$(minikube -p airy-core kubectl -- get cm core-config -o jsonpath=&apos;{.data.CORE_ID}&apos;).tunnel.airy.co&quot;</code></pre><p>This should yield a URL like <code>https://kz0vto4fss.tunnel.airy.co</code> . Next we run</p><pre><code class="language-bash">kubectl edit ingress airy-core</code></pre><p>and replace the <code>host</code> part with an asterisk <code>*</code> to match traffic from the tunnel. Finally we want to tell Rasa about this URL by running the following command on the Rasa machine:</p><pre><code class="language-bash">kubectl edit cm rasa-rasa-configuration-files</code></pre><p>and then updating the config map so that it looks like so:</p><pre><code class="language-yaml">apiVersion: v1
data:
  rasa-credentials: |
    rasa:
      url: http://rasa-rasa-x-rasa-x.rasa.svc:5002/api

    channels.airy.AiryInput:
      airy_host: https://kz0vto4fss.tunnel.airy.co
# ...</code></pre><p>For this change to take effect we now need to delete all <code>rasa-x</code> and the <code>rasa-production</code> pods. This will cause them to restart and fetch the new configuration. Lastly we need to ensure that our Rasa pod is accessible from the outside so that Airy can call its webhook. We do so by exposing its service as a load balancer:</p><pre><code class="language-bash">kubectl expose service rasa-rasa-x-rasa-production rasa-production-load-balancer --type LoadBalancer --port 5005 --target-port 5005
</code></pre><p>Now your Airy Rasa webhook should be publicly available at <code>http://your.vm.ip.address:5005/webhooks/webhook/airy</code>. NOTE: Check your firewall rules at both the instance and the provider level to make sure that your VM can send and receive requests on port <code>5005</code>.</p><h3 id="step-6-point-airy-at-rasa">Step 6: Point Airy at Rasa</h3><p>Now that our Rasa instance is ready to send and receive events to and from Airy, we need to point our Airy webhook to it. To do so we call the <a href="https://airy.co/docs/core/api/webhook#subscribing">webhook subscribe</a> endpoint like so:</p><pre><code class="language-bash">curl --request POST \
  --url http://airy.core/webhooks.subscribe \
  --header &apos;Content-Type: application/json&apos; \
  --data &apos;{&quot;url&quot;: &quot;http://your.vm.ip.address:5005/webhooks/webhook/airy&quot;}&apos;</code></pre><p>Where the url part of the payload is the Rasa webhook url we exposed in the previous step. Once this is done your Airy instance should forward all its messages to Rasa. </p><h3 id="step-7-doing-a-test-run">Step 7: Doing a test run</h3><p>You can test this setup by sending an example message to the Airy website ChatPlugin. If you are new to Airy the fastest way to do so is by following our Chat Plugin <a href="https://airy.co/docs/core/sources/chatplugin/quickstart">quickstart guide</a>. Once you have sent a message, the bot should reply and the message should show up in Rasa X.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/image-5.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="2000" height="982" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-5.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-5.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/06/image-5.png 1600w, https://blog.airy.co/content/images/size/w2400/2021/06/image-5.png 2400w" sizes="(min-width: 720px) 720px"><figcaption>View of the Airy Inbox when the message comes in and the bot has replied.</figcaption></figure><p>Now in the screenshot above I have sent a greeting to which the Rasa bot replied. Next I&apos;ve sent an example message aimed at generating ambiguous intents. As you can see there is now a suggestions box right above the input bar. When I click that I&apos;m being presented with the suggested replies that the bot generated. 
</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/06/image-6.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="2000" height="981" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-6.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-6.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/06/image-6.png 1600w, https://blog.airy.co/content/images/size/w2400/2021/06/image-6.png 2400w" sizes="(min-width: 720px) 720px"></figure><p>If you now open your Rasa X installation, you should see the conversation appear in the conversations tab. Here you can copy and modify the user story and use it to train a new model.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/image-10.png" class="kg-image" alt="How to level up your customer support and AI using Airy and Rasa X" loading="lazy" width="1608" height="814" srcset="https://blog.airy.co/content/images/size/w600/2021/06/image-10.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/image-10.png 1000w, https://blog.airy.co/content/images/size/w1600/2021/06/image-10.png 1600w, https://blog.airy.co/content/images/2021/06/image-10.png 1608w" sizes="(min-width: 720px) 720px"><figcaption>Rasa X conversations tab</figcaption></figure><h3 id="benefits">Benefits</h3><p>To recap: you now have a tool that, whenever your bot needs to fall back to customer support, can in most cases provide agents with response suggestions. </p><h4 id="save-customer-support-time">Save customer support time</h4><p>Catching up on the context of a conversation and crafting a high-quality, grammatically correct response is very time-consuming. With this approach, your agents can simply moderate many of their conversations. 
</p><h4 id="get-more-labelled-training-data">Get more labelled training data</h4><p>Usually, when we hand over to customer support, the ensuing conversation is hard to decipher and map back to a desired bot model. By using response selections, you get an entirely new dataset of real conversations to train on.</p><h4 id="greatly-improve-user-experience">Greatly improve user experience</h4><p>By training the bot on the response selected for a given user message, you are in essence &quot;filling the gaps&quot; in your bot&apos;s coverage. This means faster and more accurate responses for future interactions.</p><p>This interaction pattern also implements a new paradigm in chatbot design: <strong>Conversation Driven Development (CDD)</strong>, in which you start with a very simple bot and rapidly iterate using real-world data to get a better experience both faster and cheaper.</p><h3 id="limitations">Limitations</h3><ul><li>Currently, replies are only suggested during the handover.</li><li>Depending on your NLU engine, intent ambiguity can be hard to train for; Rasa, for instance, tends to give a stark confidence contrast between predictions.</li><li>Fairly complex deployment.</li></ul><h3 id="where-to-go-from-here">Where to go from here</h3><p>If you are struggling to complete this guide or want to share your journey, you can <a href="https://join.slack.com/t/airy-developers/shared_invite/zt-ijzwq90x-QT1D75BtaxK3JWehsB_FEg">join</a> our Developer Slack! </p><p>And if you&apos;re interested in implementing this solution in production and at scale, we&apos;d be happy to <a href="https://airy.co/get-a-demo">help you</a>!</p><!--kg-card-begin: html--><button class="cta-btn" onclick="window.location.href=&apos;https://github.com/airyhq/airy&apos;;"> Support us by giving us a Star &#x2B50; on Github </button>
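<p>As a quick sanity check for the firewall note in Step 5, you can probe the Rasa webhook endpoint from a machine outside the VM before subscribing Airy to it. This is an illustrative sketch (replace the placeholder IP with your VM&apos;s address): any HTTP status code in the output means the port is reachable, while a timeout points to a firewall rule blocking port 5005.</p><pre><code class="language-bash"># Print only the HTTP status code returned by the Rasa webhook endpoint.
# Any response (even 4xx/5xx) shows the port is open from the outside;
# hitting the 10-second timeout suggests a firewall is blocking traffic.
curl -s -o /dev/null -w "%{http_code}\n" --max-time 10 \
  http://your.vm.ip.address:5005/webhooks/webhook/airy</code></pre>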

<style>
	.cta-btn {
		background: #4BB4FD;
        color: white;
		font-size: 20px;
		border-radius: 8px;
		padding: 12px 28px;
    }
</style><!--kg-card-end: html-->]]></content:encoded></item><item><title><![CDATA[Introducing Data Lakes for Conversational Data]]></title><description><![CDATA[Companies that leverage their conversational data today will quickly outpace competitors. Introducing Airy Conversational Data Lakes: Enabling your company to store and utilize conversational data.]]></description><link>https://blog.airy.co/introducing-data-lakes-for-conversational-data/</link><guid isPermaLink="false">60bf40acdee00d62d01316bb</guid><category><![CDATA[Industry Trends]]></category><category><![CDATA[Product Updates]]></category><dc:creator><![CDATA[Pascal Holy]]></dc:creator><pubDate>Tue, 08 Jun 2021 13:41:27 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/06/lake_1200.png" medium="image"/><content:encoded><![CDATA[<h3 id="data-lakes-have-continued-to-cement-their-strong-position-in-todays-data-driven-world-more-and-more-companies-are-benefiting-from-separating-their-needs-to-store-data-and-their-needs-to-analyze-it-the-data-lake-solutions-provided-by-current-cloud-providers-are-mature-cost-efficient-incredibly-versatile-and-future-proof">Data Lakes have continued to cement their strong position in today&apos;s data-driven world. More and more companies are benefiting from separating their needs to store data and their needs to analyze it. The data lake solutions provided by current cloud providers are mature, cost-efficient, incredibly versatile and future-proof.</h3><h2 id="what-are-data-lakes"><br>What are Data Lakes?</h2><img src="https://blog.airy.co/content/images/2021/06/lake_1200.png" alt="Introducing Data Lakes for Conversational Data"><p><br>Essentially, data lakes are a place to store all of your data for analytical purposes, much like a data warehouse. The easiest way to differentiate the two is that a data lake does not include the means to do any sort of calculations with it. 
It is nothing more than a very reliable and infinitely scalable object store. The benefit is that you only pay for the storage you use, unlike a data warehouse, where you pay either for empty hard disks or for idling server resources.</p><p>Meanwhile, with a data lake, where storage and compute are decoupled, you can leverage the unstructured nature of data lakes and import data in any shape, volume or form. <br><br>Furthermore, you can choose to enable crucial features by ticking a few boxes, like encryption and data lifecycle policies, for when you want to save even more money on less frequently accessed slices of your data.</p><p>But, because storage alone doesn&#x2019;t provide any insights, data lake providers have worked hard to integrate well with the existing analytics solutions out there, enabling customers to use the tools they are already familiar with, like SQL and Apache Spark.</p><h2 id="what-makes-conversational-data-special"><br>What makes Conversational Data special?</h2><p><br>At Airy, we have been processing conversational datasets for many years and have seen the exponential growth of this new domain for customer-business conversations. With the shift to messaging, more and more customer interactions, from pre-purchase questions to customer service requests, become conversational.<br><br>The data itself has a high signal-to-noise ratio, since within it you can find answers to all sorts of questions, from findings about bottlenecks in internal company processes to preferences of customers. <br><br>Unfortunately, conversational data is currently highly siloed: customer service chats are often exclusively stored inside the customer service database, or pre-purchase related questions stay within the tools that social media teams use to engage with people on social media channels. 
<br><br>Because a multitude of different messaging providers structure their data differently, it is hard to aggregate this data in a way that enables economical storage and still provides the ability to do powerful analytics in a variety of tools. <br><br>Conversational AI, spearheaded by teams like Rasa and Dialogflow by Google, aims to automate many conversational use cases. Fine-grained real conversational data makes it possible to train models in real time.</p><p>We have seen entire customer service departments overhauled, as conversational data has been successfully used to train models in order to automate FAQs, freeing agents to tackle more complicated requests, further improving response times and customer satisfaction. </p><h2 id="airy%E2%80%99s-conversational-data-lakes-on-aws">Airy&#x2019;s Conversational Data Lakes on AWS</h2><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://lh3.googleusercontent.com/AkCU7KkqdlDJXOYJrIps2N0w0ZCSkvmOUjE6e7umJs_i4wqFEhHvPIr3GEPe3llHErWcfskTv6Xy-HlY_pNrwva6QLVfhDu4Fbtfnt-dheZsSmQe1GtMErSicyedH2sucKxlixEL" class="kg-image" alt="Introducing Data Lakes for Conversational Data" loading="lazy"><figcaption>Conversational Data Lake Architecture on AWS</figcaption></figure><p>On AWS, conversational data can be stored in Amazon S3, providing a cheap storage option for massive amounts of data. To optimize for performance when querying data, raw AVRO-formatted conversational data is converted into compressed Apache Parquet files.</p><p>To eliminate manual schema management, we also provide schema information with our AWS Glue data catalog integration. 
With this, you can get a clear overview of all conversational data, down to the last column, and query it instantly with AWS Athena.</p><p>From here, it&apos;s very easy to integrate with all the standard tools that the data science and analytics communities love.</p><h3 id="encrypted-by-default">Encrypted by default</h3><p><br>For regulatory compliance and data protection, it is often required to encrypt data at rest. We handle this with Amazon S3-managed encryption keys, which means your data is secured with AES-256 block ciphers before it is written to disk. Arguably even more important is encrypting your data in transit, because it leaves your infrastructure when it is sent to S3. For this case, we support SSL/TLS encryption to prevent anybody from eavesdropping on your traffic.</p><h3 id="future-proof-for-advancements-in-conversational-experiences">Future-proof for advancements in conversational experiences</h3><p><br>Conversational providers such as WhatsApp and Facebook Messenger are constantly innovating, varying their feature sets and offering different use cases. To stay on the leading edge, our engineers have developed a data model achieving symbiosis of structured and semi-structured data. <br><br>Our highly scalable ingestion platform, based on Kafka, streams the data into our system, where the conversational data from all providers is resolved into uniform schemas of channels and messages.<br><br>We also support unstructured metadata throughout our system for when additional information is required. An example of this metadata would be internal open-done states, clearly showing in a UI if a conversation needs a human response.</p><h3 id="integrating-into-your-existing-data-lakes-and-warehouses">Integrating into your existing Data Lakes and Warehouses</h3><p><br>The real value of your conversational data becomes apparent when you merge it in real time with your existing data, creating a more complete model of your customers. 
<br><br>If you have a lot of existing data, your Airy Data Lake integrates seamlessly with analytics tools like Apache Spark for powerful analytics at any scale.<br></p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/DazEWN3q3DMMxUgJuHdpNglsDKhgeWiISobzACinAarKgheIEbjsWFcSBqKdbzaVCzyth7BcU8iEJoxOzGp-wFZg4WA73w1_DYElsAxxSNx9SYc6izgnAD6IOJnjeUYBSQLZ_9EK" class="kg-image" alt="Introducing Data Lakes for Conversational Data" loading="lazy"></figure><!--kg-card-begin: html--><button class="cta-btn" onclick="window.location.href=&apos;https://github.com/airyhq/airy&apos;;"> Support us by giving us a Star &#x2B50; on Github </button>
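<p>To make the querying workflow concrete, here is a hedged sketch of submitting an ad-hoc Athena query from the AWS CLI. The database name <code>conversational_data_lake</code>, the table name <code>messages</code> and the results bucket are assumptions for illustration; substitute the names from your own Glue data catalog.</p><pre><code class="language-shell"># Submit a query against a hypothetical Glue-catalogued "messages" table.
# The command returns a QueryExecutionId; the result set is written to
# the S3 output location given below.
aws athena start-query-execution \
  --query-string "SELECT source, COUNT(*) AS message_count FROM messages GROUP BY source" \
  --query-execution-context Database=conversational_data_lake \
  --result-configuration OutputLocation=s3://your-athena-results-bucket/</code></pre><p>Once the execution finishes, the results can be fetched with <code>aws athena get-query-results --query-execution-id &lt;ID&gt;</code> or read directly from the output location.</p>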

<style>
	.cta-btn {
		background: #4BB4FD;
        color: white;
		font-size: 20px;
		border-radius: 8px;
		padding: 12px 28px;
    }
</style><!--kg-card-end: html--><h2 id="the-power-of-conversational-data-lakes">The Power of Conversational Data Lakes</h2><p><br>Understanding your conversational data better is essential for providing your customers with the best possible service. From identifying patterns in your customer support cases and analysing customer preferences to training your Conversational AI model with Rasa or Dialogflow, it all becomes easily accessible with a data lake.</p><p>In a recent case with one of our large retail customers, their Airy Conversational Data Lake has enabled them to do a deep dive into their Google Business Messaging data to improve their customer satisfaction rating. <br><br>By combining the results of the Customer Satisfaction Score (CSAT) with the source data on a per-conversation basis, it is possible to pinpoint exactly which kinds of conversations lead to lower scores. With these insights, solutions can be precisely designed and implemented, leading to greatly improved customer satisfaction and reduced customer churn, all with minimal cost and effort from the company&#x2019;s side. <br><br>Visualization is also just a query away: this simple breakdown shows CSAT by the day of the week. It&#x2019;s immediately clear that the majority of negative scores are reported for conversations beginning towards the end of the week. The customer service department of this particular company is comparatively understaffed over the weekend, leading to increased customer waiting times for a response. Percentage-wise, dissatisfaction is at its highest over the weekend. <br><br>Using a simple visualisation like this, one can glean that a few more service agents on duty on Saturdays could improve CSAT dramatically. 
<br><br></p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/06/CSAT.png" class="kg-image" alt="Introducing Data Lakes for Conversational Data" loading="lazy" width="1440" height="720" srcset="https://blog.airy.co/content/images/size/w600/2021/06/CSAT.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/CSAT.png 1000w, https://blog.airy.co/content/images/2021/06/CSAT.png 1440w" sizes="(min-width: 720px) 720px"></figure><h2 id="key-benefits-of-conversational-data-lakes">Key Benefits of Conversational Data Lakes</h2><p><br>Setting up a Conversational Data Lake now (or integrating conversational data into your existing data lake) sets your company and data teams up for a future where most customer and business interactions are conversational and automated by Conversational AI. Having your own conversational data lake leads to:</p><ul><li>Much lower costs for data storage (compared to most data warehousing solutions)</li><li>Immediate insights into customers&apos; preferences and behavior</li><li>Reduced engineering times for data scientists</li><li>Readiness for the exponential growth of conversational data &amp; your own conversational AI models</li></ul><h3 id="get-started-today"><br>Get started today!<br></h3><p>Airy&#x2019;s Conversational Data Lakes are available to all Enterprise Customers. Get started today with our <a href="https://blog.airy.co/tutorial-airy-installation-aws/">Quickstart Guide</a>.</p><!--kg-card-begin: html--><button class="cta-btn" onclick="window.location.href=&apos;https://github.com/airyhq/airy&apos;;"> Support us by giving us a Star &#x2B50; on Github </button>
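<p>As an illustration of how close such an analysis is to plain SQL, the CSAT-by-weekday breakdown described above can be approximated with a single Athena query. The table and column names (<code>conversations</code>, <code>csat_score</code>, <code>started_at</code>) and the database and bucket names are assumptions for this sketch; adapt them to your own schema.</p><pre><code class="language-shell"># Hypothetical query: average CSAT score and response volume grouped by
# the ISO weekday (1 = Monday ... 7 = Sunday) a conversation started on.
aws athena start-query-execution \
  --query-string "SELECT day_of_week(started_at) AS weekday, AVG(csat_score) AS avg_csat, COUNT(*) AS responses FROM conversations GROUP BY 1 ORDER BY 1" \
  --query-execution-context Database=conversational_data_lake \
  --result-configuration OutputLocation=s3://your-athena-results-bucket/</code></pre>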

<style>
	.cta-btn {
		background: #4BB4FD;
        color: white;
		font-size: 20px;
		border-radius: 8px;
		padding: 12px 28px;
    }
</style><!--kg-card-end: html-->]]></content:encoded></item><item><title><![CDATA[Airy 101: An Introduction]]></title><description><![CDATA[Hi, my name is Liz from Airy, and I want to take the next 4 minutes to talk about Airy's conversational platform, what it does and how it can help you.]]></description><link>https://blog.airy.co/airy-101-an-introduction/</link><guid isPermaLink="false">60be2630dee00d62d01310ec</guid><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Liz Hutter]]></dc:creator><pubDate>Tue, 08 Jun 2021 09:33:17 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/06/Screenshot-2021-04-29-at-16.32.43.png" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/zwDosYHitYg?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure><img src="https://blog.airy.co/content/images/2021/06/Screenshot-2021-04-29-at-16.32.43.png" alt="Airy 101: An Introduction"><p>Hi, my name is Liz from Airy, and I want to take the next few minutes to talk about Airy&apos;s conversational platform, what it does and how it can help you. Conversational solutions are everywhere, whether it be on your website as live chat, in your app as in-app chat, or on social platforms that you use every day, like Facebook, Google, or WhatsApp.<br></p><p>Although they&apos;re everywhere, conversational experiences tend to be siloed and fragmented, especially once you integrate with your existing tools and business systems. However, with Airy you can store, structure and utilize all conversational data. So you can connect sources to Airy such as website and app chat, Google, Facebook, WhatsApp, or custom sources.<br></p><p>And once you connect these sources, you can build different conversational use cases. 
So first is conversational AI, where you can train your models with conversational data via Rasa or Dialogflow. You can enhance your business systems by integrating with your CRM, such as Salesforce, or your help desk, such as Zendesk. And you can store data in data warehouses like Snowflake or Amazon Redshift.<br></p><p>Airy is super easy, meaning that with these two commands, the whole platform is already up and running. Airy comes with many components, including blazing-fast APIs so you can access your platform programmatically, graphical user interfaces, and pre-built integrations into all of your systems.<br></p><p>It&apos;s possible to connect all major conversational sources like Facebook, Google, or WhatsApp to Airy, but we also brought you an open-source chat plugin, which you&apos;re able to customize yourself. The live chat that we give you is already a part of Airy, and we have even more SDKs coming. In terms of how to access your Airy instance, you can do it with:</p><ul><li><a href="https://airy.co/docs/core/api/introduction">APIs</a> where you can view conversations, messages, users, and more</li><li><a href="https://airy.co/docs/core/api/websocket">WebSocket</a> where you can power real-time applications</li><li><a href="https://airy.co/docs/core/api/webhook">Webhook</a> where you can subscribe to events and react accordingly</li></ul><p>You can participate in conversations programmatically, or integrate your Airy instance with a custom platform.<br></p><p>We know that sometimes code is not enough, so we&apos;ve created <a href="https://airy.co/docs/core/ui/introduction">UIs</a> for you. You can see all conversations and messages from all connected sources, and you have a fully functioning <a href="https://airy.co/docs/core/ui/inbox">inbox</a> with features such as a channel UI, tags and templates.<br></p><p>As a conversational platform, Airy also cares a lot about conversational AI. 
So we support all conversational AI providers like Rasa, Dialogflow, and IBM Watson. Even in the inbox, you can see what your conversational AI would suggest with <a href="https://airy.co/docs/core/integrations/rasa-suggested-replies">Suggested Replies</a>. Airy runs on Kafka, so you can scale it as much as you want to, meaning that there are no limits.<br></p><p>So for example, in our own instance, we have millions of conversations. You could also host it anywhere, whether it be Airy cloud, your own AWS account or whichever hosting provider you have. And in addition to this, everything that you see is open-source.<br></p><p><a href="https://airy.co/docs/core/cli/introduction">You can install the CLI with one command</a>, and then you&apos;re able to create, control, and administrate your Airy instance.<br></p><p>You can test it <a href="https://airy.co/docs/core/getting-started/installation/minikube">locally on your own machine</a>, or you can <a href="https://airy.co/docs/core/getting-started/installation/aws">deploy it on the cloud</a>, whether it be with us on Airy cloud or on your own AWS, Azure, or Google cloud. As you can see, an instance was created on AWS on the right-hand side, just while I was explaining this to you. So <a href="https://airy.co/community">join our Slack to get community support</a>, and in addition, we also have a dedicated enterprise support team. 
Start a conversation with us on <a href="http://airy.co">airy.co</a> and we look forward to hearing from you.<br></p>]]></content:encoded></item><item><title><![CDATA[Getting Started with Airy on AWS]]></title><description><![CDATA[This Tutorial shows you how to install and set up an Airy instance remotely on Amazon AWS from your local machine to be used in production.]]></description><link>https://blog.airy.co/tutorial-airy-installation-aws/</link><guid isPermaLink="false">60be01046f273b7b1bb7f4da</guid><category><![CDATA[Tutorials]]></category><dc:creator><![CDATA[Steffen Hoellinger]]></dc:creator><pubDate>Tue, 08 Jun 2021 08:18:20 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-17.36.36.png" medium="image"/><content:encoded><![CDATA[<h4 id="this-tutorial-shows-you-how-to-install-and-set-up-an-airy-instance-remotely-on-amazon-web-services-aws-from-your-local-machine-to-be-used-in-production">This tutorial shows you how to install and set up an Airy instance remotely on Amazon Web Services (AWS) from your local machine to be used in production. </h4><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-18.39.03.png" class="kg-image" alt="Getting Started with Airy on AWS" loading="lazy" width="962" height="217" srcset="https://blog.airy.co/content/images/size/w600/2021/06/Screenshot-2021-06-07-at-18.39.03.png 600w, https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-18.39.03.png 962w" sizes="(min-width: 720px) 720px"><figcaption>&#x1F389; This is what you&apos;ll see once your Airy Core instance is ready.</figcaption></figure><h3 id="0-make-sure-you-have-the-latest-versions-of-the-aws-cli-airy-cli-jq-running-on-your-local-machine">0. 
Make sure you have the latest versions of the AWS CLI, Airy CLI &amp; jq running on your local machine</h3><img src="https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-17.36.36.png" alt="Getting Started with Airy on AWS"><p><br>Before you begin, make sure the following is true for your local machine, or complete these steps first: </p><ul><li>The <a href="https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html">AWS CLI should be installed</a> and <a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html">properly configured</a> on your local machine.</li><li><a href="https://airy.co/docs/core/cli/introduction#step-2-install-the-airy-cli">Download and install the Airy CLI</a>, in case you haven&apos;t done so yet. If you are on a Mac using <a href="https://brew.sh/">Homebrew</a>, just run the following in Terminal:</li></ul><pre><code class="language-shell">brew install airyhq/airy/cli</code></pre><ul><li>In Terminal, run <code>airy version</code> to check that you have the latest version of the Airy CLI, which will install the corresponding version of Airy on AWS in the next step.</li></ul><figure class="kg-card kg-code-card"><pre><code class="language-shell">airy version</code></pre><figcaption>Expected Response: &quot;<code>Version: 0.23.0, GitCommit: 01f187224f441d184cfeb501c2321035a17306db</code>&quot;</figcaption></figure><ul><li>We also recommend <a href="https://stedolan.github.io/jq/download/">installing the latest version of jq</a>, which will be required during the installation process. Using Homebrew on a Mac, run the following in Terminal:</li></ul><pre><code class="language-shell">brew install jq</code></pre><h3 id="1-create-a-new-remote-airy-instance">1. Create a new remote Airy instance</h3><ul><li>First, create or choose the directory on your local machine that you want to use to manage your remote Airy instance. 
</li></ul><pre><code class="language-shell">mkdir airy-aws
cd airy-aws</code></pre><ul><li>Then, remotely create a new Airy instance by running the following command in Terminal:</li></ul><pre><code class="language-shell">airy create --provider aws</code></pre><p>This command will set up a Kubernetes cluster using AWS EKS and will provision our recommended two c5.xlarge EC2 instances in the region that your AWS profile is configured for by default. Please check the pricing applicable to your region before continuing; the overall costs for our recommended configuration might amount to about $300-350 per month in most regions. </p><p>You can customize the deployment, for example by using different instance types (such as c5.large for a less powerful configuration of your Airy instance), or deploy Airy to an existing Kubernetes cluster, as outlined in the <a href="https://airy.co/docs/core/getting-started/installation/aws#create-a-cluster">relevant section of our documentation</a>.</p><p>Please stay patient once you run the command. The script needs to install quite a lot of infrastructure, so this might run for about 20-30 minutes. Have a coffee in the meantime! &#x2615;</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-17.54.48-2.png" class="kg-image" alt="Getting Started with Airy on AWS" loading="lazy" width="962" height="707" srcset="https://blog.airy.co/content/images/size/w600/2021/06/Screenshot-2021-06-07-at-17.54.48-2.png 600w, https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-17.54.48-2.png 962w" sizes="(min-width: 720px) 720px"><figcaption>This means success. 
&#x1F389;</figcaption></figure><p>The installation script created three files for you in the directory you chose before on your local machine: </p><ul><li><code>airy.yaml</code>: The configuration file for your Airy instance</li><li><code>cli.yaml</code>: The configuration file for your local Airy CLI</li><li><code>kube.conf</code>: The configuration file for your remote Kubernetes cluster </li></ul><p>You can verify that everything has been installed correctly by looking at the pods of the remote Kubernetes cluster. Just type the following command in the relevant directory on your local machine:</p><pre><code class="language-shell">kubectl get pods --kubeconfig ./kube.conf</code></pre><p>This will return a list of all the pods currently running on your remote cluster: </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-18.37.09.png" class="kg-image" alt="Getting Started with Airy on AWS" loading="lazy" width="962" height="553" srcset="https://blog.airy.co/content/images/size/w600/2021/06/Screenshot-2021-06-07-at-18.37.09.png 600w, https://blog.airy.co/content/images/2021/06/Screenshot-2021-06-07-at-18.37.09.png 962w" sizes="(min-width: 720px) 720px"><figcaption>Congrats! &#x1F64C; This is how your remote Kubernetes cluster should look on AWS.</figcaption></figure><hr><h3 id="2-configure-https-for-your-airy-instance">2. Configure HTTPS for your Airy instance </h3><p><br>We strongly recommend activating and configuring HTTPS for your Airy instance, as you should only send and receive conversational data securely. 
Most conversational sources such as Facebook Messenger also require you to activate HTTPS and no longer support sending or receiving messages via HTTP only.</p><p>We assume you have already acquired an SSL certificate for the domain name you intend to use for your Airy instance from your domain name registrar or provider, or that you already hold a wildcard certificate for your domain name. There are also free alternatives available, such as <a href="https://letsencrypt.org/">Let&apos;s Encrypt</a>, to obtain and manage SSL certificates for the domain name you intend to use with your Airy instance.</p><h4 id="import-your-ssl-certificate-into-amazon-certificate-manager-acm">Import your SSL certificate into AWS Certificate Manager (ACM) </h4><p>Run the following command in Terminal, specifying the SSL certificate and the private key file, as well as the certificate chain bundle file if applicable. Please make sure to include <code>fileb://</code> before the path to each file on your local machine.</p><figure class="kg-card kg-code-card"><pre><code class="language-shell">aws acm import-certificate --certificate fileb://_CERTIFICATE_FILE_ \
      --certificate-chain fileb://_CERTIFICATE_CHAIN_BUNDLE_FILE_ \
--private-key fileb://_PRIVATE_KEY_FILE_ </code></pre><figcaption>Expected Response: {&quot;CertificateArn&quot;: &quot;arn:aws:acm:&lt;REGION&gt;:&lt;ACCOUNT&gt;:certificate/&lt;UUID&gt;&quot;}</figcaption></figure><h4 id="configure-the-ingress-service">Configure the Ingress service</h4><p>Run the following command in Terminal, replacing <code>_YOUR_CERTIFICATE_ARN_</code> with the value of <code>CertificateArn</code> from the previous step:</p><figure class="kg-card kg-code-card"><pre><code class="language-shell">kubectl -n kube-system annotate service traefik &quot;service.beta.kubernetes.io/aws-load-balancer-ssl-ports=443&quot; &quot;service.beta.kubernetes.io/aws-load-balancer-ssl-cert=_YOUR_CERTIFICATE_ARN_&quot; --kubeconfig ./kube.conf</code></pre><figcaption>Expected Response: &quot;service/traefik annotated&quot;</figcaption></figure><p>Then run the following command in Terminal:</p><figure class="kg-card kg-code-card"><pre><code class="language-shell">kubectl -n kube-system patch service traefik --patch &apos;{&quot;spec&quot;: { &quot;ports&quot;: [ { &quot;name&quot;: &quot;https&quot;, &quot;port&quot;: 443, &quot;protocol&quot;: &quot;TCP&quot;, &quot;targetPort&quot;: 80 } ] } }&apos; --kubeconfig ./kube.conf</code></pre><figcaption>Expected Response: &quot;service/traefik patched&quot;</figcaption></figure><p>Start by exporting the hostname you want to use for your Airy instance; it should correspond to the URL (or, in the case of a wildcard certificate, a URL) that your certificate is issued for:</p><pre><code class="language-shell">export HOST=airy.example.com </code></pre><p>Now, update the hostnames configmap and update the Ingress extensions by running the following commands in Terminal: </p><figure class="kg-card kg-code-card"><pre><code class="language-shell">kubectl --kubeconfig ./kube.conf -n default patch configmap hostnames --patch &quot;{\&quot;data\&quot;: { \&quot;HOST\&quot;: \&quot;https://${HOST}\&quot; } }&quot;
kubectl --kubeconfig ./kube.conf -n default get ingress airy-core -o json | jq --arg host &quot;$HOST&quot; &apos;(.spec.rules[0].host=$host)&apos; | kubectl --kubeconfig ./kube.conf apply -f -
kubectl --kubeconfig ./kube.conf -n default get ingress airy-core-ui -o json | jq --arg host &quot;$HOST&quot; &apos;(.spec.rules[0].host=$host)&apos; | kubectl --kubeconfig ./kube.conf apply -f -</code></pre><figcaption>Expected Response: &quot;configmap/hostnames patched, ingress.extensions/airy-core configured&quot;</figcaption></figure><p>Finally, print the hostname of the Ingress service in Terminal by running this command:</p><figure class="kg-card kg-code-card"><pre><code class="language-shell">kubectl -n kube-system get service traefik -o jsonpath=&apos;{.status.loadBalancer.ingress[0].hostname}&apos; --kubeconfig ./kube.conf</code></pre><figcaption>Expected Response: &quot;&lt;SOME_STRING&gt;.elb.&lt;REGION&gt;.amazonaws.com&quot;</figcaption></figure><p>Now, head over to your domain name registrar or provider, or wherever you manage the name servers for your domain. Please add a new <code>CNAME</code> entry for the relevant domain name you intend to use with your Airy instance and point it to the hostname of the Ingress service printed above, for example:</p><pre><code class="language-markdown">CNAME airy.example.com -&gt; 
&quot;abcdefgh0123456789.elb.us-east-1.amazonaws.com&quot;</code></pre><p>Allow some time for the new record to propagate across DNS servers. You should then be able to reach your Airy instance at <code>https://&lt;YOUR_DOMAIN&gt;/ui</code> from any modern web browser.</p><hr><h3 id="3-secure-your-new-airy-instance-%F0%9F%94%90">3. Secure your new Airy instance &#x1F510;</h3><p>Your Airy instance comes without pre-configured security settings to make it easy for you to run Airy on your local machine or in a safe environment. However, if you run your Airy instance somewhere publicly reachable from the internet, such as on a cloud provider like AWS, the first thing you should do is properly secure it.</p><p>&#x26A0;&#xFE0F; <strong>Warning:</strong> The Airy Core API as well as the Airy Inbox UI running at <code>YOUR_PUBLIC_AIRY_URL/ui</code> will be publicly accessible unless you implement the following steps. Please make sure to properly secure your Airy instance before you connect conversational sources or stream actual customer conversations to it. &#x26A0;&#xFE0F;</p><p>Open <code>airy.yaml</code> with an editor of your choice and add a system token via a new <code>systemToken</code> parameter, as well as an authentication provider of your choice by specifying an Open ID Connect configuration via the <code>oidc</code> parameter. We support any authentication provider that follows the <a href="https://openid.net/connect/">Open ID Connect</a> standard.</p><p>For example, if you decide to use Github as an authentication provider, you would need to implement the following steps:</p><ul><li>Look up the public URL of your Airy instance by running the command <code>airy api endpoint</code> in Terminal.
</li><li><a href="https://docs.github.com/en/developers/apps/creating-an-oauth-app">Create a Github OAuth App</a> with an Authorization Callback URL in the form of <code>&lt;YOUR_PUBLIC_AIRY_URL&gt;/login/oauth2/code/github</code></li><li>Edit the relevant section of your <code>airy.yaml</code> file, copying in the Github Client ID and Client Secret values from the previous step. Additionally, specify the required parameter <code>allowedEmailPatterns</code> by giving a list of email addresses or a pattern such as <code>*@example.com</code>, so that not every Github user can access your Airy instance. The security section of your <code>airy.yaml</code> file could, for example, look as follows:</li></ul><pre><code class="language-YAML">security:
  allowedOrigins: &apos;*&apos;
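  # Note: a wildcard here permits browser requests from any origin; consider
  # restricting this to your own domains in production.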
  jwtSecret: &lt;SOME_SECRET&gt;
  systemToken: &lt;SOME_SUPER_SECURE_TOKEN&gt;
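  # Used as a Bearer token when calling the Airy Core API; any sufficiently
  # long random value works, generated for example with: openssl rand -hex 32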
  oidc:
    allowedEmailPatterns: &quot;*@example.com,john@example.com&quot;
    provider: &quot;github&quot;
    clientId: &quot;&lt;YOUR_GITHUB_OAUTH_CLIENT_ID&gt;&quot;
    clientSecret: &quot;&lt;YOUR_GITHUB_OAUTH_CLIENT_SECRET&gt;&quot;</code></pre><ul><li>Finally, apply the configuration by running the following command in Terminal:</li></ul><pre><code class="language-shell">airy config apply</code></pre><ul><li>Try querying the Airy Core API by sending the following request in Terminal:</li></ul><figure class="kg-card kg-code-card"><pre><code class="language-shell">curl --request POST \
  --url &lt;YOUR_PUBLIC_AIRY_URL&gt;/conversations.list \
  --header &apos;Authorization: Bearer &lt;SOME_SUPER_SECURE_TOKEN&gt;&apos; \
  --header &apos;Content-Type: application/json&apos;</code></pre><figcaption>Expected Response - Status Code: 200, Body: { data: [] }</figcaption></figure><ul><li>You can also try accessing the Airy Inbox UI at <code>&lt;YOUR_PUBLIC_AIRY_URL&gt;/ui/</code> in any modern web browser, which should take you through the Github OAuth flow and return you to your Airy instance upon successful authentication.</li></ul><hr><h3 id="4-connect-your-conversational-sources-and-start-to-utilize-conversational-data">4. Connect your Conversational Sources and start to utilize Conversational Data</h3><p>You are now ready to connect conversational sources such as Facebook Messenger, Google Business Messages, WhatsApp, etc. to your Airy instance.</p><p>Please read the <a href="https://airy.co/docs/core/sources/introduction">relevant documentation on how to configure conversational sources</a>, so you can start to connect channels and stream conversations and messages to your new Airy instance.</p>]]></content:encoded></item><item><title><![CDATA[Release 0.17.0: End Chat, Quick Replies and Expanded Chat Plugin Default]]></title><description><![CDATA[This release significantly expanded the capabilities of your Airy Chat Plugin. Prior to this release, it was not possible to end a chat and begin a new one. 
Quick replies became easily accessible and the behavior of the Chat Plugin when a contact resumes a conversation improved.]]></description><link>https://blog.airy.co/release-17/</link><guid isPermaLink="false">60b8b9b66f273b7b1bb7f369</guid><category><![CDATA[Product Updates]]></category><dc:creator><![CDATA[Liz Hutter]]></dc:creator><pubDate>Tue, 13 Apr 2021 09:23:00 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/08/photo-1516321497487-e288fb19713f-1.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/08/photo-1516321497487-e288fb19713f-1.jpg" alt="Release 0.17.0: End Chat, Quick Replies and Expanded Chat Plugin Default"><p>This release significantly expanded the capabilities of your Airy Chat Plugin. Prior to this release, it was not possible to end a chat and begin a new one. In addition, quick replies became easily accessible and the behavior of the Chat Plugin when a contact resumes a conversation improved.</p><p></p><h3 id="end-chat-option">End Chat Option</h3><p>The End Chat option was introduced to the Airy Chat Plugin mostly for security and privacy reasons. Without this feature, conversation history (and possibly personal data) would be readily available, including on a public or shared computer. 
Now, contacts are able to end the current chat session and exit the website, or simply create a new chat immediately after.<br></p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://blog.airy.co/content/images/2021/06/screenshot_2021-06-02_at_17.07.42-1.png" width="379" height="705" loading="lazy" alt="Release 0.17.0: End Chat, Quick Replies and Expanded Chat Plugin Default"></div><div class="kg-gallery-image"><img src="https://blog.airy.co/content/images/2021/06/screenshot_2021-06-02_at_17.07.47-1.png" width="381" height="706" loading="lazy" alt="Release 0.17.0: End Chat, Quick Replies and Expanded Chat Plugin Default"></div></div></div></figure><p>With the End Chat option, contacts do not have to worry about their personal data being accessible to any third party. They can opt to have their conversations saved, or they can start a new one every time they enter the website.</p><p></p><h3 id="quick-replies-for-chat-plugin">Quick Replies for Chat Plugin</h3><p>This feature makes it easy for both contacts and system users to quickly reply to a question posted by the other person. The quick replies only display when the associated message is the most recent message within the conversation. Once one has been chosen and sent, the row of quick replies disappears until another message contains quick replies.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/06/quick-replies.png" class="kg-image" alt="Release 0.17.0: End Chat, Quick Replies and Expanded Chat Plugin Default" loading="lazy" width="386" height="704"></figure><p>These suggestions (buttons) are displayed horizontally below the plain text that prompts them. 
As a guideline, you can have a maximum of 13 suggestions with a maximum of 25 characters each.</p><p></p><h3 id="expanded-chat-plugin-by-default">Expanded Chat Plugin by Default</h3><p>With Airy Live Chat, contacts are able to resume conversations, if they choose to do so, at a later time with a resume token. To make it more convenient, we decided to expand the Chat Plugin by default when continuing that conversation. In addition, contacts are already in the input bar of the chat, so they do not need to navigate anywhere, but instead can start typing immediately.</p><p></p><h2 id="%F0%9F%9A%80-features">&#x1F680; Features</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/929">#929</a>] Implement the option to end chat (<a href="https://github.com/airyhq/airy/pull/1508">#1508</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1110">#1110</a>] Add basic and advance customization to chatplugin docs (<a href="https://github.com/airyhq/airy/pull/1494">#1494</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1290">#1290</a>] Prometheus Metrics about Spring apps (<a href="https://github.com/airyhq/airy/pull/1479">#1479</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1473">#1473</a>] Make release process more quiet (<a href="https://github.com/airyhq/airy/pull/1501">#1501</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1004">#1004</a>] Enable quickreplies for chatplugin (<a href="https://github.com/airyhq/airy/pull/1478">#1478</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/572">#572</a>] Cleanup senderType code (<a href="https://github.com/airyhq/airy/pull/1490">#1490</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1474">#1474</a>] Added showmode flag that blocks functionality in chat plugin (<a href="https://github.com/airyhq/airy/pull/1475">#1475</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/572">#572</a>] Simplify senderType (<a 
href="https://github.com/airyhq/airy/pull/1458">#1458</a>)</li></ul><h2 id="%F0%9F%90%9B-bug-fixes">&#x1F41B; Bug Fixes</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/1521">#1521</a>] Import ChatPlugin header component assets from library (<a href="https://github.com/airyhq/airy/pull/1522">#1522</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1438">#1438</a>] Fix logout when a user sends a message to a conversation from a disconnected channel (<a href="https://github.com/airyhq/airy/pull/1457">#1457</a>)</li></ul><h2 id="%F0%9F%93%9A-documentation">&#x1F4DA; Documentation</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/1408">#1408</a>] Add missing tag gifs (<a href="https://github.com/airyhq/airy/pull/1496">#1496</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1422">#1422</a>] AWS Docs Revamp (<a href="https://github.com/airyhq/airy/pull/1487">#1487</a>)</li></ul><h2 id="%F0%9F%A7%B0-maintenance">&#x1F9F0; Maintenance</h2><ul><li>Remove empty payloads (<a href="https://github.com/airyhq/airy/pull/1509">#1509</a>)</li><li>Bump css-loader from 5.2.0 to 5.2.1 (<a href="https://github.com/airyhq/airy/pull/1514">#1514</a>)</li><li>Bump webpack from 5.31.0 to 5.31.2 (<a href="https://github.com/airyhq/airy/pull/1513">#1513</a>)</li><li>Bump eslint from 7.23.0 to 7.24.0 (<a href="https://github.com/airyhq/airy/pull/1512">#1512</a>)</li><li>Move back components to the mono repo (<a href="https://github.com/airyhq/airy/pull/1506">#1506</a>)</li><li>Bump @babel/preset-env from 7.13.12 to 7.13.15 (<a href="https://github.com/airyhq/airy/pull/1498">#1498</a>)</li><li>Bump @babel/core from 7.13.14 to 7.13.15 (<a href="https://github.com/airyhq/airy/pull/1499">#1499</a>)</li><li>Bump eslint-plugin-react from 7.23.1 to 7.23.2 (<a href="https://github.com/airyhq/airy/pull/1500">#1500</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1466">#1466</a>] Follow up on extract model (<a 
href="https://github.com/airyhq/airy/pull/1493">#1493</a>)</li><li>Bump cypress from 7.0.0 to 7.0.1 (<a href="https://github.com/airyhq/airy/pull/1481">#1481</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1466">#1466</a>] Extract model lib from httpclient (<a href="https://github.com/airyhq/airy/pull/1488">#1488</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1476">#1476</a>] Remove components (<a href="https://github.com/airyhq/airy/pull/1485">#1485</a>)</li><li>Bump core-js from 3.10.0 to 3.10.1 (<a href="https://github.com/airyhq/airy/pull/1484">#1484</a>)</li><li>Bump webpack from 5.30.0 to 5.31.0 (<a href="https://github.com/airyhq/airy/pull/1483">#1483</a>)</li><li>Bump @bazel/typescript from 3.2.3 to 3.3.0 (<a href="https://github.com/airyhq/airy/pull/1482">#1482</a>)</li><li>Bump copy-webpack-plugin from 8.1.0 to 8.1.1 (<a href="https://github.com/airyhq/airy/pull/1469">#1469</a>)</li><li>Bump emoji-mart from 3.0.0 to 3.0.1 (<a href="https://github.com/airyhq/airy/pull/1507">#1507</a>)</li><li>Fix hot module replacement (<a href="https://github.com/airyhq/airy/pull/1480">#1480</a>)</li></ul><h2 id="airy-cli">Airy CLI</h2><p>You can download the Airy CLI for your operating system from the following links:</p><p><a href="https://airy-core-binaries.s3.amazonaws.com/0.17.0/darwin/amd64/airy" rel="nofollow">MacOS</a><br><a href="https://airy-core-binaries.s3.amazonaws.com/0.17.0/linux/amd64/airy" rel="nofollow">Linux</a><br><a href="https://airy-core-binaries.s3.amazonaws.com/0.17.0/windows/amd64/airy.exe" rel="nofollow">Windows</a></p>]]></content:encoded></item><item><title><![CDATA[Release 0.16.0: More Types of Suggested Replies, Lightbulb Icon for Previous Suggested Replies, Chat Plugin Customization]]></title><description><![CDATA[This release focused strongly on the Airy Live Chat Plugin and evolving its available features. 
Suggested Replies became widely useful, especially with the possibility of supporting more types of replies. We also gave system users the power to customize their entire Airy Live Chat Plugin.]]></description><link>https://blog.airy.co/release-0-16-0/</link><guid isPermaLink="false">60bd06506f273b7b1bb7f401</guid><category><![CDATA[Product Updates]]></category><dc:creator><![CDATA[Liz Hutter]]></dc:creator><pubDate>Wed, 07 Apr 2021 13:57:00 GMT</pubDate><media:content url="https://blog.airy.co/content/images/2021/08/photo-1556401615-c909c3d67480.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.airy.co/content/images/2021/08/photo-1556401615-c909c3d67480.jpg" alt="Release 0.16.0: More Types of Suggested Replies, Lightbulb Icon for Previous Suggested Replies, Chat Plugin Customization"><p>This release focused strongly on the Airy Live Chat Plugin and evolving its available features. Suggested Replies became widely useful, especially with the possibility of supporting more types of replies other than simple text. We also gave system users the power to customize their entire Airy Live Chat Plugin, whether in regard to text, colors, or icons.</p><p></p><h3 id="add-more-types-of-suggested-replies">Add More Types of Suggested Replies</h3><p>Prior to Release 0.16.0, Suggested Replies supported only plain text. With this release, we now support text, images, Rich Cards, and Rich Card Carousels.</p><p></p><h3 id="display-lightbulb-icon-for-previously-suggested-replies">Display Lightbulb Icon for Previously Suggested Replies</h3><p>Suggested Replies were introduced to the Airy Live Chat Plugin with Release 0.14.0. In addition to more types of Suggested Replies being rendered, this release also added the option of using Suggested Replies for previous messages. Normally, we show the Suggested Replies on top of the input bar if one of the last 5 messages has Suggested Replies. 
If none of the last 5 messages contain Suggested Replies, we will only show the lightbulb icon.</p><figure class="kg-card kg-image-card"><img src="https://blog.airy.co/content/images/2021/06/lightbulb-icon.png" class="kg-image" alt="Release 0.16.0: More Types of Suggested Replies, Lightbulb Icon for Previous Suggested Replies, Chat Plugin Customization" loading="lazy" width="1600" height="834" srcset="https://blog.airy.co/content/images/size/w600/2021/06/lightbulb-icon.png 600w, https://blog.airy.co/content/images/size/w1000/2021/06/lightbulb-icon.png 1000w, https://blog.airy.co/content/images/2021/06/lightbulb-icon.png 1600w" sizes="(min-width: 720px) 720px"></figure><p>The best part of this is: this feature is not limited to a certain amount of messages within a conversation, but rather all messages that contain Suggested Replies are affected. As seen in the image above, a small lightbulb icon appears next to the older message because it is affected. The suggestions above include text, images, and Rich Cards.</p><p></p><h3 id="live-chat-plugin-customization-options">Live Chat Plugin Customization Options</h3><p>This release greatly increased the customization options of your Airy Chat Plugin. 
In the first few releases, customization was extremely limited.</p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/biAzAgbDXGw?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure><p>It is our pleasure to introduce the new customizable aspects of the Chat Plugin, including but not limited to:</p><ul><li>Background color</li><li>Chat Plugin icon (used as the button to open the Chat Plugin)</li><li>Header text (visible on the top of your expanded Airy Chat Plugin)</li></ul><p>The best part of this feature is that once you have customized your Chat Plugin to your needs, you can copy the code directly from the left-hand side into the <strong>&lt;head&gt;</strong> of your website.</p><p></p><h2 id="%F0%9F%9A%80-features">&#x1F680; Features</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/1111">#1111</a>] Customize Chat Plugin (<a href="https://github.com/airyhq/airy/pull/1456">#1456</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1384">#1384</a>] Add more types suggested replies (<a href="https://github.com/airyhq/airy/pull/1420">#1420</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1293">#1293</a>] Add Prometheus doc (<a href="https://github.com/airyhq/airy/pull/1448">#1448</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1244">#1244</a>] Display lightbulb icon for previous&#x2026; (<a href="https://github.com/airyhq/airy/pull/1424">#1424</a>)</li></ul><h2 id="%F0%9F%90%9B-bug-fixes">&#x1F41B; Bug Fixes</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/1310">#1310</a>] Airy CLI sha changes after the release (<a href="https://github.com/airyhq/airy/pull/1443">#1443</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1452">#1452</a>] Show tags in contact info column (<a 
href="https://github.com/airyhq/airy/pull/1454">#1454</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1455">#1455</a>] Configure lucene so queries are case insensitive (<a href="https://github.com/airyhq/airy/pull/1463">#1463</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1304">#1304</a>] Wait for core components during create (<a href="https://github.com/airyhq/airy/pull/1442">#1442</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/925">#925</a>] Fix examples (<a href="https://github.com/airyhq/airy/pull/1441">#1441</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1413">#1413</a>] expand chat plugin by default (<a href="https://github.com/airyhq/airy/pull/1436">#1436</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1450">#1450</a>] Fix conversation counter (<a href="https://github.com/airyhq/airy/pull/1451">#1451</a>)</li></ul><h2 id="%F0%9F%93%9A-documentation">&#x1F4DA; Documentation</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/1422">#1422</a>] Add section for kubectl (<a href="https://github.com/airyhq/airy/pull/1445">#1445</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1406">#1406</a>] live chat docs quickstart (<a href="https://github.com/airyhq/airy/pull/1440">#1440</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1404">#1404</a>] added intro to sources (<a href="https://github.com/airyhq/airy/pull/1444">#1444</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1439">#1439</a>] Update release process with hotfix doc (<a href="https://github.com/airyhq/airy/pull/1449">#1449</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1403">#1403</a>] CLI Docs Revamp (<a href="https://github.com/airyhq/airy/pull/1426">#1426</a>)</li></ul><h2 id="%F0%9F%A7%B0-maintenance">&#x1F9F0; Maintenance</h2><ul><li>[<a href="https://github.com/airyhq/airy/issues/1164">#1164</a>] Document and improve message upsert endpoint (<a 
href="https://github.com/airyhq/airy/pull/1468">#1468</a>)</li><li>Readme - now with nice graph-ical improvements (<a href="https://github.com/airyhq/airy/pull/1377">#1377</a>)</li><li>[<a href="https://github.com/airyhq/airy/issues/1466">#1466</a>] Prepare the codebase for lib extraction (<a href="https://github.com/airyhq/airy/pull/1467">#1467</a>)</li><li>Bump cypress from 6.8.0 to 7.0.0 (<a href="https://github.com/airyhq/airy/pull/1461">#1461</a>)</li><li>Bump @typescript-eslint/parser from 4.20.0 to 4.21.0 (<a href="https://github.com/airyhq/airy/pull/1460">#1460</a>)</li><li>Bump @bazel/ibazel from 0.15.6 to 0.15.8 (<a href="https://github.com/airyhq/airy/pull/1464">#1464</a>)</li><li>Bump webpack from 5.28.0 to 5.30.0 (<a href="https://github.com/airyhq/airy/pull/1459">#1459</a>)</li><li>Bump @typescript-eslint/eslint-plugin from 4.20.0 to 4.21.0 (<a href="https://github.com/airyhq/airy/pull/1462">#1462</a>)</li><li>Bump @typescript-eslint/eslint-plugin from 4.19.0 to 4.20.0 (<a href="https://github.com/airyhq/airy/pull/1446">#1446</a>)</li><li>Bump eslint from 7.22.0 to 7.23.0 (<a href="https://github.com/airyhq/airy/pull/1447">#1447</a>)</li><li>Remove Airy init and restructure cli (<a href="https://github.com/airyhq/airy/pull/1414">#1414</a>)</li><li>Bump @typescript-eslint/parser from 4.19.0 to 4.20.0 (<a href="https://github.com/airyhq/airy/pull/1434">#1434</a>)</li><li>Bump core-js from 3.9.1 to 3.10.0 (<a href="https://github.com/airyhq/airy/pull/1435">#1435</a>)</li><li>Bump @bazel/ibazel from 0.14.0 to 0.15.6 (<a href="https://github.com/airyhq/airy/pull/1433">#1433</a>)</li><li>Bump @babel/core from 7.13.10 to 7.13.14 (<a href="https://github.com/airyhq/airy/pull/1432">#1432</a>)</li><li>Bump webpack-cli from 4.5.0 to 4.6.0 (<a href="https://github.com/airyhq/airy/pull/1431">#1431</a>)</li></ul><h2 id="airy-cli">Airy CLI</h2><p>You can download the Airy CLI for your operating system from the following links:</p><p><a 
href="https://airy-core-binaries.s3.amazonaws.com/0.16.0/darwin/amd64/airy" rel="nofollow">MacOS</a><br><a href="https://airy-core-binaries.s3.amazonaws.com/0.16.0/linux/amd64/airy" rel="nofollow">Linux</a><br><a href="https://airy-core-binaries.s3.amazonaws.com/0.16.0/windows/amd64/airy.exe" rel="nofollow">Windows</a></p>]]></content:encoded></item></channel></rss>