Bard is a session replay tool that can be auto-deployed to AWS or set up using docker-compose. Bard includes a session recorder and a session replayer. It lets you easily create user conversion funnels, see exactly why a user did not complete the conversion, and identify sessions that experience errors on the front end of your application.
- an AWS account
- npm is installed
- the AWS CLI is installed and configured
- an AWS named profile
- the AWS CDK command-line tool is installed
- Docker is installed
First, install the npm package into your project.
```bash
npm install bardrr
```
In your project, import the Agent from the bardrr npm package. Then call the start method on the Agent, passing an object as the argument with the properties appName, the name of this specific application, and endpoint, the location of the replayer. Example:
import Agent from "bardrr"
new Agent().start({appName: "Party App", endpoint: "http://www.myfancyapp.com"});
More information can be found here.
Download the docker-compose file, which can be found here, onto your infrastructure.
Run `docker compose up`.
The replayer will then be running on port 3003.
The agent acts as the recorder for our application. It collects a snapshot of the original DOM and the events that mutate the DOM to make a recording of the session. These events are correlated with a session ID assigned by the agent. You can learn more here.
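As a rough illustration of that idea, the sketch below takes a DOM snapshot and watches mutations with a MutationObserver, tagging everything with a session ID. It is not the agent's actual implementation; the event shapes are invented for illustration.

```js
// Illustrative sketch of the recording idea, not the actual bardrr internals.
const sessionId = crypto.randomUUID(); // every event is correlated with this ID

// 1. Snapshot the original DOM.
const snapshot = {
  sessionId,
  type: "snapshot",
  html: document.documentElement.outerHTML,
  timestamp: Date.now(),
};

// 2. Record the events that mutate the DOM afterwards.
const observer = new MutationObserver((mutations) => {
  const events = mutations.map((m) => ({
    sessionId,
    type: "mutation",
    kind: m.type, // "childList", "attributes", or "characterData"
    target: m.target.nodeName,
    timestamp: Date.now(),
  }));
  // ...batch and send `events` to the configured endpoint
});
observer.observe(document, {
  childList: true,
  attributes: true,
  characterData: true,
  subtree: true,
});
```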
The agent API serves as the ingest point for our application. It first authenticates the agent using JSON web tokens, then collects and parses the session metadata and the events from the agent. The session metadata is stored in the Postgres database, while the events are moved into RabbitMQ. You can learn more here.
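A condensed sketch of that ingest flow, assuming an Express-style handler; the route, port, queue name, table name, and JWT_SECRET variable are illustrative assumptions, not the actual API:

```js
// Illustrative ingest sketch, not the actual agent API code.
import express from "express";
import jwt from "jsonwebtoken";
import amqp from "amqplib";
import pg from "pg";

const app = express();
app.use(express.json());
const pool = new pg.Pool(); // Postgres settings come from the environment
const channel = await (await amqp.connect("amqp://localhost")).createChannel();
await channel.assertQueue("events"); // hypothetical queue name

app.post("/record", async (req, res) => {
  // 1. Authenticate the agent with a JSON web token.
  try {
    jwt.verify(req.headers.authorization?.split(" ")[1], process.env.JWT_SECRET);
  } catch {
    return res.sendStatus(401);
  }
  const { sessionId, events } = req.body;
  // 2. Session metadata stays in Postgres ("pending_sessions" is hypothetical).
  await pool.query(
    "UPDATE pending_sessions SET most_recent_event_time = now() WHERE id = $1",
    [sessionId]
  );
  // 3. Events go to RabbitMQ to wait for insertion into Clickhouse.
  channel.sendToQueue("events", Buffer.from(JSON.stringify(events)));
  res.sendStatus(200);
});

app.listen(3001); // hypothetical port
```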
The Postgres Database is used in two distinct ways. First, it is the temporary storage location for the session metadata while the session is still active. Second, it is used as the storage location for funnel information used by the Replayer. You can see the SQL schema here.
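For illustration, those two roles might look like the following queries, reusing the pool from the sketch above (table and column names are assumptions; the real ones are in the linked schema):

```js
// 1. Temporary home for metadata of sessions that are still active.
const active = await pool.query(
  "SELECT id, app_name, start_time FROM pending_sessions"
);

// 2. Durable home for the funnel definitions the Replayer reads.
const funnels = await pool.query(
  "SELECT name, steps FROM funnels WHERE app_name = $1",
  ["Party App"]
);
```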
RabbitMQ is the connection between the Agent API and the Clickhouse Database. The amount of data coming in from the agent is very large and needs somewhere to queue while the Clickhouse Database inserts it into the events table; RabbitMQ gives the data a place to wait. You can learn more about RabbitMQ here.
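The consuming side of that queue could look roughly like this sketch, which drains messages and batch-inserts them into Clickhouse; the @clickhouse/client package, queue name, and events table are assumptions:

```js
// Illustrative consumer: drain the queue, batch-insert into Clickhouse.
import amqp from "amqplib";
import { createClient } from "@clickhouse/client";

const clickhouse = createClient({ url: "http://localhost:8123" });
const channel = await (await amqp.connect("amqp://localhost")).createChannel();
await channel.assertQueue("events");

channel.consume("events", async (msg) => {
  if (!msg) return;
  const events = JSON.parse(msg.content.toString());
  // The queue absorbs bursts so Clickhouse can insert at its own pace.
  await clickhouse.insert({
    table: "events", // hypothetical table name
    values: events,
    format: "JSONEachRow",
  });
  channel.ack(msg);
});
```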
The Session Ender is a cron job that moves pending sessions from the Postgres database to the Clickhouse database. You can learn more here.
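A minimal sketch of such a job, assuming node-cron, a five-minute inactivity cutoff, and the hypothetical pool, clickhouse, and table names from the sketches above:

```js
// Illustrative session ender: every minute, move inactive sessions
// from Postgres to Clickhouse. Schedule and cutoff are assumptions.
import cron from "node-cron";

cron.schedule("* * * * *", async () => {
  const { rows } = await pool.query(
    "SELECT * FROM pending_sessions WHERE most_recent_event_time < now() - interval '5 minutes'"
  );
  if (rows.length === 0) return;
  // Copy the finished sessions' metadata into Clickhouse...
  await clickhouse.insert({ table: "sessions", values: rows, format: "JSONEachRow" });
  // ...then remove them from Postgres.
  await pool.query("DELETE FROM pending_sessions WHERE id = ANY($1)", [
    rows.map((r) => r.id),
  ]);
});
```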
The Clickhouse Database is the final resting place for all data related to the sessions. It holds all of the metadata and event recordings for all sessions. You can see the Clickhouse schema here.
This is the user interface for our application. The UI allows users to view all sessions, filter sessions based on different criteria, and create user conversion funnels that show exactly where users fell out of the funnel and did not complete the conversion. You can learn more about the Replayer here.
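To make the funnel idea concrete, here is a toy drop-off calculation; the step names and session shape are invented for illustration:

```js
// Toy funnel: count how many sessions reached each step of the funnel in turn.
function funnelDropOff(steps, sessions) {
  return steps.map((step, i) => {
    const prefix = steps.slice(0, i + 1);
    const reached = sessions.filter((events) =>
      prefix.every((s) => events.includes(s))
    ).length;
    return { step, reached };
  });
}

// 3 sessions visited, 2 added to cart, only 1 checked out:
funnelDropOff(
  ["visit", "add_to_cart", "checkout"],
  [
    ["visit", "add_to_cart", "checkout"],
    ["visit", "add_to_cart"],
    ["visit"],
  ]
);
// => [ { step: "visit", reached: 3 },
//      { step: "add_to_cart", reached: 2 },
//      { step: "checkout", reached: 1 } ]
```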
You can learn more about us and our tool here.