Ibexa DXP Installation with Redis, PostgreSQL and Elasticsearch

Ibexa DXP version 3.3.2 April 2021
Level: beginner

Next Posts:
* How to move a local installation to Ibexa Cloud
* Using Lando for local development

This post explains how to install Ibexa DXP Experience version 3.3.2 using PostgreSQL as the database and Elasticsearch as the search engine. The same steps apply when installing Ibexa Content. The official Ibexa documentation covers the topic well; this post focuses on the Experience edition and serves as a quick guide.

Note: Ibexa Commerce requires Solr, so it might not be possible to use Elasticsearch there. Be careful if you want to use some of the commerce features freely available in the Experience edition, e.g. autosuggestion.

The installation of Ibexa 3.3.2 differs slightly from the previous 3.0 to 3.2 versions. It introduces a new skeleton concept, installs much faster, and fixes older v3 tags that shipped with incorrect package versions.

All preparation steps, such as Node.js, Yarn, Composer, and token generation, are well documented in: Install Ibexa 3.3

Let's get started.

Environment
For the installation I set up the Docker images below from Docker Hub and put all the recipes in the docker-compose file:
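A minimal docker-compose sketch for such a setup could look like the following; the image tags, ports, and credentials are assumptions for a local environment, not the exact ones used here:

```yaml
# docker-compose.yaml -- local services for Ibexa DXP 3.3
# (image versions, ports and credentials are illustrative assumptions)
version: '3.7'
services:
  postgres:
    image: postgres:12
    environment:
      POSTGRES_USER: ibexa
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: ibexa
    ports:
      - "5432:5432"
  redis:
    image: redis:6
    ports:
      - "6379:6379"
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.10.1
    environment:
      discovery.type: single-node    # single node is enough for local dev
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.10.1
    ports:
      - "5601:5601"
```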

Note: The goal of this post is not the environment setup, as you may have other preferences for the infrastructure.

Create project
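A sketch of the project creation step, assuming the Experience edition; in 3.3 the skeleton package follows the edition (ibexa/content-skeleton, ibexa/experience-skeleton, ibexa/commerce-skeleton), and the project name here is a placeholder:

```shell
# Requires Composer and the updates.ibexa.co authentication token
# from the preparation steps linked above.
composer create-project ibexa/experience-skeleton my-project
cd my-project
```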

Git setups
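The usual initial commit, so later configuration changes stay easy to track (commit message is just a suggestion):

```shell
# Initialize the repository and commit the generated skeleton
git init
git add .
git commit -m "Ibexa DXP 3.3.2 skeleton"
```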

Dev environment file

PostgreSQL configuration
At this step you can create a copy of the .env file for the dev environment, .env.local, and add a database block at the end for better readability.
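The database block could look like this; user, password, host, and database name are assumptions matching a local Docker setup:

```env
# .env.local -- database block (credentials are illustrative assumptions)
DATABASE_URL=postgresql://ibexa:secret@127.0.0.1:5432/ibexa?serverVersion=12&charset=utf8
```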

Redis configuration
The configuration below takes sessions and persistent cache into account.
ℹ️ Ideally, keep the two separated, as Redis will otherwise start to refuse new entries once full, including new sessions.

.env / .env.local
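A sketch of the relevant variables; CACHE_POOL and CACHE_DSN are the names Ibexa 3.3 ships with, while REDIS_SESSION_URL is an assumed custom variable pointing sessions at a separate Redis database, as recommended above:

```env
# .env.local -- Redis for persistence cache and sessions
CACHE_POOL=cache.redis
CACHE_DSN=127.0.0.1:6379
# assumed custom variable: separate Redis database (1) for sessions
REDIS_SESSION_URL=redis://127.0.0.1:6379/1
```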

Adapt the yaml configuration:

config/packages/ezplatform.yaml
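A sketch of the persistence-cache setting; the cache_service_name key points Ibexa at a Symfony cache pool, here assumed to be a pool named cache.redis defined in the framework cache configuration:

```yaml
# config/packages/ezplatform.yaml (assumed pool name: cache.redis)
ezplatform:
    system:
        default:
            cache_service_name: cache.redis
```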

config/packages/framework.yaml
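A sketch of the matching Symfony configuration, assuming the pool name cache.redis and the env variable names from the .env.local block shown earlier:

```yaml
# config/packages/framework.yaml
framework:
    cache:
        pools:
            cache.redis:
                adapter: cache.adapter.redis
                provider: '%env(CACHE_DSN)%'
    session:
        # Symfony accepts a Redis DSN directly as session handler
        handler_id: '%env(REDIS_SESSION_URL)%'
```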

Once the installation is done (see the "Install Ibexa DXP" section below), to make sure that data is stored in Redis you can review the cache pool used in the Symfony Profiler toolbar, or simply monitor the data in Redis:
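For example, with the redis-cli tool (assuming Redis listens on the default port):

```shell
# Watch commands hitting Redis in real time (Ctrl+C to stop)
redis-cli monitor
# ...or list some of the stored cache keys
redis-cli --scan | head -n 20
```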

For sessions, make sure that sessions are saved in Redis:

ℹ️ You can check the session cookie saved in the browser and compare it to the value in Redis.
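Assuming sessions live in Redis database 1 as in the configuration sketched above, the session keys can be listed like this and compared with the browser's session cookie value:

```shell
# List session entries in Redis database 1
redis-cli -n 1 --scan --pattern '*sess*'
```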

Note: If you flush the Redis cache, you are automatically logged out.

Create APP SECRET
Below are different ways to create the APP_SECRET:
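Two common options, either via PHP or OpenSSL; both produce a random 32-character hex string:

```shell
# Generate a random secret with PHP...
php -r "echo bin2hex(random_bytes(16)) . PHP_EOL;"
# ...or with OpenSSL
openssl rand -hex 16
```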

Adapt the APP_SECRET value in .env as well as in .env.local.

Install Ibexa DXP and create a database
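A sketch of the installation commands; as of 3.3 the console commands use the ibexa: prefix (formerly ezplatform:), so check `php bin/console list ibexa` if your version differs:

```shell
# Create the database if it does not exist yet
php bin/console doctrine:database:create --if-not-exists
# Create the schema and load the initial data
php bin/console ibexa:install
```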

Output:

ℹ️ Later we will use Elasticsearch instead of the default legacy search engine.

If you have installed pgAdmin, you can check the different tables created:

pgadmin

Generate graphql schema
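The schema generation step, assuming the 3.3 command naming (ibexa: prefix):

```shell
php bin/console ibexa:graphql:generate-schema
```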

Run post scripts
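This runs the scripts defined under the post-install-cmd key in composer.json (cache clear, assets install, frontend build), a sketch assuming the default skeleton scripts:

```shell
composer run-script post-install-cmd
```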

Run Symfony Server using php7.4
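The Symfony CLI picks up a `.php-version` file to select the PHP binary, so the server can be pinned to PHP 7.4 like this:

```shell
# Pin the PHP version for this project, then start the local server
echo 7.4 > .php-version
symfony server:start -d
```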

Open the browser and go to 127.0.0.1:8000

Backend Login
Browse to 127.0.0.1:8000/admin and login with default credentials:
username: admin
password: publish

In Admin > System Information you will see something like the screen below:

Ibexa DXP 3.3.2 Experience

Elasticsearch configuration
Switch from legacy to Elasticsearch
.env / .env.local
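The switch is done via two environment variables; these names ship with Ibexa 3.3, while the DSN value is an assumption for a local single-node Elasticsearch:

```env
# .env.local -- switch the search engine from legacy to Elasticsearch
SEARCH_ENGINE=elasticsearch
ELASTICSEARCH_DSN=http://127.0.0.1:9200
```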

Note: At this step the content will not be displayed correctly in the backend.

Push the templates
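A sketch of the template push, assuming the 3.3 command naming (check `php bin/console list ibexa` if it differs in your version):

```shell
# Push the Elasticsearch index templates
php bin/console ibexa:elasticsearch:put-index-template
```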

Reindex the database
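Reindexing, again assuming the 3.3 ibexa: command prefix:

```shell
php bin/console ibexa:reindex
```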

You can get some information about the content using the CLI:
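For example with curl against the Elasticsearch REST API (assuming it listens on the default port 9200):

```shell
# List the indices and their document counts
curl 'http://127.0.0.1:9200/_cat/indices?v'
# ...or inspect a few indexed documents
curl 'http://127.0.0.1:9200/_search?size=3&pretty'
```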

or using the Kibana console:

Kibana UI

At this step the content is available again in the backend.