Consolidate the Local Docker installer and the dev env

- removes local_docker installer and points community users to our development environment (make docker-compose)
  - provides a migration path from Local Docker Compose installations --> the dev environment
  - the dev env can now be configured to use an external database
  - consolidated the Local Docker and dev env docker-compose.yml files into one template file, used by the dockerfile role
  - added a 'sources' role to template out config files
  - the postgres data dir is no longer a bind-mount, it is a docker volume
  - the redis socket is no longer a bind-mount, it is a docker volume
  - the local_settings.py.docker-compose file no longer needs to be copied over in the dev env
  - create the temporary rsyslog.conf inside the rsyslog volume to avoid cross-device links. Previously, the code-generated rsyslog.conf was written to /tmp (by default), so shutil.move() was attempted across volumes.
  - moved the k8s image build and push roles under tools/ansible
  - See tools/docker-compose/README.md for usage of these changes
Christian M. Adams
2021-01-27 11:01:17 -05:00
parent 0f6d2c36a0
commit 9672e72834
52 changed files with 1325 additions and 808 deletions


@@ -0,0 +1,60 @@
# Migrating Data from Local Docker
If you are coming from a Local Docker installation (17.0.1 and prior), you can
migrate your data to the development environment via the migrate.yml playbook, or by using the manual steps described below.
> Note: This will also convert your postgresql bind-mount into a docker volume.
### Migrate data with migrate.yml
If you had a custom pgdocker or awxcompose location, you will need to set the `postgres_data_dir` and `old_docker_compose_dir` variables.
1. Run the [migrate playbook](./ansible/migrate.yml) to migrate your data to the new postgresql container and convert the data directory to a volume mount.
```bash
$ ansible-playbook migrate.yml -e "migrate_local_docker=true" -e "postgres_data_dir=~/.awx/pgdocker" -e "old_docker_compose_dir=~/.awx/awxcompose"
```
2. Change directory to the top of your awx checkout and start your containers
```bash
$ make docker-compose
```
3. After ensuring your data has been successfully migrated, you may delete your old data directory (typically stored at `~/.awx/pgdocker`).
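The default locations assumed by the playbook can be sketched as shell defaults (these mirror the paths shown above; override them with `-e` if your installation used custom locations):

```bash
# Defaults assumed by migrate.yml; pass -e "postgres_data_dir=..." etc. to override
postgres_data_dir="${postgres_data_dir:-$HOME/.awx/pgdocker}"
old_docker_compose_dir="${old_docker_compose_dir:-$HOME/.awx/awxcompose}"
echo "postgres data dir:      $postgres_data_dir"
echo "old docker-compose dir: $old_docker_compose_dir"
```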
### Migrating data manually
1. With Local Docker still running, perform a pg_dumpall:
> Note: If Local Docker is no longer running, start the postgres container first:
`docker-compose -f ~/.awx/awxcompose/docker-compose.yml up postgres`
```bash
$ docker-compose -f ~/.awx/awxcompose/docker-compose.yml exec postgres pg_dumpall -U awx > awx_dump.sql
```
2. Remove all Local Docker containers (specifically `awx_postgres`)
```bash
$ docker rm -f awx_postgres
```
3. Template the new docker-compose.yml
```bash
$ ansible-playbook -i tools/ansible/inventory tools/ansible/sources.yml
```
4. Start a container with a volume (using the new tools/docker-compose/_sources/docker-compose.yml)
```bash
$ docker-compose -f ../docker-compose/_sources/docker-compose.yml up postgres
```
5. Restore to new `awx_postgres`
```bash
$ docker-compose -f ../docker-compose/_sources/docker-compose.yml exec -T postgres psql -U awx -d awx -p 5432 < awx_dump.sql
```
6. Run the docker-compose.yml to start the containers
```bash
$ docker-compose -f ../docker-compose/_sources/docker-compose.yml up task
```
7. Check to ensure your data migration was successful, then delete the `awx_dump.sql` backup and your old data directory.
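Before deleting the backup or old data directory, a quick sanity check on the dump file can save you from an unrecoverable mistake (a minimal sketch; `awx_dump.sql` is the file produced in step 1):

```bash
# Warn rather than fail, so this is safe to run at any point in the process
dump_file="awx_dump.sql"
if [ -s "$dump_file" ]; then
    echo "dump looks non-empty: $(wc -l < "$dump_file") lines"
else
    echo "WARNING: $dump_file is missing or empty -- keep your old data directory" >&2
fi
```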


@@ -0,0 +1,49 @@
# How to use the logstash container
#### Modify the docker-compose.yml
Uncomment the following lines in the `docker-compose.yml`
```
#- logstash
...
#logstash:
#  build:
#    context: ./docker-compose
#    dockerfile: Dockerfile-logstash
```
POST the following content to `/api/v2/settings/logging/` (the username and password here match the authentication set up inside the logstash configuration file).
```
{
    "LOG_AGGREGATOR_HOST": "http://logstash",
    "LOG_AGGREGATOR_PORT": 8085,
    "LOG_AGGREGATOR_TYPE": "logstash",
    "LOG_AGGREGATOR_USERNAME": "awx_logger",
    "LOG_AGGREGATOR_PASSWORD": "workflows",
    "LOG_AGGREGATOR_LOGGERS": [
        "awx",
        "activity_stream",
        "job_events",
        "system_tracking"
    ],
    "LOG_AGGREGATOR_INDIVIDUAL_FACTS": false,
    "LOG_AGGREGATOR_TOWER_UUID": "991ac7e9-6d68-48c8-bbde-7ca1096653c6",
    "LOG_AGGREGATOR_ENABLED": true
}
```
> Note: `http://` must be included in `LOG_AGGREGATOR_HOST` if you are using the docker development environment.
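It can be useful to save the payload to a file and validate it before sending (a hedged sketch; the file name `logstash_settings.json` is arbitrary, and the abbreviated payload here stands in for the full settings above):

```bash
# Write an abbreviated payload to a scratch file, then confirm it parses as JSON
cat > logstash_settings.json <<'EOF'
{
    "LOG_AGGREGATOR_HOST": "http://logstash",
    "LOG_AGGREGATOR_PORT": 8085,
    "LOG_AGGREGATOR_TYPE": "logstash",
    "LOG_AGGREGATOR_ENABLED": true
}
EOF
python3 -m json.tool logstash_settings.json > /dev/null && echo "payload is valid JSON"
```

You can then POST it with, for example, `curl -u <user>:<password> -X POST -H "Content-Type: application/json" -d @logstash_settings.json <awx-url>/api/v2/settings/logging/`, where the URL and credentials are placeholders for your environment.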
An example of how to view the most recent logs from the container:
```
docker exec -i -t $(docker ps -aqf "name=tools_logstash_1") tail -n 50 /logstash.log
```
#### How to add logstash plugins
Add any plugins you need in `tools/elastic/logstash/Dockerfile` before running the container.
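For example, installing an extra output plugin at image build time could look like the following Dockerfile line (the plugin name `logstash-output-s3` is only an illustration; `logstash-plugin` is the plugin manager shipped with logstash):

```
# Hypothetical example: install an additional plugin during the image build
RUN /usr/share/logstash/bin/logstash-plugin install logstash-output-s3
```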