Adding ability to start and plumb splunk instance (#12183)

John Westcott IV 2022-05-09 09:50:28 -04:00 committed by GitHub
parent 385a94866c
commit a86740c3c9
6 changed files with 113 additions and 2 deletions

View File

@@ -15,6 +15,8 @@ MAIN_NODE_TYPE ?= hybrid
KEYCLOAK ?= false
# If set to true docker-compose will also start an ldap instance
LDAP ?= false
# If set to true docker-compose will also start a splunk instance
SPLUNK ?= false
VENV_BASE ?= /var/lib/awx/venv
@@ -466,7 +468,8 @@ docker-compose-sources: .git/hooks/pre-commit
-e execution_node_count=$(EXECUTION_NODE_COUNT) \
-e minikube_container_group=$(MINIKUBE_CONTAINER_GROUP) \
-e enable_keycloak=$(KEYCLOAK) \
-e enable_ldap=$(LDAP)
-e enable_ldap=$(LDAP) \
-e enable_splunk=$(SPLUNK)
docker-compose: awx/projects docker-compose-sources

View File

@@ -245,6 +245,7 @@ $ make docker-compose
- [Start with Minikube](#start-with-minikube)
- [Keycloak Integration](#keycloak-integration)
- [OpenLDAP Integration](#openldap-integration)
- [Splunk Integration](#splunk-integration)
### Start a Shell
@@ -406,7 +407,7 @@ LDAP=true make docker-compose
Once the containers come up two new ports (389, 636) should be exposed and the LDAP server should be running on those ports. The first port (389) is non-SSL and the second port (636) is SSL enabled.
Now we are ready to configure and plumb OpenLDAP with AWX. To do this we have provided a playbook which will:
* Backup and configure the LDAP adapter in AWX. NOTE: this will back up your existing settings but the password fields can not be backuped through the API, you need a DB backup to recover this.
* Backup and configure the LDAP adapter in AWX. NOTE: this will back up your existing settings but the password fields can not be backed up through the API, you need a DB backup to recover this.
Note: The default configuration will utilize the non-TLS connection. If you want to use the TLS configuration you will need to work through TLS negotiation issues because the LDAP server is using a self-signed certificate.
@@ -427,3 +428,34 @@ Once the playbook is done running LDAP should now be setup in your development e
4. awx_ldap_org_admin:orgadmin123
The first account is a normal user. The second account will be a super user in AWX. The third account will be a system auditor in AWX. The fourth account is an org admin. All users belong to an org called "LDAP Organization". To log in with one of these users, go to the AWX login screen and enter the username/password.
### Splunk Integration
Splunk is a log aggregation tool that can be used to test AWX with external logging integration. This section describes how to build a reference Splunk instance and plumb it with your AWX for testing purposes.
First, be sure that you have the awx.awx collection installed by running `make install_collection`.
Next, install the splunk.es collection by running `ansible-galaxy collection install splunk.es`.
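For convenience, both installs can be run back to back from the repository root:
```bash
# Install the awx.awx collection used by the plumbing playbook
make install_collection
# Install the splunk.es collection used to configure the Splunk TCP input
ansible-galaxy collection install splunk.es
```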
Any time you want to run a Splunk instance alongside AWX, start docker-compose with the SPLUNK option:
```bash
SPLUNK=true make docker-compose
```
Once the containers come up, three new ports (8000, 8089 and 9199) should be exposed and the Splunk server should be running on some of those ports (the 9199 listener will be created later by the plumbing playbook). The first port (8000) is the non-SSL admin port; you can log into Splunk with the credentials admin/splunk_admin at a URL like http://<server>:8000/, which will be referenced below. Port 8089 is the API port that the Ansible modules will use to connect to and configure Splunk. Port 9199 will be used to construct a TCP listener in Splunk that AWX will forward messages to.
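If you want to sanity-check the API port before plumbing, a quick call to Splunk's REST endpoint on 8089 should return server information. This is an optional sketch and assumes you are running it from the docker host:
```bash
# Optional: confirm the Splunk management API is up (self-signed cert, hence -k)
curl -k -u admin:splunk_admin https://localhost:8089/services/server/info
```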
Once the containers are up we are ready to configure and plumb Splunk with AWX. To do this we have provided a playbook which will:
* Backup and configure the External Logging adapter in AWX. NOTE: this will back up your existing settings but the password fields can not be backed up through the API, you need a DB backup to recover this.
* Create a TCP port in Splunk for log forwarding
For routing traffic between AWX and Splunk we will use the internal docker compose network. The `Logging Aggregator` will be configured using the internal network machine name of `splunk`.
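If you want to confirm that name resolution works before running the playbook, you can check it from inside the AWX container. This is an optional sketch; the container name below assumes the default development environment naming and may differ on your machine:
```bash
# Optional: verify the AWX container can resolve the splunk hostname on the compose network
docker exec tools_awx_1 getent hosts splunk
```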
Once you have the collections installed (from above) you can run the playbook like:
```bash
export CONTROLLER_USERNAME=<your username>
export CONTROLLER_PASSWORD=<your password>
ansible-playbook tools/docker-compose/ansible/plumb_splunk.yml
```
Once the playbook is done running, Splunk should now be set up in your development environment. You can log into the admin console (see above for the username/password) and click on "Searching and Reporting" in the left-hand navigation. In the search box, enter `source="http:tower_logging_collections"` and click search.
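The same search can also be run through Splunk's REST API if you prefer the command line. This is an optional sketch using the admin credentials from above:
```bash
# Optional: submit the same search as a job via the Splunk REST API
curl -k -u admin:splunk_admin https://localhost:8089/services/search/jobs \
  --data-urlencode 'search=search source="http:tower_logging_collections"'
```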

View File

@@ -0,0 +1,51 @@
---
- name: Plumb a splunk instance
  hosts: localhost
  connection: local
  gather_facts: False
  vars:
    awx_host: "https://localhost:8043"
  collections:
    - splunk.es

  tasks:
    # Create the TCP listener (port 9199) that AWX will forward log messages to
    - name: create splunk_data_input_network
      splunk.es.data_input_network:
        name: "9199"
        protocol: "tcp"
        source: "http:tower_logging_collections"
        sourcetype: "httpevent"
        state: "present"
      # Connection details for the Splunk container started by docker-compose
      vars:
        ansible_network_os: splunk.es.splunk
        ansible_user: admin
        ansible_httpapi_pass: splunk_admin
        ansible_httpapi_port: 8089
        ansible_httpapi_use_ssl: yes
        ansible_httpapi_validate_certs: False
        ansible_connection: httpapi

    - name: Load existing and new Logging settings
      set_fact:
        existing_logging: "{{ lookup('awx.awx.controller_api', 'settings/logging', host=awx_host, verify_ssl=false) }}"
        new_logging: "{{ lookup('template', 'logging.json.j2') }}"

    - name: Display existing Logging configuration
      debug:
        msg:
          - "Here is your existing Logging configuration for reference:"
          - "{{ existing_logging }}"

    - pause:
        prompt: "Continuing to run this will replace your existing logging settings (displayed above). They will all be captured except for your connection password. Be sure that is backed up before continuing"

    - name: Write out the existing content
      copy:
        dest: "../_sources/existing_logging.json"
        content: "{{ existing_logging }}"

    - name: Configure AWX logging adapter
      awx.awx.settings:
        settings: "{{ new_logging }}"
        controller_host: "{{ awx_host }}"
        validate_certs: False

View File

@@ -27,3 +27,5 @@ ldap_diff_dir: '{{ sources_dest }}/ldap_diffs'
ldap_public_key_file: '{{ ldap_cert_dir }}/{{ ldap_public_key_file_name }}'
ldap_private_key_file: '{{ ldap_cert_dir }}/{{ ldap_private_key_file_name }}'
ldap_cert_subject: "/C=US/ST=NC/L=Durham/O=awx/CN="
enable_splunk: false

View File

@@ -122,6 +122,19 @@ services:
      - 'openldap_data:/bitnami/openldap'
      - '../../docker-compose/_sources/ldap_certs:/opt/bitnami/openldap/certs'
      - '../../docker-compose/_sources/ldap_diffs:/opt/bitnami/openldap/ldiffs'
{% endif %}
{% if enable_splunk|bool %}
  splunk:
    image: splunk/splunk:latest
    container_name: tools_splunk_1
    hostname: splunk
    ports:
      - "8000:8000"
      - "8089:8089"
      - "9199:9199"
    environment:
      SPLUNK_START_ARGS: --accept-license
      SPLUNK_PASSWORD: splunk_admin
{% endif %}
# A useful container that simply passes through log messages to the console
# helpful for testing awx/tower logging

View File

@@ -0,0 +1,10 @@
{
  "LOG_AGGREGATOR_HOST": "splunk",
  "LOG_AGGREGATOR_PORT": 9199,
  "LOG_AGGREGATOR_TYPE": "splunk",
  "LOG_AGGREGATOR_USERNAME": "admin",
  "LOG_AGGREGATOR_PASSWORD": "splunk_admin",
  "LOG_AGGREGATOR_ENABLED": true,
  "LOG_AGGREGATOR_PROTOCOL": "tcp",
  "LOG_AGGREGATOR_VERIFY_CERT": false
}