Mirror of https://github.com/ansible/awx.git, synced 2026-01-10 15:32:07 -03:30

Merge branch 'devel' into patch-1
Commit 9e981583a6

17  .github/BOTMETA.yml (vendored)
@@ -1,17 +0,0 @@
---
files:
  awx/ui/:
    labels: component:ui
    maintainers: $team_ui
  awx/api/:
    labels: component:api
    maintainers: $team_api
  awx/main/:
    labels: component:api
    maintainers: $team_api
  installer/:
    labels: component:installer

macros:
  team_api: wwitzel3 matburt chrismeyersfsu cchurch AlanCoding ryanpetrello rooftopcellist
  team_ui: jlmitch5 jaredevantabor mabashian marshmalien benthomasson jakemcdermott
1  .github/CODEOWNERS (vendored)
@@ -1 +0,0 @@
workflows/e2e_test.yml @tiagodread @shanemcd @jakemcdermott
49  .github/ISSUE_TEMPLATE.md (vendored)
@@ -1,49 +0,0 @@
<!---
The Ansible community is highly committed to the security of our open source
projects. Security concerns should be reported directly by email to
security@ansible.com. For more information on the Ansible community's
practices regarding responsible disclosure, see
https://www.ansible.com/security
-->

##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
- Feature Idea
- Documentation

##### COMPONENT NAME
<!-- Pick the area of AWX for this issue, you can have multiple, delete the rest: -->
- API
- UI
- Installer

##### SUMMARY
<!-- Briefly describe the problem. -->

##### ENVIRONMENT
* AWX version: X.Y.Z
* AWX install method: operator, developer environment
* AWX deployment target: openshift, kubernetes, minikube
* Operating System:
* Web Browser:

##### STEPS TO REPRODUCE

<!-- For new features, show how the feature would be used. For bugs, please show
exactly how to reproduce the problem. Ideally, provide all steps and data needed
to recreate the bug from a new awx install. -->

##### EXPECTED RESULTS

<!-- For bug reports, what did you expect to happen when running the steps
above? -->

##### ACTUAL RESULTS

<!-- For bug reports, what actually happened? -->

##### ADDITIONAL INFORMATION

<!-- Include any links to sosreport, database dumps, screenshots or other
information. -->
32  .github/ISSUE_TEMPLATE/bug_report.yml (vendored)
@@ -1,32 +1,29 @@
---
name: Bug Report
description: Create a report to help us improve
labels:
  - bug
description: "🐞 Create a report to help us improve"
body:
  - type: markdown
    attributes:
      value: |
        Issues are for **concrete, actionable bugs and feature requests** only. For debugging help or technical support, please use:
        - The #ansible-awx channel on irc.libera.chat
        - https://groups.google.com/forum/#!forum/awx-project
        Bug Report issues are for **concrete, actionable bugs** only.
        For debugging help or technical support, please see the [Get Involved section of our README](https://github.com/ansible/awx#get-involved)

  - type: checkboxes
    id: terms
    attributes:
      label: Please confirm the following
      options:
        - label: I agree to follow this project's [code of conduct](http://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
        - label: I agree to follow this project's [code of conduct](https://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
          required: true
        - label: I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.
          required: true
        - label: I understand that AWX is open source software provided for free and that I am not entitled to status updates or other assurances.
        - label: I understand that AWX is open source software provided for free and that I might not receive a timely response.
          required: true

  - type: textarea
    id: summary
    attributes:
      label: Summary
      label: Bug Summary
      description: Briefly describe the problem.
    validations:
      required: false
@@ -39,6 +36,18 @@ body:
    validations:
      required: true

  - type: checkboxes
    id: components
    attributes:
      label: Select the relevant components
      options:
        - label: UI
        - label: API
        - label: Docs
        - label: Collection
        - label: CLI
        - label: Other

  - type: dropdown
    id: awx-install-method
    attributes:
@@ -50,9 +59,8 @@ body:
        - minikube
        - openshift
        - minishift
        - docker on linux
        - docker for mac
        - boot2docker
        - docker development environment
        - N/A
    validations:
      required: true
12  .github/ISSUE_TEMPLATE/config.yml (vendored, new file)
@@ -0,0 +1,12 @@
---
blank_issues_enabled: false
contact_links:
  - name: For debugging help or technical support
    url: https://github.com/ansible/awx#get-involved
    about: For general debugging or technical support please see the Get Involved section of our readme.
  - name: 📝 Ansible Code of Conduct
    url: https://docs.ansible.com/ansible/latest/community/code_of_conduct.html?utm_medium=github&utm_source=issue_template_chooser
    about: AWX uses the Ansible Code of Conduct; ❤ Be nice to other members of the community. ☮ Behave.
  - name: 💼 For Enterprise
    url: https://www.ansible.com/products/engine?utm_medium=github&utm_source=issue_template_chooser
    about: Red Hat offers support for the Ansible Automation Platform
17  .github/ISSUE_TEMPLATE/feature_request.md (vendored)
@@ -1,17 +0,0 @@
---
name: "✨ Feature request"
about: Suggest an idea for this project

---
<!-- Issues are for **concrete, actionable bugs and feature requests** only - if you're just asking for debugging help or technical support, please use:

- http://web.libera.chat/?channels=#ansible-awx
- https://groups.google.com/forum/#!forum/awx-project

We have to limit this because of limited volunteer time to respond to issues! -->

##### ISSUE TYPE
- Feature Idea

##### SUMMARY
<!-- Briefly describe the problem or desired enhancement. -->
42  .github/ISSUE_TEMPLATE/feature_request.yml (vendored, new file)
@@ -0,0 +1,42 @@
---
name: ✨ Feature request
description: Suggest an idea for this project
body:
  - type: markdown
    attributes:
      value: |
        Feature Request issues are for **feature requests** only.
        For debugging help or technical support, please see the [Get Involved section of our README](https://github.com/ansible/awx#get-involved)

  - type: checkboxes
    id: terms
    attributes:
      label: Please confirm the following
      options:
        - label: I agree to follow this project's [code of conduct](https://docs.ansible.com/ansible/latest/community/code_of_conduct.html).
          required: true
        - label: I have checked the [current issues](https://github.com/ansible/awx/issues) for duplicates.
          required: true
        - label: I understand that AWX is open source software provided for free and that I might not receive a timely response.
          required: true

  - type: textarea
    id: summary
    attributes:
      label: Feature Summary
      description: Briefly describe the desired enhancement.
    validations:
      required: true

  - type: checkboxes
    id: components
    attributes:
      label: Select the relevant components
      options:
        - label: UI
        - label: API
        - label: Docs
        - label: Collection
        - label: CLI
        - label: Other
9  .github/LABEL_MAP.md (vendored)
@@ -1,9 +0,0 @@
Bug Report: type:bug
Bugfix Pull Request: type:bug
Feature Request: type:enhancement
Feature Pull Request: type:enhancement
UI: component:ui
API: component:api
Installer: component:installer
Docs Pull Request: component:docs
Documentation: component:docs
18  .github/PULL_REQUEST_TEMPLATE.md (vendored)
@@ -1,11 +1,3 @@
<!--- changelog-entry
# Fill in 'msg' below to have an entry automatically added to the next release changelog.
# Leaving 'msg' blank will not generate a changelog entry for this PR.
# Please ensure this is a simple (and readable) one-line string.
---
msg: ""
-->

##### SUMMARY
<!--- Describe the change, including rationale and design decisions -->

@@ -17,14 +9,18 @@ the change does.

##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Feature Pull Request
- Bugfix Pull Request
- Docs Pull Request
- Breaking Change
- New or Enhanced Feature
- Bug or Docs Fix

##### COMPONENT NAME
<!--- Name of the module/plugin/module/task -->
- API
- UI
- Collection
- CLI
- Docs
- Other

##### AWX VERSION
<!--- Paste verbatim output from `make VERSION` between quotes below -->
20  .github/dependabot.yml (vendored, new file)
@@ -0,0 +1,20 @@
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/awx/ui"
    schedule:
      interval: "monthly"
    open-pull-requests-limit: 5
    allow:
      - dependency-type: "production"
    reviewers:
      - "AlexSCorey"
      - "keithjgrant"
      - "kialam"
      - "mabashian"
      - "marshmalien"
      - "nixocio"
    labels:
      - "component:ui"
      - "dependencies"
    target-branch: "devel"
16  .github/issue_labeler.yml (vendored, new file)
@@ -0,0 +1,16 @@
needs_triage:
  - '.*'
"type:bug":
  - "Bug Summary"
"type:enhancement":
  - "Feature Summary"
"component:ui":
  - "\\[X\\] UI"
"component:api":
  - "\\[X\\] API"
"component:docs":
  - "\\[X\\] Docs"
"component:awx_collection":
  - "\\[X\\] Collection"
"component:cli":
  - "\\[X\\] awxkit"
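The labeler config above pairs each label with regular expressions that github/issue-labeler runs against the text of a newly opened issue. As a rough illustration only (the rendered issue body and the tiny matcher below are assumptions, not part of this commit), this is the effect when a reporter ticks the "UI" checkbox in the bug form:

```python
import re

# Hypothetical Markdown body GitHub would build from bug_report.yml when the
# reporter fills in "Bug Summary" and checks the UI component box.
issue_body = """### Bug Summary
The job output page hangs after 10,000 lines.

### Select the relevant components
- [X] UI
- [ ] API
"""

# A stripped-down version of the patterns in issue_labeler.yml above.
patterns = {
    "needs_triage": r".*",
    "type:bug": r"Bug Summary",
    "component:ui": r"\[X\] UI",
}

labels = [name for name, pattern in patterns.items() if re.search(pattern, issue_body)]
print(labels)  # ['needs_triage', 'type:bug', 'component:ui']
```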
19  .github/pr_labeler.yml (vendored, new file)
@@ -0,0 +1,19 @@
"component:api":
  - any: ["awx/**/*", "!awx/ui/**"]

"component:ui":
  - any: ["awx/ui/**/*"]

"component:docs":
  - any: ["docs/**/*"]

"component:cli":
  - any: ["awxkit/**/*"]

"component:awx_collection":
  - any: ["awx_collection/**/*"]

"dependencies":
  - any: ["awx/ui/package.json"]
  - any: ["awx/requirements/*.txt"]
  - any: ["awx/requirements/requirements.in"]
93  .github/triage_replies.md (vendored, new file)
@@ -0,0 +1,93 @@
## General
- For the roundup of all the different mailing lists available from AWX, Ansible, and beyond visit: https://docs.ansible.com/ansible/latest/community/communication.html
- Hello, we think your question is answered in our FAQ. Does this: https://www.ansible.com/products/awx-project/faq cover your question?
- You can find the latest documentation here: https://docs.ansible.com/automation-controller/latest/html/userguide/index.html

## PRs/Issues

### Visit our mailing list
- Hello, this appears to be less of a bug report or feature request and more of a question. Could you please ask this on our mailing list? See https://github.com/ansible/awx/#get-involved for information on ways to connect with us.

### Denied Submission
- Hi! \
\
Thanks very much for your submission to AWX. It means a lot to us that you have taken time to contribute. \
\
At this time we do not want to merge this PR. Our reasons for this are: \
\
(A) INSERT ITEM HERE \
\
Please know that we are always up for discussion but this project is very active. Because of this, we're unlikely to see comments made on closed PRs, and we lock them after some time. If you or anyone else has any further questions, please let us know by using any of the communication methods listed in the page below: \
\
https://github.com/ansible/awx/#get-involved \
\
In the future, sometimes starting a discussion on the development list prior to implementing a feature can make getting things included a little easier, but it is not always necessary. \
\
Thank you once again for this and your interest in AWX!

### No Progress
- Hi! \
\
Thank you very much for your submission to AWX. It means a lot to us that you have taken time to contribute. \
\
On this PR, changes were requested but it has been some time since then. We think this PR has merit but without the requested changes we are unable to merge it. At this time we are closing your PR. If you get time to address the changes you are welcome to open another PR or we can reopen this PR upon request if you contact us by using any of the communication methods listed in the page below: \
\
https://github.com/ansible/awx/#get-involved \
\
Thank you once again for this and your interest in AWX!

## Common

### Give us more info
- Hello, we'd love to help, but we need a little more information about the problem you're having. Screenshots, log outputs, or any reproducers would be very helpful.

### Code of Conduct
- Hello. Please keep in mind that Ansible adheres to a Code of Conduct in its community spaces. The spirit of the code of conduct is to be kind, and this is your friendly reminder to be so. Please see the full code of conduct here if you have questions: https://docs.ansible.com/ansible/latest/community/code_of_conduct.html

## Mailing List Triage

### Create an issue
- Hello, thanks for reaching out on list. We think this merits an issue on our Github, https://github.com/ansible/awx/issues. If you could open an issue up on Github it will get tagged and integrated into our planning and workflow. All future work will be tracked there. Issues should include as much information as possible, including screenshots, log outputs, or any reproducers.

### Create a Pull Request
- Hello, we think your idea is good! Please consider contributing a PR for this following our contributing guidelines: https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md

### Receptor
- You can find the receptor docs here: https://receptor.readthedocs.io/en/latest/
- Hello, your issue seems related to receptor. Could you please open an issue in the receptor repository? https://github.com/ansible/receptor. Thanks!

### Ansible Engine not AWX
- Hello, your question seems to be about Ansible development, not about AWX. Try asking on the Ansible-devel specific mailing list: https://groups.google.com/g/ansible-devel
- Hello, your question seems to be about using Ansible, not about AWX. https://groups.google.com/g/ansible-project is the best place to visit for user questions about Ansible. Thanks!

### Ansible Galaxy not AWX
- Hey there. That sounds like an FAQ question. Did this: https://www.ansible.com/products/awx-project/faq cover your question?

### Contributing Guidelines
- AWX: https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md
- AWX-Operator: https://github.com/ansible/awx-operator/blob/devel/CONTRIBUTING.md

### AWX Release
- Hi all, \
\
We're happy to announce that the next release of AWX, version <X> is now available! \
In addition AWX Operator version <Y> has also been released! \
\
Please see the releases pages for more details: \
AWX: https://github.com/ansible/awx/releases/tag/<X> \
Operator: https://github.com/ansible/awx-operator/releases/tag/<Y> \
\
The AWX team.

## Try latest version
- Hello, this issue pertains to an older version of AWX. Try upgrading to the latest version and let us know if that resolves your issue.
184  .github/workflows/ci.yml (vendored)
@@ -5,14 +5,51 @@ env:
on:
  pull_request:
jobs:
  api-test:
  common-tests:
    name: ${{ matrix.tests.name }}
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    strategy:
      fail-fast: false
      matrix:
        tests:
          - name: api-test
            command: /start_tests.sh
            label: Run API Tests
          - name: api-lint
            command: /var/lib/awx/venv/awx/bin/tox -e linters
            label: Run API Linters
          - name: api-swagger
            command: /start_tests.sh swagger
            label: Generate API Reference
          - name: awx-collection
            command: /start_tests.sh test_collection_all
            label: Run Collection Tests
          - name: api-schema
            label: Check API Schema
            command: /start_tests.sh detect-schema-change SCHEMA_DIFF_BASE_BRANCH=${{ github.event.pull_request.base.ref }}
          - name: ui-lint
            label: Run UI Linters
            command: make ui-lint
          - name: ui-test-screens
            label: Run UI Screens Tests
            command: make ui-test-screens
          - name: ui-test-general
            label: Run UI General Tests
            command: make ui-test-general
    steps:
      - uses: actions/checkout@v2

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
@@ -25,18 +62,23 @@ jobs:
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Run API Tests
      - name: ${{ matrix.tests.label }}
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} /start_tests.sh
  api-lint:
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} ${{ matrix.tests.command }}
  dev-env:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
@@ -49,130 +91,12 @@ jobs:
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Run API Linters
      - name: Run smoke test
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} /var/lib/awx/venv/awx/bin/tox -e linters
  api-swagger:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2
          export DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }}
          export COMPOSE_TAG=${{ env.BRANCH }}
          ansible-playbook tools/docker-compose/ansible/smoke-test.yml -e repo_dir=$(pwd) -v

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :

      - name: Build image
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Generate API Reference
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} /start_tests.sh swagger
  awx-collection:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :

      - name: Build image
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Run Collection Tests
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} /start_tests.sh test_collection_all
  api-schema:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :

      - name: Build image
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Check API Schema
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} /start_tests.sh detect-schema-change
  ui-lint:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :

      - name: Build image
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Run UI Linters
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} make ui-lint
  ui-test:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} || :

      - name: Build image
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${{ env.BRANCH }} make docker-compose-build

      - name: Run UI Tests
        run: |
          docker run -u $(id -u) --rm -v ${{ github.workspace}}:/awx_devel/:Z \
            --workdir=/awx_devel ghcr.io/${{ github.repository_owner }}/awx_devel:${{ env.BRANCH }} make ui-test
  awx-operator:
    runs-on: ubuntu-latest
    steps:
@@ -207,7 +131,7 @@ jobs:
          ansible-galaxy collection install -r molecule/requirements.yml
          sudo rm -f $(which kustomize)
          make kustomize
          KUSTOMIZE_PATH=$(readlink -f bin/kustomize) molecule test -s kind
          KUSTOMIZE_PATH=$(readlink -f bin/kustomize) molecule -v test -s kind
        env:
          AWX_TEST_IMAGE: awx
          AWX_TEST_VERSION: ci
30  .github/workflows/devel_image.yml (vendored)
@@ -1,30 +0,0 @@
---
name: Push Development Image
on:
  push:
    branches:
      - devel
jobs:
  push:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${GITHUB_REF##*/}

      - name: Build image
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${GITHUB_REF##*/} make docker-compose-build

      - name: Push image
        run: |
          docker push ghcr.io/${{ github.repository_owner }}/awx_devel:${GITHUB_REF##*/}
43  .github/workflows/devel_images.yml (vendored, new file)
@@ -0,0 +1,43 @@
---
name: Build/Push Development Images
on:
  push:
    branches:
      - devel
      - release_*
jobs:
  push:
    if: endsWith(github.repository, '/awx') || startsWith(github.ref, 'refs/heads/release_')
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: actions/checkout@v2

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin

      - name: Pre-pull image to warm build cache
        run: |
          docker pull ghcr.io/${{ github.repository_owner }}/awx_devel:${GITHUB_REF##*/} || :
          docker pull ghcr.io/${{ github.repository_owner }}/awx_kube_devel:${GITHUB_REF##*/} || :

      - name: Build images
        run: |
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${GITHUB_REF##*/} make docker-compose-build
          DEV_DOCKER_TAG_BASE=ghcr.io/${{ github.repository_owner }} COMPOSE_TAG=${GITHUB_REF##*/} make awx-kube-dev-build

      - name: Push image
        run: |
          docker push ghcr.io/${{ github.repository_owner }}/awx_devel:${GITHUB_REF##*/}
          docker push ghcr.io/${{ github.repository_owner }}/awx_kube_devel:${GITHUB_REF##*/}
8  .github/workflows/e2e_test.yml (vendored)
@@ -18,6 +18,14 @@ jobs:
    steps:
      - uses: actions/checkout@v2

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Install system deps
        run: sudo apt-get install -y gettext
21  .github/workflows/label_issue.yml (vendored, new file)
@@ -0,0 +1,21 @@
name: Label Issue

on:
  issues:
    types:
      - opened
      - reopened

jobs:
  triage:
    runs-on: ubuntu-latest
    name: Label Issue

    steps:
      - name: Label Issue
        uses: github/issue-labeler@v2.4.1
        with:
          repo-token: "${{ secrets.GITHUB_TOKEN }}"
          not-before: 2021-12-07T07:00:00Z
          configuration-path: .github/issue_labeler.yml
          enable-versioned-regex: 0
20  .github/workflows/label_pr.yml (vendored, new file)
@@ -0,0 +1,20 @@
name: Label PR

on:
  pull_request_target:
    types:
      - opened
      - reopened
      - synchronize

jobs:
  triage:
    runs-on: ubuntu-latest
    name: Label PR

    steps:
      - name: Label PR
        uses: actions/labeler@v3
        with:
          repo-token: "${{ secrets.GITHUB_TOKEN }}"
          configuration-path: .github/pr_labeler.yml
47  .github/workflows/promote.yml (vendored)
@@ -8,6 +8,53 @@ jobs:
  promote:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout awx
        uses: actions/checkout@v2

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Install dependencies
        run: |
          python${{ env.py_version }} -m pip install wheel twine

      - name: Set official collection namespace
        run: echo collection_namespace=awx >> $GITHUB_ENV
        if: ${{ github.repository_owner == 'ansible' }}

      - name: Set unofficial collection namespace
        run: echo collection_namespace=${{ github.repository_owner }} >> $GITHUB_ENV
        if: ${{ github.repository_owner != 'ansible' }}

      - name: Build collection and publish to galaxy
        run: |
          COLLECTION_TEMPLATE_VERSION=true COLLECTION_NAMESPACE=${{ env.collection_namespace }} make build_collection
          ansible-galaxy collection publish \
            --token=${{ secrets.GALAXY_TOKEN }} \
            awx_collection_build/${{ env.collection_namespace }}-awx-${{ github.event.release.tag_name }}.tar.gz

      - name: Set official pypi info
        run: echo pypi_repo=pypi >> $GITHUB_ENV
        if: ${{ github.repository_owner == 'ansible' }}

      - name: Set unofficial pypi info
        run: echo pypi_repo=testpypi >> $GITHUB_ENV
        if: ${{ github.repository_owner != 'ansible' }}

      - name: Build awxkit and upload to pypi
        run: |
          cd awxkit && python3 setup.py bdist_wheel
          twine upload \
            -r ${{ env.pypi_repo }} \
            -u ${{ secrets.PYPI_USERNAME }} \
            -p ${{ secrets.PYPI_PASSWORD }} \
            dist/*

      - name: Log in to GHCR
        run: |
          echo ${{ secrets.GITHUB_TOKEN }} | docker login ghcr.io -u ${{ github.actor }} --password-stdin
24  .github/workflows/stage.yml (vendored)
@@ -43,6 +43,14 @@ jobs:
        with:
          path: awx

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Checkout awx-logos
        uses: actions/checkout@v2
        with:
@@ -75,7 +83,8 @@ jobs:
      - name: Build and stage awx-operator
        working-directory: awx-operator
        run: |
          BUILD_ARGS="--build-arg DEFAULT_AWX_VERSION=${{ github.event.inputs.version }}" \
          BUILD_ARGS="--build-arg DEFAULT_AWX_VERSION=${{ github.event.inputs.version }} \
            --build-arg OPERATOR_VERSION=${{ github.event.inputs.operator_version }}" \
            IMAGE_TAG_BASE=ghcr.io/${{ github.repository_owner }}/awx-operator \
            VERSION=${{ github.event.inputs.operator_version }} make docker-build docker-push

@@ -91,23 +100,10 @@ jobs:
          AWX_TEST_IMAGE: ${{ github.repository }}
          AWX_TEST_VERSION: ${{ github.event.inputs.version }}

      - name: Generate changelog
        uses: shanemcd/simple-changelog-generator@v1
        id: changelog
        with:
          repo: "${{ github.repository }}"

      - name: Write changelog to file
        run: |
          cat << 'EOF' > /tmp/awx-changelog
          ${{ steps.changelog.outputs.changelog }}
          EOF

      - name: Create draft release for AWX
        working-directory: awx
        run: |
          ansible-playbook -v tools/ansible/stage.yml \
            -e changelog_path=/tmp/awx-changelog \
            -e repo=${{ github.repository }} \
            -e awx_image=ghcr.io/${{ github.repository }} \
            -e version=${{ github.event.inputs.version }} \
11  .github/workflows/upload_schema.yml (vendored)
@@ -4,6 +4,7 @@ on:
  push:
    branches:
      - devel
      - release_**
jobs:
  push:
    runs-on: ubuntu-latest
@@ -13,6 +14,14 @@ jobs:
    steps:
      - uses: actions/checkout@v2

      - name: Get python version from Makefile
        run: echo py_version=`make PYTHON_VERSION` >> $GITHUB_ENV

      - name: Install python ${{ env.py_version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ env.py_version }}

      - name: Log in to registry
        run: |
          echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
@@ -38,6 +47,6 @@ jobs:
        run: |
          ansible localhost -c local, -m command -a "{{ ansible_python_interpreter + ' -m pip install boto3'}}"
          ansible localhost -c local -m aws_s3 \
            -a 'src=${{ github.workspace }}/schema.json bucket=awx-public-ci-files object=schema.json mode=put permission=public-read'
            -a "src=${{ github.workspace }}/schema.json bucket=awx-public-ci-files object=${GITHUB_REF##*/}/schema.json mode=put permission=public-read"
6  .gitignore (vendored)
@@ -7,6 +7,9 @@ reference-schema.json
.tags
.tags1

# User level pre-commit hooks
pre-commit-user

# Tower
awx-dev
awx/settings/local_*.py*
@@ -35,13 +38,14 @@ awx/ui/build
awx/ui/.env.local
awx/ui/instrumented
rsyslog.pid
tools/prometheus/data
tools/prometheus
tools/docker-compose/ansible/awx_dump.sql
tools/docker-compose/Dockerfile
tools/docker-compose/_build
tools/docker-compose/_sources
tools/docker-compose/overrides/
tools/docker-compose-minikube/_sources
tools/docker-compose/keycloak.awx.realm.json

# Tower setup playbook testing
setup/test/roles/postgresql
@@ -17,6 +17,7 @@ Have questions about this document or anything not covered here? Come chat with
- [Building API Documentation](#building-api-documentation)
- [Accessing the AWX web interface](#accessing-the-awx-web-interface)
- [Purging containers and images](#purging-containers-and-images)
- [Pre commit hooks](#pre-commit-hooks)
- [What should I work on?](#what-should-i-work-on)
- [Submitting Pull Requests](#submitting-pull-requests)
- [PR Checks run by Zuul](#pr-checks-run-by-zuul)
@@ -104,6 +105,14 @@ When necessary, remove any AWX containers and images by running the following:
(host)$ make docker-clean
```

### Pre commit hooks

When you attempt to perform a `git commit`, a pre-commit hook runs before the commit is allowed into your local repository. For example, Python's [black](https://pypi.org/project/black/) will be run to check the formatting of any Python files.

While you can use environment variables to skip the pre-commit hooks, GitHub will run similar tests and prevent merging of PRs if the tests do not pass.

If you would like to add additional commit hooks for your own usage, you can create a directory in the root of the repository called `pre-commit-user`. Any executable file in that directory will be executed as part of the pre-commit hooks. If any of the pre-commit checks fail, the commit will be halted. For your convenience in user scripts, a variable called `CHANGED_FILES` will be set with any changed files present in the commit (see the sketch below).
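For illustration only (this sketch is not part of the repository), a user hook dropped into `pre-commit-user/` could look like the following. It assumes `CHANGED_FILES` arrives as a whitespace-separated list of paths; adjust the parsing if your wrapper passes the list differently.

```python
#!/usr/bin/env python3
"""Hypothetical pre-commit-user hook: block commits that still contain breakpoint()."""
import os
import sys

# CHANGED_FILES is described above as the list of files in the commit;
# treating it as whitespace-separated is an assumption of this sketch.
changed = os.environ.get("CHANGED_FILES", "").split()

offenders = []
for path in changed:
    if path.endswith(".py") and os.path.exists(path):
        with open(path, encoding="utf-8") as handle:
            if "breakpoint(" in handle.read():
                offenders.append(path)

if offenders:
    print("Remove breakpoint() calls before committing: " + ", ".join(offenders))
    sys.exit(1)  # any non-zero exit halts the commit
```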
## What should I work on?

For feature work, take a look at the current [Enhancements](https://github.com/ansible/awx/issues?q=is%3Aissue+is%3Aopen+label%3Atype%3Aenhancement).
93  Makefile
@@ -1,5 +1,4 @@
PYTHON ?= python3.8
PYTHON_VERSION = $(shell $(PYTHON) -c "from distutils.sysconfig import get_python_version; print(get_python_version())")
PYTHON ?= python3.9
OFFICIAL ?= no
NODE ?= node
NPM_BIN ?= npm
@@ -11,12 +10,17 @@ COLLECTION_VERSION := $(shell $(PYTHON) setup.py --version | cut -d . -f 1-3)

# NOTE: This defaults the container image version to the branch that's active
COMPOSE_TAG ?= $(GIT_BRANCH)
COMPOSE_HOST ?= $(shell hostname)
MAIN_NODE_TYPE ?= hybrid
# If set to true docker-compose will also start a keycloak instance
KEYCLOAK ?= false
# If set to true docker-compose will also start an ldap instance
LDAP ?= false
# If set to true docker-compose will also start a splunk instance
SPLUNK ?= false

VENV_BASE ?= /var/lib/awx/venv

DEV_DOCKER_TAG_BASE ?= quay.io/awx
DEV_DOCKER_TAG_BASE ?= ghcr.io/ansible
DEVEL_IMAGE_NAME ?= $(DEV_DOCKER_TAG_BASE)/awx_devel:$(COMPOSE_TAG)

RECEPTOR_IMAGE ?= quay.io/ansible/receptor:devel
@@ -26,7 +30,7 @@ RECEPTOR_IMAGE ?= quay.io/ansible/receptor:devel
SRC_ONLY_PKGS ?= cffi,pycparser,psycopg2,twilio
# These should be upgraded in the AWX and Ansible venv before attempting
# to install the actual requirements
VENV_BOOTSTRAP ?= pip==21.2.4 setuptools==58.2.0 wheel==0.36.2
VENV_BOOTSTRAP ?= pip==21.2.4 setuptools==58.2.0 setuptools_scm[toml]==6.4.2 wheel==0.36.2

NAME ?= awx

@@ -43,7 +47,7 @@ I18N_FLAG_FILE = .i18n_built
    receiver test test_unit test_coverage coverage_html \
    dev_build release_build sdist \
    ui-release ui-devel \
    VERSION docker-compose-sources \
    VERSION PYTHON_VERSION docker-compose-sources \
    .git/hooks/pre-commit

clean-tmp:
@@ -145,24 +149,6 @@ version_file:
    fi; \
    $(PYTHON) -c "import awx; print(awx.__version__)" > /var/lib/awx/.awx_version; \

# Do any one-time init tasks.
comma := ,
init:
    if [ "$(VENV_BASE)" ]; then \
        . $(VENV_BASE)/awx/bin/activate; \
    fi; \
    $(MANAGEMENT_COMMAND) provision_instance --hostname=$(COMPOSE_HOST) --node_type=$(MAIN_NODE_TYPE); \
    $(MANAGEMENT_COMMAND) register_queue --queuename=controlplane --instance_percent=100;\
    $(MANAGEMENT_COMMAND) register_queue --queuename=default --instance_percent=100;
    if [ ! -f /etc/receptor/certs/awx.key ]; then \
        rm -f /etc/receptor/certs/*; \
        receptor --cert-init commonname="AWX Test CA" bits=2048 outcert=/etc/receptor/certs/ca.crt outkey=/etc/receptor/certs/ca.key; \
        for node in $(RECEPTOR_MUTUAL_TLS); do \
            receptor --cert-makereq bits=2048 commonname="$$node test cert" dnsname=$$node nodeid=$$node outreq=/etc/receptor/certs/$$node.csr outkey=/etc/receptor/certs/$$node.key; \
            receptor --cert-signreq req=/etc/receptor/certs/$$node.csr cacert=/etc/receptor/certs/ca.crt cakey=/etc/receptor/certs/ca.key outcert=/etc/receptor/certs/$$node.crt verify=yes; \
        done; \
    fi

# Refresh development environment after pulling new code.
refresh: clean requirements_dev version_file develop migrate

@@ -193,7 +179,7 @@ collectstatic:
    fi; \
    mkdir -p awx/public/static && $(PYTHON) manage.py collectstatic --clear --noinput > /dev/null 2>&1

UWSGI_DEV_RELOAD_COMMAND ?= supervisorctl restart tower-processes:awx-dispatcher tower-processes:awx-receiver
DEV_RELOAD_COMMAND ?= supervisorctl restart tower-processes:*

uwsgi: collectstatic
    @if [ "$(VENV_BASE)" ]; then \
@@ -208,12 +194,13 @@ uwsgi: collectstatic
        --processes=5 \
        --harakiri=120 --master \
        --no-orphans \
        --py-autoreload 1 \
        --max-requests=1000 \
        --stats /tmp/stats.socket \
        --lazy-apps \
        --logformat "%(addr) %(method) %(uri) - %(proto) %(status)" \
        --hook-accepting1="exec: $(UWSGI_DEV_RELOAD_COMMAND)"
        --logformat "%(addr) %(method) %(uri) - %(proto) %(status)"

awx-autoreload:
    @/awx_devel/tools/docker-compose/awx-autoreload /awx_devel "$(DEV_RELOAD_COMMAND)"

daphne:
    @if [ "$(VENV_BASE)" ]; then \
@@ -283,16 +270,16 @@ api-lint:

awx-link:
    [ -d "/awx_devel/awx.egg-info" ] || $(PYTHON) /awx_devel/setup.py egg_info_dev
    cp -f /tmp/awx.egg-link /var/lib/awx/venv/awx/lib/python$(PYTHON_VERSION)/site-packages/awx.egg-link
    cp -f /tmp/awx.egg-link /var/lib/awx/venv/awx/lib/$(PYTHON)/site-packages/awx.egg-link

TEST_DIRS ?= awx/main/tests/unit awx/main/tests/functional awx/conf/tests awx/sso/tests

PYTEST_ARGS ?= -n auto
# Run all API unit tests.
test:
    if [ "$(VENV_BASE)" ]; then \
        . $(VENV_BASE)/awx/bin/activate; \
    fi; \
    PYTHONDONTWRITEBYTECODE=1 py.test -p no:cacheprovider -n auto $(TEST_DIRS)
    PYTHONDONTWRITEBYTECODE=1 py.test -p no:cacheprovider $(PYTEST_ARGS) $(TEST_DIRS)
    cd awxkit && $(VENV_BASE)/awx/bin/tox -re py3
    awx-manage check_migrations --dry-run --check -n 'missing_migration_file'

@@ -301,6 +288,7 @@ COLLECTION_TEST_TARGET ?=
COLLECTION_PACKAGE ?= awx
COLLECTION_NAMESPACE ?= awx
COLLECTION_INSTALL = ~/.ansible/collections/ansible_collections/$(COLLECTION_NAMESPACE)/$(COLLECTION_PACKAGE)
COLLECTION_TEMPLATE_VERSION ?= false

test_collection:
    rm -f $(shell ls -d $(VENV_BASE)/awx/lib/python* | head -n 1)/no-global-site-packages.txt
@@ -323,14 +311,16 @@ symlink_collection:
    mkdir -p ~/.ansible/collections/ansible_collections/$(COLLECTION_NAMESPACE) # in case it does not exist
    ln -s $(shell pwd)/awx_collection $(COLLECTION_INSTALL)

build_collection:
awx_collection_build: $(shell find awx_collection -type f)
    ansible-playbook -i localhost, awx_collection/tools/template_galaxy.yml \
        -e collection_package=$(COLLECTION_PACKAGE) \
        -e collection_namespace=$(COLLECTION_NAMESPACE) \
        -e collection_version=$(COLLECTION_VERSION) \
        -e '{"awx_template_version":false}'
        -e '{"awx_template_version": $(COLLECTION_TEMPLATE_VERSION)}'
    ansible-galaxy collection build awx_collection_build --force --output-path=awx_collection_build

build_collection: awx_collection_build

install_collection: build_collection
    rm -rf $(COLLECTION_INSTALL)
    ansible-galaxy collection install awx_collection_build/$(COLLECTION_NAMESPACE)-$(COLLECTION_PACKAGE)-$(COLLECTION_VERSION).tar.gz
@@ -384,7 +374,7 @@ clean-ui:
    rm -rf $(UI_BUILD_FLAG_FILE)

awx/ui/node_modules:
    NODE_OPTIONS=--max-old-space-size=4096 $(NPM_BIN) --prefix awx/ui --loglevel warn ci
    NODE_OPTIONS=--max-old-space-size=6144 $(NPM_BIN) --prefix awx/ui --loglevel warn ci

$(UI_BUILD_FLAG_FILE): awx/ui/node_modules
    $(PYTHON) tools/scripts/compilemessages.py
@@ -418,8 +408,17 @@ ui-lint:

ui-test:
    $(NPM_BIN) --prefix awx/ui install
    $(NPM_BIN) run --prefix awx/ui test
    $(NPM_BIN) run --prefix awx/ui test

ui-test-screens:
    $(NPM_BIN) --prefix awx/ui install
    $(NPM_BIN) run --prefix awx/ui pretest
    $(NPM_BIN) run --prefix awx/ui test-screens --runInBand

ui-test-general:
    $(NPM_BIN) --prefix awx/ui install
    $(NPM_BIN) run --prefix awx/ui pretest
    $(NPM_BIN) run --prefix awx/ui/ test-general --runInBand

# Build a pip-installable package into dist/ with a timestamped version number.
dev_build:
@@ -468,7 +467,10 @@ docker-compose-sources: .git/hooks/pre-commit
        -e receptor_image=$(RECEPTOR_IMAGE) \
        -e control_plane_node_count=$(CONTROL_PLANE_NODE_COUNT) \
        -e execution_node_count=$(EXECUTION_NODE_COUNT) \
        -e minikube_container_group=$(MINIKUBE_CONTAINER_GROUP)
        -e minikube_container_group=$(MINIKUBE_CONTAINER_GROUP) \
        -e enable_keycloak=$(KEYCLOAK) \
        -e enable_ldap=$(LDAP) \
        -e enable_splunk=$(SPLUNK)

docker-compose: awx/projects docker-compose-sources
@@ -487,8 +489,9 @@ docker-compose-runtest: awx/projects docker-compose-sources
docker-compose-build-swagger: awx/projects docker-compose-sources
    docker-compose -f tools/docker-compose/_sources/docker-compose.yml run --rm --service-ports --no-deps awx_1 /start_tests.sh swagger

SCHEMA_DIFF_BASE_BRANCH ?= devel
detect-schema-change: genschema
    curl https://s3.amazonaws.com/awx-public-ci-files/schema.json -o reference-schema.json
    curl https://s3.amazonaws.com/awx-public-ci-files/$(SCHEMA_DIFF_BASE_BRANCH)/schema.json -o reference-schema.json
    # Ignore differences in whitespace with -b
    diff -u -b reference-schema.json schema.json

@@ -527,7 +530,12 @@ docker-compose-cluster-elk: awx/projects docker-compose-sources
    docker-compose -f tools/docker-compose/_sources/docker-compose.yml -f tools/elastic/docker-compose.logstash-link-cluster.yml -f tools/elastic/docker-compose.elastic-override.yml up --no-recreate

prometheus:
    docker run -u0 --net=tools_default --link=`docker ps | egrep -o "tools_awx(_run)?_([^ ]+)?"`:awxweb --volume `pwd`/tools/prometheus:/prometheus --name prometheus -d -p 0.0.0.0:9090:9090 prom/prometheus --web.enable-lifecycle --config.file=/prometheus/prometheus.yml
    docker volume create prometheus
    docker run -d --rm --net=_sources_default --link=awx_1:awx1 --volume prometheus-storage:/prometheus --volume `pwd`/tools/prometheus:/etc/prometheus --name prometheus -p 9090:9090 prom/prometheus

grafana:
    docker volume create grafana
    docker run -d --rm --net=_sources_default --volume grafana-storage:/var/lib/grafana --volume `pwd`/tools/grafana:/etc/grafana/provisioning --name grafana -p 3001:3000 grafana/grafana-enterprise

docker-compose-container-group:
    MINIKUBE_CONTAINER_GROUP=true make docker-compose
@@ -546,6 +554,9 @@ psql-container:
VERSION:
    @echo "awx: $(VERSION)"

PYTHON_VERSION:
    @echo "$(PYTHON)" | sed 's:python::'

Dockerfile: tools/ansible/roles/dockerfile/templates/Dockerfile.j2
    ansible-playbook tools/ansible/dockerfile.yml -e receptor_image=$(RECEPTOR_IMAGE)

@@ -557,8 +568,9 @@ Dockerfile.kube-dev: tools/ansible/roles/dockerfile/templates/Dockerfile.j2
        -e receptor_image=$(RECEPTOR_IMAGE)

awx-kube-dev-build: Dockerfile.kube-dev
    docker build -f Dockerfile.kube-dev \
    DOCKER_BUILDKIT=1 docker build -f Dockerfile.kube-dev \
        --build-arg BUILDKIT_INLINE_CACHE=1 \
        --cache-from=$(DEV_DOCKER_TAG_BASE)/awx_kube_devel:$(COMPOSE_TAG) \
        -t $(DEV_DOCKER_TAG_BASE)/awx_kube_devel:$(COMPOSE_TAG) .

@@ -580,3 +592,6 @@ messages:
    . $(VENV_BASE)/awx/bin/activate; \
    fi; \
    $(PYTHON) manage.py makemessages -l $(LANG) --keep-pot

print-%:
    @echo $($*)
@@ -1,9 +1,3 @@
---
name: "\U0001F525 Security bug report"
about: How to report security vulnerabilities

---

For all security related bugs, email security@ansible.com instead of using this issue tracker and you will receive a prompt response.

For more information on the Ansible community's practices regarding responsible disclosure, see https://www.ansible.com/security
@@ -36,7 +36,6 @@ else:
    from django.db.backends.utils import names_digest
    from django.db import connection


if HAS_DJANGO is True:

    # See upgrade blocker note in requirements/README.md
@@ -79,9 +78,10 @@ def oauth2_getattribute(self, attr):
    # Custom method to override
    # oauth2_provider.settings.OAuth2ProviderSettings.__getattribute__
    from django.conf import settings
    from oauth2_provider.settings import DEFAULTS

    val = None
    if 'migrate' not in sys.argv:
        if (isinstance(attr, str)) and (attr in DEFAULTS) and (not attr.startswith('_')):
            # certain Django OAuth Toolkit migrations actually reference
            # setting lookups for references to model classes (e.g.,
            # oauth2_settings.REFRESH_TOKEN_MODEL)
@@ -6,7 +6,7 @@ import logging

# Django
from django.conf import settings
from django.utils.encoding import smart_text
from django.utils.encoding import smart_str

# Django REST Framework
from rest_framework import authentication
@@ -24,7 +24,7 @@ class LoggedBasicAuthentication(authentication.BasicAuthentication):
        ret = super(LoggedBasicAuthentication, self).authenticate(request)
        if ret:
            username = ret[0].username if ret[0] else '<none>'
            logger.info(smart_text(u"User {} performed a {} to {} through the API".format(username, request.method, request.path)))
            logger.info(smart_str(u"User {} performed a {} to {} through the API".format(username, request.method, request.path)))
        return ret

    def authenticate_header(self, request):
@@ -45,7 +45,7 @@ class LoggedOAuth2Authentication(OAuth2Authentication):
        user, token = ret
        username = user.username if user else '<none>'
        logger.info(
            smart_text(u"User {} performed a {} to {} through the API using OAuth 2 token {}.".format(username, request.method, request.path, token.pk))
            smart_str(u"User {} performed a {} to {} through the API using OAuth 2 token {}.".format(username, request.method, request.path, token.pk))
        )
        setattr(user, 'oauth_scopes', [x for x in token.scope.split() if x])
        return ret
@@ -1,6 +1,6 @@
# Django
from django.conf import settings
from django.utils.translation import ugettext_lazy as _
from django.utils.translation import gettext_lazy as _

# Django REST Framework
from rest_framework import serializers
@@ -2,7 +2,7 @@
# All Rights Reserved.

# Django
from django.utils.translation import ugettext_lazy as _
from django.utils.translation import gettext_lazy as _

# Django REST Framework
from rest_framework.exceptions import ValidationError
@@ -13,7 +13,7 @@ class ActiveJobConflict(ValidationError):

    def __init__(self, active_jobs):
        # During APIException.__init__(), Django Rest Framework
        # turn everything in self.detail into string by using force_text.
        # turn everything in self.detail into string by using force_str.
        # Declare detail afterwards circumvent this behavior.
        super(ActiveJobConflict, self).__init__()
        self.detail = {"error": _("Resource is being used by running jobs."), "active_jobs": active_jobs}
@@ -2,7 +2,7 @@
# All Rights Reserved.

# Django
from django.utils.translation import ugettext_lazy as _
from django.utils.translation import gettext_lazy as _
from django.core.exceptions import ObjectDoesNotExist

# Django REST Framework
@@ -28,13 +28,17 @@ class NullFieldMixin(object):
        return (is_empty_value, data)


class BooleanNullField(NullFieldMixin, serializers.NullBooleanField):
class BooleanNullField(NullFieldMixin, serializers.BooleanField):
    """
    Custom boolean field that allows null and empty string as False values.
    """

    def __init__(self, **kwargs):
        kwargs['allow_null'] = True
        super().__init__(**kwargs)

    def to_internal_value(self, data):
        return bool(super(BooleanNullField, self).to_internal_value(data))
        return bool(super().to_internal_value(data))
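The hunk above swaps the removed `serializers.NullBooleanField` base class for `serializers.BooleanField` with `allow_null` forced on. As a standalone illustration only (the class below is a sketch, not the AWX implementation, and assumes a recent Django REST Framework where `NullBooleanField` is no longer available):

```python
from rest_framework import serializers


class NullableBoolean(serializers.BooleanField):
    """Boolean field that treats null and empty string as False, per the docstring above."""

    def __init__(self, **kwargs):
        kwargs['allow_null'] = True  # accept an explicit null from the client
        super().__init__(**kwargs)

    def to_internal_value(self, data):
        if data is None or data == '':
            return False  # collapse "no value" to False instead of raising
        return bool(super().to_internal_value(data))
```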
class CharNullField(NullFieldMixin, serializers.CharField):
@@ -47,7 +51,7 @@ class CharNullField(NullFieldMixin, serializers.CharField):
        super(CharNullField, self).__init__(**kwargs)

    def to_internal_value(self, data):
        return super(CharNullField, self).to_internal_value(data or u'')
        return super(CharNullField, self).to_internal_value(data or '')


class ChoiceNullField(NullFieldMixin, serializers.ChoiceField):
@@ -60,7 +64,7 @@ class ChoiceNullField(NullFieldMixin, serializers.ChoiceField):
        super(ChoiceNullField, self).__init__(**kwargs)

    def to_internal_value(self, data):
        return super(ChoiceNullField, self).to_internal_value(data or u'')
        return super(ChoiceNullField, self).to_internal_value(data or '')


class VerbatimField(serializers.Field):
@@ -7,15 +7,15 @@ import json
from functools import reduce

# Django
from django.core.exceptions import FieldError, ValidationError
from django.core.exceptions import FieldError, ValidationError, FieldDoesNotExist
from django.db import models
from django.db.models import Q, CharField, IntegerField, BooleanField
from django.db.models.fields import FieldDoesNotExist
from django.db.models import Q, CharField, IntegerField, BooleanField, TextField, JSONField
from django.db.models.fields.related import ForeignObjectRel, ManyToManyField, ForeignKey
from django.db.models.functions import Cast
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericForeignKey
from django.utils.encoding import force_text
from django.utils.translation import ugettext_lazy as _
from django.utils.encoding import force_str
from django.utils.translation import gettext_lazy as _

# Django REST Framework
from rest_framework.exceptions import ParseError, PermissionDenied
@@ -185,16 +185,14 @@ class FieldLookupBackend(BaseFilterBackend):
        return (field_list[-1], new_lookup)

    def to_python_related(self, value):
        value = force_text(value)
        value = force_str(value)
        if value.lower() in ('none', 'null'):
            return None
        else:
            return int(value)

    def value_to_python_for_field(self, field, value):
        if isinstance(field, models.NullBooleanField):
            return to_python_boolean(value, allow_none=True)
        elif isinstance(field, models.BooleanField):
        if isinstance(field, models.BooleanField):
            return to_python_boolean(value)
        elif isinstance(field, (ForeignObjectRel, ManyToManyField, GenericForeignKey, ForeignKey)):
            try:
@@ -244,6 +242,8 @@ class FieldLookupBackend(BaseFilterBackend):
                new_lookups.append('{}__{}__icontains'.format(new_lookup[:-8], rm_field.name))
            return value, new_lookups, needs_distinct
        else:
            if isinstance(field, JSONField):
                new_lookup = new_lookup.replace(field.name, f'{field.name}_as_txt')
            value = self.value_to_python_for_field(field, value)
            return value, new_lookup, needs_distinct

@@ -293,7 +293,7 @@ class FieldLookupBackend(BaseFilterBackend):
            search_filter_relation = 'AND'
            values = reduce(lambda list1, list2: list1 + list2, [i.split(',') for i in values])
        for value in values:
            search_value, new_keys, _ = self.value_to_python(queryset.model, key, force_text(value))
            search_value, new_keys, _ = self.value_to_python(queryset.model, key, force_str(value))
            assert isinstance(new_keys, list)
            search_filters[search_value] = new_keys
        # by definition, search *only* joins across relations,
@@ -325,6 +325,9 @@ class FieldLookupBackend(BaseFilterBackend):
                value, new_key, distinct = self.value_to_python(queryset.model, key, value)
                if distinct:
                    needs_distinct = True
                if '_as_txt' in new_key:
                    fname = next(item for item in new_key.split('__') if item.endswith('_as_txt'))
                    queryset = queryset.annotate(**{fname: Cast(fname[:-7], output_field=TextField())})
                if q_chain:
                    chain_filters.append((q_not, new_key, value))
                elif q_or:
@ -395,11 +398,11 @@ class OrderByBackend(BaseFilterBackend):
|
||||
order_by = value.split(',')
|
||||
else:
|
||||
order_by = (value,)
|
||||
if order_by is None:
|
||||
order_by = self.get_default_ordering(view)
|
||||
default_order_by = self.get_default_ordering(view)
|
||||
# glue the order by and default order by together so that the default is the backup option
|
||||
order_by = list(order_by or []) + list(default_order_by or [])
|
||||
if order_by:
|
||||
order_by = self._validate_ordering_fields(queryset.model, order_by)
|
||||
|
||||
# Special handling of the type field for ordering. In this
|
||||
# case, we're not sorting exactly on the type field, but
|
||||
# given the limited number of views with multiple types,
|
||||
|
||||
@ -10,18 +10,18 @@ import urllib.parse
|
||||
|
||||
# Django
|
||||
from django.conf import settings
|
||||
from django.contrib.auth import views as auth_views
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.cache import cache
|
||||
from django.core.exceptions import FieldDoesNotExist
|
||||
from django.db import connection
|
||||
from django.db.models.fields import FieldDoesNotExist
|
||||
from django.db.models.fields.related import OneToOneRel
|
||||
from django.http import QueryDict
|
||||
from django.shortcuts import get_object_or_404
|
||||
from django.template.loader import render_to_string
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.encoding import smart_str
|
||||
from django.utils.safestring import mark_safe
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.contrib.auth import views as auth_views
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework.exceptions import PermissionDenied, AuthenticationFailed, ParseError, NotAcceptable, UnsupportedMediaType
|
||||
@ -44,6 +44,7 @@ from awx.main.views import ApiErrorView
|
||||
from awx.api.serializers import ResourceAccessListElementSerializer, CopySerializer, UserSerializer
|
||||
from awx.api.versioning import URLPathVersioning
|
||||
from awx.api.metadata import SublistAttachDetatchMetadata, Metadata
|
||||
from awx.conf import settings_registry
|
||||
|
||||
__all__ = [
|
||||
'APIView',
|
||||
@ -92,17 +93,18 @@ class LoggedLoginView(auth_views.LoginView):
|
||||
ret = super(LoggedLoginView, self).post(request, *args, **kwargs)
|
||||
current_user = getattr(request, 'user', None)
|
||||
if request.user.is_authenticated:
|
||||
logger.info(smart_text(u"User {} logged in from {}".format(self.request.user.username, request.META.get('REMOTE_ADDR', None))))
|
||||
logger.info(smart_str(u"User {} logged in from {}".format(self.request.user.username, request.META.get('REMOTE_ADDR', None))))
|
||||
ret.set_cookie('userLoggedIn', 'true')
|
||||
current_user = UserSerializer(self.request.user)
|
||||
current_user = smart_text(JSONRenderer().render(current_user.data))
|
||||
current_user = smart_str(JSONRenderer().render(current_user.data))
|
||||
current_user = urllib.parse.quote('%s' % current_user, '')
|
||||
ret.set_cookie('current_user', current_user, secure=settings.SESSION_COOKIE_SECURE or None)
|
||||
ret.setdefault('X-API-Session-Cookie-Name', getattr(settings, 'SESSION_COOKIE_NAME', 'awx_sessionid'))
|
||||
|
||||
return ret
|
||||
else:
|
||||
if 'username' in self.request.POST:
|
||||
logger.warn(smart_text(u"Login failed for user {} from {}".format(self.request.POST.get('username'), request.META.get('REMOTE_ADDR', None))))
|
||||
logger.warning(smart_str(u"Login failed for user {} from {}".format(self.request.POST.get('username'), request.META.get('REMOTE_ADDR', None))))
|
||||
ret.status_code = 401
|
||||
return ret
|
||||
|
||||
@ -208,12 +210,27 @@ class APIView(views.APIView):
|
||||
return response
|
||||
|
||||
if response.status_code >= 400:
|
||||
status_msg = "status %s received by user %s attempting to access %s from %s" % (
|
||||
response.status_code,
|
||||
request.user,
|
||||
request.path,
|
||||
request.META.get('REMOTE_ADDR', None),
|
||||
)
|
||||
msg_data = {
|
||||
'status_code': response.status_code,
|
||||
'user_name': request.user,
|
||||
'url_path': request.path,
|
||||
'remote_addr': request.META.get('REMOTE_ADDR', None),
|
||||
}
|
||||
|
||||
if type(response.data) is dict:
|
||||
msg_data['error'] = response.data.get('error', response.status_text)
|
||||
elif type(response.data) is list:
|
||||
msg_data['error'] = ", ".join(list(map(lambda x: x.get('error', response.status_text), response.data)))
|
||||
else:
|
||||
msg_data['error'] = response.status_text
|
||||
|
||||
try:
|
||||
status_msg = getattr(settings, 'API_400_ERROR_LOG_FORMAT').format(**msg_data)
|
||||
except Exception as e:
|
||||
if getattr(settings, 'API_400_ERROR_LOG_FORMAT', None):
|
||||
logger.error("Unable to format API_400_ERROR_LOG_FORMAT setting, defaulting log message: {}".format(e))
|
||||
status_msg = settings_registry.get_setting_field('API_400_ERROR_LOG_FORMAT').get_default().format(**msg_data)
|
||||
|
||||
if hasattr(self, '__init_request_error__'):
|
||||
response = self.handle_exception(self.__init_request_error__)
|
||||
if response.status_code == 401:
|
||||
@ -221,6 +238,7 @@ class APIView(views.APIView):
|
||||
logger.info(status_msg)
|
||||
else:
|
||||
logger.warning(status_msg)
|
||||
|
||||
response = super(APIView, self).finalize_response(request, response, *args, **kwargs)
|
||||
time_started = getattr(self, 'time_started', None)
|
||||
response['X-API-Product-Version'] = get_awx_version()
|
||||
@ -374,8 +392,8 @@ class GenericAPIView(generics.GenericAPIView, APIView):
|
||||
if hasattr(self.model._meta, "verbose_name"):
|
||||
d.update(
|
||||
{
|
||||
'model_verbose_name': smart_text(self.model._meta.verbose_name),
|
||||
'model_verbose_name_plural': smart_text(self.model._meta.verbose_name_plural),
|
||||
'model_verbose_name': smart_str(self.model._meta.verbose_name),
|
||||
'model_verbose_name_plural': smart_str(self.model._meta.verbose_name_plural),
|
||||
}
|
||||
)
|
||||
serializer = self.get_serializer()
|
||||
@ -506,8 +524,8 @@ class SubListAPIView(ParentMixin, ListAPIView):
|
||||
d = super(SubListAPIView, self).get_description_context()
|
||||
d.update(
|
||||
{
|
||||
'parent_model_verbose_name': smart_text(self.parent_model._meta.verbose_name),
|
||||
'parent_model_verbose_name_plural': smart_text(self.parent_model._meta.verbose_name_plural),
|
||||
'parent_model_verbose_name': smart_str(self.parent_model._meta.verbose_name),
|
||||
'parent_model_verbose_name_plural': smart_str(self.parent_model._meta.verbose_name_plural),
|
||||
}
|
||||
)
|
||||
return d
|
||||
@ -620,6 +638,11 @@ class SubListCreateAttachDetachAPIView(SubListCreateAPIView):
|
||||
# attaching/detaching them from the parent.
|
||||
|
||||
def is_valid_relation(self, parent, sub, created=False):
|
||||
"Override in subclasses to do efficient validation of attaching"
|
||||
return None
|
||||
|
||||
def is_valid_removal(self, parent, sub):
|
||||
"Same as is_valid_relation but called on disassociation"
|
||||
return None
|
||||
|
||||
def get_description_context(self):
|
||||
@ -704,6 +727,11 @@ class SubListCreateAttachDetachAPIView(SubListCreateAPIView):
|
||||
if not request.user.can_access(self.parent_model, 'unattach', parent, sub, self.relationship, request.data):
|
||||
raise PermissionDenied()
|
||||
|
||||
# Verify that removing the relationship is valid.
|
||||
unattach_errors = self.is_valid_removal(parent, sub)
|
||||
if unattach_errors is not None:
|
||||
return Response(unattach_errors, status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
if parent_key:
|
||||
sub.delete()
|
||||
else:
|
||||
@ -817,7 +845,7 @@ class ResourceAccessList(ParentMixin, ListAPIView):
|
||||
|
||||
|
||||
def trigger_delayed_deep_copy(*args, **kwargs):
|
||||
from awx.main.tasks import deep_copy_model_obj
|
||||
from awx.main.tasks.system import deep_copy_model_obj
|
||||
|
||||
connection.on_commit(lambda: deep_copy_model_obj.delay(*args, **kwargs))
|
||||
|
||||
|
||||
@ -6,11 +6,12 @@ from uuid import UUID
|
||||
|
||||
# Django
|
||||
from django.core.exceptions import PermissionDenied
|
||||
from django.db.models import JSONField
|
||||
from django.db.models.fields import PositiveIntegerField, BooleanField
|
||||
from django.db.models.fields.related import ForeignKey
|
||||
from django.http import Http404
|
||||
from django.utils.encoding import force_text, smart_text
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.encoding import force_str, smart_str
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework import exceptions
|
||||
@ -22,7 +23,7 @@ from rest_framework.request import clone_request
|
||||
|
||||
# AWX
|
||||
from awx.api.fields import ChoiceNullField
|
||||
from awx.main.fields import JSONField, ImplicitRoleField
|
||||
from awx.main.fields import ImplicitRoleField
|
||||
from awx.main.models import NotificationTemplate
|
||||
from awx.main.utils.execution_environments import get_default_pod_spec
|
||||
|
||||
@ -53,7 +54,7 @@ class Metadata(metadata.SimpleMetadata):
|
||||
for attr in text_attrs:
|
||||
value = getattr(field, attr, None)
|
||||
if value is not None and value != '':
|
||||
field_info[attr] = force_text(value, strings_only=True)
|
||||
field_info[attr] = force_str(value, strings_only=True)
|
||||
|
||||
placeholder = getattr(field, 'placeholder', serializers.empty)
|
||||
if placeholder is not serializers.empty:
|
||||
@ -77,7 +78,7 @@ class Metadata(metadata.SimpleMetadata):
|
||||
}
|
||||
if field.field_name in field_help_text:
|
||||
opts = serializer.Meta.model._meta.concrete_model._meta
|
||||
verbose_name = smart_text(opts.verbose_name)
|
||||
verbose_name = smart_str(opts.verbose_name)
|
||||
field_info['help_text'] = field_help_text[field.field_name].format(verbose_name)
|
||||
|
||||
if field.field_name == 'type':
|
||||
|
||||
@ -1,11 +1,11 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import MetricsView
|
||||
|
||||
|
||||
urls = [url(r'^$', MetricsView.as_view(), name='metrics_view')]
|
||||
urls = [re_path(r'^$', MetricsView.as_view(), name='metrics_view')]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -5,7 +5,7 @@ import json
|
||||
# Django
|
||||
from django.conf import settings
|
||||
from django.utils.encoding import smart_str
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework import parsers
|
||||
|
||||
@ -4,8 +4,6 @@
|
||||
# Python
|
||||
import logging
|
||||
|
||||
from django.conf import settings
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework.exceptions import MethodNotAllowed, PermissionDenied
|
||||
from rest_framework import permissions
|
||||
@ -243,20 +241,13 @@ class IsSystemAdminOrAuditor(permissions.BasePermission):
|
||||
"""
|
||||
|
||||
def has_permission(self, request, view):
|
||||
if not request.user:
|
||||
if not (request.user and request.user.is_authenticated):
|
||||
return False
|
||||
if request.method == 'GET':
|
||||
return request.user.is_superuser or request.user.is_system_auditor
|
||||
return request.user.is_superuser
|
||||
|
||||
|
||||
class InstanceGroupTowerPermission(ModelAccessPermission):
|
||||
def has_object_permission(self, request, view, obj):
|
||||
if request.method == 'DELETE' and obj.name in [settings.DEFAULT_EXECUTION_QUEUE_NAME, settings.DEFAULT_CONTROL_PLANE_QUEUE_NAME]:
|
||||
return False
|
||||
return super(InstanceGroupTowerPermission, self).has_object_permission(request, view, obj)
|
||||
|
||||
|
||||
class WebhookKeyPermission(permissions.BasePermission):
|
||||
def has_object_permission(self, request, view, obj):
|
||||
return request.user.can_access(view.model, 'admin', obj, request.data)
|
||||
|
||||
@ -25,8 +25,8 @@ from django.contrib.auth.password_validation import validate_password as django_
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.exceptions import ObjectDoesNotExist, ValidationError as DjangoValidationError
|
||||
from django.db import models
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.encoding import force_text
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.utils.encoding import force_str
|
||||
from django.utils.text import capfirst
|
||||
from django.utils.timezone import now
|
||||
from django.utils.functional import cached_property
|
||||
@ -57,6 +57,7 @@ from awx.main.models import (
|
||||
Host,
|
||||
Instance,
|
||||
InstanceGroup,
|
||||
InstanceLink,
|
||||
Inventory,
|
||||
InventorySource,
|
||||
InventoryUpdate,
|
||||
@ -96,7 +97,7 @@ from awx.main.models import (
|
||||
)
|
||||
from awx.main.models.base import VERBOSITY_CHOICES, NEW_JOB_TYPE_CHOICES
|
||||
from awx.main.models.rbac import get_roles_on_resource, role_summary_fields_generator
|
||||
from awx.main.fields import ImplicitRoleField, JSONBField
|
||||
from awx.main.fields import ImplicitRoleField
|
||||
from awx.main.utils import (
|
||||
get_type_for_model,
|
||||
get_model_for_type,
|
||||
@ -112,6 +113,7 @@ from awx.main.utils import (
|
||||
)
|
||||
from awx.main.utils.filters import SmartFilter
|
||||
from awx.main.utils.named_url_graph import reset_counters
|
||||
from awx.main.scheduler.task_manager_models import TaskManagerInstanceGroups, TaskManagerInstances
|
||||
from awx.main.redact import UriCleaner, REPLACE_STR
|
||||
|
||||
from awx.main.validators import vars_validate_or_raise
|
||||
@ -356,7 +358,7 @@ class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetacl
|
||||
}
|
||||
choices = []
|
||||
for t in self.get_types():
|
||||
name = _(type_name_map.get(t, force_text(get_model_for_type(t)._meta.verbose_name).title()))
|
||||
name = _(type_name_map.get(t, force_str(get_model_for_type(t)._meta.verbose_name).title()))
|
||||
choices.append((t, name))
|
||||
return choices
|
||||
|
||||
@ -378,19 +380,22 @@ class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetacl
|
||||
def _get_related(self, obj):
|
||||
return {} if obj is None else self.get_related(obj)
|
||||
|
||||
def _generate_named_url(self, url_path, obj, node):
|
||||
url_units = url_path.split('/')
|
||||
def _generate_friendly_id(self, obj, node):
|
||||
reset_counters()
|
||||
named_url = node.generate_named_url(obj)
|
||||
url_units[4] = named_url
|
||||
return '/'.join(url_units)
|
||||
return node.generate_named_url(obj)
|
||||
|
||||
def get_related(self, obj):
|
||||
res = OrderedDict()
|
||||
view = self.context.get('view', None)
|
||||
if view and (hasattr(view, 'retrieve') or view.request.method == 'POST') and type(obj) in settings.NAMED_URL_GRAPH:
|
||||
original_url = self.get_url(obj)
|
||||
res['named_url'] = self._generate_named_url(original_url, obj, settings.NAMED_URL_GRAPH[type(obj)])
|
||||
original_path = self.get_url(obj)
|
||||
path_components = original_path.lstrip('/').rstrip('/').split('/')
|
||||
|
||||
friendly_id = self._generate_friendly_id(obj, settings.NAMED_URL_GRAPH[type(obj)])
|
||||
path_components[-1] = friendly_id
|
||||
|
||||
new_path = '/' + '/'.join(path_components) + '/'
|
||||
res['named_url'] = new_path
|
||||
if getattr(obj, 'created_by', None):
|
||||
res['created_by'] = self.reverse('api:user_detail', kwargs={'pk': obj.created_by.pk})
|
||||
if getattr(obj, 'modified_by', None):
|
||||
@ -641,7 +646,7 @@ class BaseSerializer(serializers.ModelSerializer, metaclass=BaseSerializerMetacl
|
||||
v2.extend(e)
|
||||
else:
|
||||
v2.append(e)
|
||||
d[k] = list(map(force_text, v2))
|
||||
d[k] = list(map(force_str, v2))
|
||||
raise ValidationError(d)
|
||||
return attrs
|
||||
|
||||
@ -861,7 +866,7 @@ class UnifiedJobSerializer(BaseSerializer):
|
||||
if 'elapsed' in ret:
|
||||
if obj and obj.pk and obj.started and not obj.finished:
|
||||
td = now() - obj.started
|
||||
ret['elapsed'] = (td.microseconds + (td.seconds + td.days * 24 * 3600) * 10 ** 6) / (10 ** 6 * 1.0)
|
||||
ret['elapsed'] = (td.microseconds + (td.seconds + td.days * 24 * 3600) * 10**6) / (10**6 * 1.0)
|
||||
ret['elapsed'] = float(ret['elapsed'])
|
||||
# Because this string is saved in the db in the source language,
|
||||
# it must be marked for translation after it is pulled from the db, not when set
|
||||
@ -1259,6 +1264,12 @@ class OAuth2ApplicationSerializer(BaseSerializer):
|
||||
activity_stream=self.reverse('api:o_auth2_application_activity_stream_list', kwargs={'pk': obj.pk}),
|
||||
)
|
||||
)
|
||||
if obj.organization_id:
|
||||
res.update(
|
||||
dict(
|
||||
organization=self.reverse('api:organization_detail', kwargs={'pk': obj.organization_id}),
|
||||
)
|
||||
)
|
||||
return res
|
||||
|
||||
def get_modified(self, obj):
|
||||
@ -1596,7 +1607,6 @@ class ProjectUpdateSerializer(UnifiedJobSerializer, ProjectOptionsSerializer):
|
||||
|
||||
class ProjectUpdateDetailSerializer(ProjectUpdateSerializer):
|
||||
|
||||
host_status_counts = serializers.SerializerMethodField(help_text=_('A count of hosts uniquely assigned to each status.'))
|
||||
playbook_counts = serializers.SerializerMethodField(help_text=_('A count of all plays and tasks for the job run.'))
|
||||
|
||||
class Meta:
|
||||
@ -1611,14 +1621,6 @@ class ProjectUpdateDetailSerializer(ProjectUpdateSerializer):
|
||||
|
||||
return data
|
||||
|
||||
def get_host_status_counts(self, obj):
|
||||
try:
|
||||
counts = obj.project_update_events.only('event_data').get(event='playbook_on_stats').get_host_status_counts()
|
||||
except ProjectUpdateEvent.DoesNotExist:
|
||||
counts = {}
|
||||
|
||||
return counts
|
||||
|
||||
|
||||
class ProjectUpdateListSerializer(ProjectUpdateSerializer, UnifiedJobListSerializer):
|
||||
class Meta:
|
||||
@ -1639,7 +1641,25 @@ class BaseSerializerWithVariables(BaseSerializer):
|
||||
return vars_validate_or_raise(value)
|
||||
|
||||
|
||||
class InventorySerializer(BaseSerializerWithVariables):
|
||||
class LabelsListMixin(object):
|
||||
def _summary_field_labels(self, obj):
|
||||
label_list = [{'id': x.id, 'name': x.name} for x in obj.labels.all()[:10]]
|
||||
if has_model_field_prefetched(obj, 'labels'):
|
||||
label_ct = len(obj.labels.all())
|
||||
else:
|
||||
if len(label_list) < 10:
|
||||
label_ct = len(label_list)
|
||||
else:
|
||||
label_ct = obj.labels.count()
|
||||
return {'count': label_ct, 'results': label_list}
|
||||
|
||||
def get_summary_fields(self, obj):
|
||||
res = super(LabelsListMixin, self).get_summary_fields(obj)
|
||||
res['labels'] = self._summary_field_labels(obj)
|
||||
return res
|
||||
|
||||
|
||||
class InventorySerializer(LabelsListMixin, BaseSerializerWithVariables):
|
||||
show_capabilities = ['edit', 'delete', 'adhoc', 'copy']
|
||||
capabilities_prefetch = ['admin', 'adhoc', {'copy': 'organization.inventory_admin'}]
|
||||
|
||||
@ -1680,6 +1700,7 @@ class InventorySerializer(BaseSerializerWithVariables):
|
||||
object_roles=self.reverse('api:inventory_object_roles_list', kwargs={'pk': obj.pk}),
|
||||
instance_groups=self.reverse('api:inventory_instance_groups_list', kwargs={'pk': obj.pk}),
|
||||
copy=self.reverse('api:inventory_copy', kwargs={'pk': obj.pk}),
|
||||
labels=self.reverse('api:inventory_label_list', kwargs={'pk': obj.pk}),
|
||||
)
|
||||
)
|
||||
if obj.organization:
|
||||
@ -1695,7 +1716,7 @@ class InventorySerializer(BaseSerializerWithVariables):
|
||||
def validate_host_filter(self, host_filter):
|
||||
if host_filter:
|
||||
try:
|
||||
for match in JSONBField.get_lookups().keys():
|
||||
for match in models.JSONField.get_lookups().keys():
|
||||
if match == 'exact':
|
||||
# __exact is allowed
|
||||
continue
|
||||
@ -1824,11 +1845,11 @@ class HostSerializer(BaseSerializerWithVariables):
|
||||
if port < 1 or port > 65535:
|
||||
raise ValueError
|
||||
except ValueError:
|
||||
raise serializers.ValidationError(_(u'Invalid port specification: %s') % force_text(port))
|
||||
raise serializers.ValidationError(_(u'Invalid port specification: %s') % force_str(port))
|
||||
return name, port
|
||||
|
||||
def validate_name(self, value):
|
||||
name = force_text(value or '')
|
||||
name = force_str(value or '')
|
||||
# Validate here only, update in main validate method.
|
||||
host, port = self._get_host_port_from_name(name)
|
||||
return value
|
||||
@ -1842,13 +1863,13 @@ class HostSerializer(BaseSerializerWithVariables):
|
||||
return vars_validate_or_raise(value)
|
||||
|
||||
def validate(self, attrs):
|
||||
name = force_text(attrs.get('name', self.instance and self.instance.name or ''))
|
||||
name = force_str(attrs.get('name', self.instance and self.instance.name or ''))
|
||||
inventory = attrs.get('inventory', self.instance and self.instance.inventory or '')
|
||||
host, port = self._get_host_port_from_name(name)
|
||||
|
||||
if port:
|
||||
attrs['name'] = host
|
||||
variables = force_text(attrs.get('variables', self.instance and self.instance.variables or ''))
|
||||
variables = force_str(attrs.get('variables', self.instance and self.instance.variables or ''))
|
||||
vars_dict = parse_yaml_or_json(variables)
|
||||
vars_dict['ansible_ssh_port'] = port
|
||||
attrs['variables'] = json.dumps(vars_dict)
|
||||
@ -1921,7 +1942,7 @@ class GroupSerializer(BaseSerializerWithVariables):
|
||||
return res
|
||||
|
||||
def validate(self, attrs):
|
||||
name = force_text(attrs.get('name', self.instance and self.instance.name or ''))
|
||||
name = force_str(attrs.get('name', self.instance and self.instance.name or ''))
|
||||
inventory = attrs.get('inventory', self.instance and self.instance.inventory or '')
|
||||
if Host.objects.filter(name=name, inventory=inventory).exists():
|
||||
raise serializers.ValidationError(_('A Host with that name already exists.'))
|
||||
@ -2215,7 +2236,6 @@ class InventoryUpdateSerializer(UnifiedJobSerializer, InventorySourceOptionsSeri
|
||||
'source_project_update',
|
||||
'custom_virtualenv',
|
||||
'instance_group',
|
||||
'-controller_node',
|
||||
)
|
||||
|
||||
def get_related(self, obj):
|
||||
@ -2290,7 +2310,6 @@ class InventoryUpdateDetailSerializer(InventoryUpdateSerializer):
|
||||
class InventoryUpdateListSerializer(InventoryUpdateSerializer, UnifiedJobListSerializer):
|
||||
class Meta:
|
||||
model = InventoryUpdate
|
||||
fields = ('*', '-controller_node') # field removal undone by UJ serializer
|
||||
|
||||
|
||||
class InventoryUpdateCancelSerializer(InventoryUpdateSerializer):
|
||||
@ -2643,6 +2662,13 @@ class CredentialSerializer(BaseSerializer):
|
||||
|
||||
return credential_type
|
||||
|
||||
def validate_inputs(self, inputs):
|
||||
if self.instance and self.instance.credential_type.kind == "vault":
|
||||
if 'vault_id' in inputs and inputs['vault_id'] != self.instance.inputs['vault_id']:
|
||||
raise ValidationError(_('Vault IDs cannot be changed once they have been created.'))
|
||||
|
||||
return inputs
|
||||
|
||||
|
||||
class CredentialSerializerCreate(CredentialSerializer):
|
||||
|
||||
@ -2749,24 +2775,6 @@ class OrganizationCredentialSerializerCreate(CredentialSerializerCreate):
|
||||
fields = ('*', '-user', '-team')
|
||||
|
||||
|
||||
class LabelsListMixin(object):
|
||||
def _summary_field_labels(self, obj):
|
||||
label_list = [{'id': x.id, 'name': x.name} for x in obj.labels.all()[:10]]
|
||||
if has_model_field_prefetched(obj, 'labels'):
|
||||
label_ct = len(obj.labels.all())
|
||||
else:
|
||||
if len(label_list) < 10:
|
||||
label_ct = len(label_list)
|
||||
else:
|
||||
label_ct = obj.labels.count()
|
||||
return {'count': label_ct, 'results': label_list}
|
||||
|
||||
def get_summary_fields(self, obj):
|
||||
res = super(LabelsListMixin, self).get_summary_fields(obj)
|
||||
res['labels'] = self._summary_field_labels(obj)
|
||||
return res
|
||||
|
||||
|
||||
class JobOptionsSerializer(LabelsListMixin, BaseSerializer):
|
||||
class Meta:
|
||||
fields = (
|
||||
@ -2833,8 +2841,8 @@ class JobOptionsSerializer(LabelsListMixin, BaseSerializer):
|
||||
if not project:
|
||||
raise serializers.ValidationError({'project': _('This field is required.')})
|
||||
playbook_not_found = bool(
|
||||
(project and project.scm_type and (not project.allow_override) and playbook and force_text(playbook) not in project.playbook_files)
|
||||
or (project and not project.scm_type and playbook and force_text(playbook) not in project.playbooks) # manual
|
||||
(project and project.scm_type and (not project.allow_override) and playbook and force_str(playbook) not in project.playbook_files)
|
||||
or (project and not project.scm_type and playbook and force_str(playbook) not in project.playbooks) # manual
|
||||
)
|
||||
if playbook_not_found:
|
||||
raise serializers.ValidationError({'playbook': _('Playbook not found for project.')})
|
||||
@ -3095,7 +3103,6 @@ class JobSerializer(UnifiedJobSerializer, JobOptionsSerializer):
|
||||
|
||||
class JobDetailSerializer(JobSerializer):
|
||||
|
||||
host_status_counts = serializers.SerializerMethodField(help_text=_('A count of hosts uniquely assigned to each status.'))
|
||||
playbook_counts = serializers.SerializerMethodField(help_text=_('A count of all plays and tasks for the job run.'))
|
||||
custom_virtualenv = serializers.ReadOnlyField()
|
||||
|
||||
@ -3111,14 +3118,6 @@ class JobDetailSerializer(JobSerializer):
|
||||
|
||||
return data
|
||||
|
||||
def get_host_status_counts(self, obj):
|
||||
try:
|
||||
counts = obj.get_event_queryset().only('event_data').get(event='playbook_on_stats').get_host_status_counts()
|
||||
except JobEvent.DoesNotExist:
|
||||
counts = {}
|
||||
|
||||
return counts
|
||||
|
||||
|
||||
class JobCancelSerializer(BaseSerializer):
|
||||
|
||||
@ -3307,21 +3306,10 @@ class AdHocCommandSerializer(UnifiedJobSerializer):
|
||||
|
||||
|
||||
class AdHocCommandDetailSerializer(AdHocCommandSerializer):
|
||||
|
||||
host_status_counts = serializers.SerializerMethodField(help_text=_('A count of hosts uniquely assigned to each status.'))
|
||||
|
||||
class Meta:
|
||||
model = AdHocCommand
|
||||
fields = ('*', 'host_status_counts')
|
||||
|
||||
def get_host_status_counts(self, obj):
|
||||
try:
|
||||
counts = obj.ad_hoc_command_events.only('event_data').get(event='playbook_on_stats').get_host_status_counts()
|
||||
except AdHocCommandEvent.DoesNotExist:
|
||||
counts = {}
|
||||
|
||||
return counts
|
||||
|
||||
|
||||
class AdHocCommandCancelSerializer(AdHocCommandSerializer):
|
||||
|
||||
@ -3623,7 +3611,7 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
|
||||
job_tags = serializers.CharField(allow_blank=True, allow_null=True, required=False, default=None)
|
||||
limit = serializers.CharField(allow_blank=True, allow_null=True, required=False, default=None)
|
||||
skip_tags = serializers.CharField(allow_blank=True, allow_null=True, required=False, default=None)
|
||||
diff_mode = serializers.NullBooleanField(required=False, default=None)
|
||||
diff_mode = serializers.BooleanField(required=False, allow_null=True, default=None)
|
||||
verbosity = serializers.ChoiceField(allow_null=True, required=False, default=None, choices=VERBOSITY_CHOICES)
|
||||
exclude_errors = ()
|
||||
|
||||
@ -4490,7 +4478,10 @@ class NotificationTemplateSerializer(BaseSerializer):
|
||||
body = messages[event].get('body', {})
|
||||
if body:
|
||||
try:
|
||||
potential_body = json.loads(body)
|
||||
rendered_body = (
|
||||
sandbox.ImmutableSandboxedEnvironment(undefined=DescriptiveUndefined).from_string(body).render(JobNotificationMixin.context_stub())
|
||||
)
|
||||
potential_body = json.loads(rendered_body)
|
||||
if not isinstance(potential_body, dict):
|
||||
error_list.append(
|
||||
_("Webhook body for '{}' should be a json dictionary. Found type '{}'.".format(event, type(potential_body).__name__))
|
||||
@ -4633,61 +4624,60 @@ class SchedulePreviewSerializer(BaseSerializer):
|
||||
|
||||
# We reject rrules if:
|
||||
# - DTSTART is not include
|
||||
# - INTERVAL is not included
|
||||
# - SECONDLY is used
|
||||
# - TZID is used
|
||||
# - BYDAY prefixed with a number (MO is good but not 20MO)
|
||||
# - BYYEARDAY
|
||||
# - BYWEEKNO
|
||||
# - Multiple DTSTART or RRULE elements
|
||||
# - Can't contain both COUNT and UNTIL
|
||||
# - COUNT > 999
|
||||
# - Multiple DTSTART
|
||||
# - At least one of RRULE is not included
|
||||
# - EXDATE or RDATE is included
|
||||
# For any rule in the ruleset:
|
||||
# - INTERVAL is not included
|
||||
# - SECONDLY is used
|
||||
# - BYDAY prefixed with a number (MO is good but not 20MO)
|
||||
# - Can't contain both COUNT and UNTIL
|
||||
# - COUNT > 999
|
||||
def validate_rrule(self, value):
|
||||
rrule_value = value
|
||||
multi_by_month_day = r".*?BYMONTHDAY[\:\=][0-9]+,-*[0-9]+"
|
||||
multi_by_month = r".*?BYMONTH[\:\=][0-9]+,[0-9]+"
|
||||
by_day_with_numeric_prefix = r".*?BYDAY[\:\=][0-9]+[a-zA-Z]{2}"
|
||||
match_count = re.match(r".*?(COUNT\=[0-9]+)", rrule_value)
|
||||
match_multiple_dtstart = re.findall(r".*?(DTSTART(;[^:]+)?\:[0-9]+T[0-9]+Z?)", rrule_value)
|
||||
match_native_dtstart = re.findall(r".*?(DTSTART:[0-9]+T[0-9]+) ", rrule_value)
|
||||
match_multiple_rrule = re.findall(r".*?(RRULE\:)", rrule_value)
|
||||
match_multiple_rrule = re.findall(r".*?(RULE\:[^\s]*)", rrule_value)
|
||||
errors = []
|
||||
if not len(match_multiple_dtstart):
|
||||
raise serializers.ValidationError(_('Valid DTSTART required in rrule. Value should start with: DTSTART:YYYYMMDDTHHMMSSZ'))
|
||||
errors.append(_('Valid DTSTART required in rrule. Value should start with: DTSTART:YYYYMMDDTHHMMSSZ'))
|
||||
if len(match_native_dtstart):
|
||||
raise serializers.ValidationError(_('DTSTART cannot be a naive datetime. Specify ;TZINFO= or YYYYMMDDTHHMMSSZZ.'))
|
||||
errors.append(_('DTSTART cannot be a naive datetime. Specify ;TZINFO= or YYYYMMDDTHHMMSSZZ.'))
|
||||
if len(match_multiple_dtstart) > 1:
|
||||
raise serializers.ValidationError(_('Multiple DTSTART is not supported.'))
|
||||
if not len(match_multiple_rrule):
|
||||
raise serializers.ValidationError(_('RRULE required in rrule.'))
|
||||
if len(match_multiple_rrule) > 1:
|
||||
raise serializers.ValidationError(_('Multiple RRULE is not supported.'))
|
||||
if 'interval' not in rrule_value.lower():
|
||||
raise serializers.ValidationError(_('INTERVAL required in rrule.'))
|
||||
if 'secondly' in rrule_value.lower():
|
||||
raise serializers.ValidationError(_('SECONDLY is not supported.'))
|
||||
if re.match(multi_by_month_day, rrule_value):
|
||||
raise serializers.ValidationError(_('Multiple BYMONTHDAYs not supported.'))
|
||||
if re.match(multi_by_month, rrule_value):
|
||||
raise serializers.ValidationError(_('Multiple BYMONTHs not supported.'))
|
||||
if re.match(by_day_with_numeric_prefix, rrule_value):
|
||||
raise serializers.ValidationError(_("BYDAY with numeric prefix not supported."))
|
||||
if 'byyearday' in rrule_value.lower():
|
||||
raise serializers.ValidationError(_("BYYEARDAY not supported."))
|
||||
if 'byweekno' in rrule_value.lower():
|
||||
raise serializers.ValidationError(_("BYWEEKNO not supported."))
|
||||
if 'COUNT' in rrule_value and 'UNTIL' in rrule_value:
|
||||
raise serializers.ValidationError(_("RRULE may not contain both COUNT and UNTIL"))
|
||||
if match_count:
|
||||
count_val = match_count.groups()[0].strip().split("=")
|
||||
if int(count_val[1]) > 999:
|
||||
raise serializers.ValidationError(_("COUNT > 999 is unsupported."))
|
||||
errors.append(_('Multiple DTSTART is not supported.'))
|
||||
if "rrule:" not in rrule_value.lower():
|
||||
errors.append(_('One or more rule required in rrule.'))
|
||||
if "exdate:" in rrule_value.lower():
|
||||
raise serializers.ValidationError(_('EXDATE not allowed in rrule.'))
|
||||
if "rdate:" in rrule_value.lower():
|
||||
raise serializers.ValidationError(_('RDATE not allowed in rrule.'))
|
||||
for a_rule in match_multiple_rrule:
|
||||
if 'interval' not in a_rule.lower():
|
||||
errors.append("{0}: {1}".format(_('INTERVAL required in rrule'), a_rule))
|
||||
elif 'secondly' in a_rule.lower():
|
||||
errors.append("{0}: {1}".format(_('SECONDLY is not supported'), a_rule))
|
||||
if re.match(by_day_with_numeric_prefix, a_rule):
|
||||
errors.append("{0}: {1}".format(_("BYDAY with numeric prefix not supported"), a_rule))
|
||||
if 'COUNT' in a_rule and 'UNTIL' in a_rule:
|
||||
errors.append("{0}: {1}".format(_("RRULE may not contain both COUNT and UNTIL"), a_rule))
|
||||
match_count = re.match(r".*?(COUNT\=[0-9]+)", a_rule)
|
||||
if match_count:
|
||||
count_val = match_count.groups()[0].strip().split("=")
|
||||
if int(count_val[1]) > 999:
|
||||
errors.append("{0}: {1}".format(_("COUNT > 999 is unsupported"), a_rule))
|
||||
|
||||
try:
|
||||
Schedule.rrulestr(rrule_value)
|
||||
except Exception as e:
|
||||
import traceback
|
||||
|
||||
logger.error(traceback.format_exc())
|
||||
raise serializers.ValidationError(_("rrule parsing failed validation: {}").format(e))
|
||||
errors.append(_("rrule parsing failed validation: {}").format(e))
|
||||
|
||||
if errors:
|
||||
raise serializers.ValidationError(errors)
|
||||
|
||||
return value
|
||||
|
||||
|
||||
@ -4767,6 +4757,28 @@ class ScheduleSerializer(LaunchConfigurationBaseSerializer, SchedulePreviewSeria
|
||||
return super(ScheduleSerializer, self).validate(attrs)
|
||||
|
||||
|
||||
class InstanceLinkSerializer(BaseSerializer):
|
||||
class Meta:
|
||||
model = InstanceLink
|
||||
fields = ('source', 'target')
|
||||
|
||||
source = serializers.SlugRelatedField(slug_field="hostname", read_only=True)
|
||||
target = serializers.SlugRelatedField(slug_field="hostname", read_only=True)
|
||||
|
||||
|
||||
class InstanceNodeSerializer(BaseSerializer):
|
||||
class Meta:
|
||||
model = Instance
|
||||
fields = ('id', 'hostname', 'node_type', 'node_state')
|
||||
|
||||
node_state = serializers.SerializerMethodField()
|
||||
|
||||
def get_node_state(self, obj):
|
||||
if not obj.enabled:
|
||||
return "disabled"
|
||||
return "error" if obj.errors else "healthy"
|
||||
|
||||
|
||||
class InstanceSerializer(BaseSerializer):
|
||||
|
||||
consumed_capacity = serializers.SerializerMethodField()
|
||||
@ -4810,7 +4822,8 @@ class InstanceSerializer(BaseSerializer):
|
||||
res['jobs'] = self.reverse('api:instance_unified_jobs_list', kwargs={'pk': obj.pk})
|
||||
res['instance_groups'] = self.reverse('api:instance_instance_groups_list', kwargs={'pk': obj.pk})
|
||||
if self.context['request'].user.is_superuser or self.context['request'].user.is_system_auditor:
|
||||
res['health_check'] = self.reverse('api:instance_health_check', kwargs={'pk': obj.pk})
|
||||
if obj.node_type != 'hop':
|
||||
res['health_check'] = self.reverse('api:instance_health_check', kwargs={'pk': obj.pk})
|
||||
return res
|
||||
|
||||
def get_consumed_capacity(self, obj):
|
||||
@ -4822,6 +4835,11 @@ class InstanceSerializer(BaseSerializer):
|
||||
else:
|
||||
return float("{0:.2f}".format(((float(obj.capacity) - float(obj.consumed_capacity)) / (float(obj.capacity))) * 100))
|
||||
|
||||
def validate(self, attrs):
|
||||
if self.instance.node_type == 'hop':
|
||||
raise serializers.ValidationError(_('Hop node instances may not be changed.'))
|
||||
return attrs
|
||||
|
||||
|
||||
class InstanceHealthCheckSerializer(BaseSerializer):
|
||||
class Meta:
|
||||
@ -4834,7 +4852,6 @@ class InstanceGroupSerializer(BaseSerializer):
|
||||
|
||||
show_capabilities = ['edit', 'delete']
|
||||
|
||||
committed_capacity = serializers.SerializerMethodField()
|
||||
consumed_capacity = serializers.SerializerMethodField()
|
||||
percent_capacity_remaining = serializers.SerializerMethodField()
|
||||
jobs_running = serializers.IntegerField(
|
||||
@ -4883,7 +4900,6 @@ class InstanceGroupSerializer(BaseSerializer):
|
||||
"created",
|
||||
"modified",
|
||||
"capacity",
|
||||
"committed_capacity",
|
||||
"consumed_capacity",
|
||||
"percent_capacity_remaining",
|
||||
"jobs_running",
|
||||
@ -4908,6 +4924,9 @@ class InstanceGroupSerializer(BaseSerializer):
|
||||
return res
|
||||
|
||||
def validate_policy_instance_list(self, value):
|
||||
if self.instance and self.instance.name in [settings.DEFAULT_EXECUTION_QUEUE_NAME, settings.DEFAULT_CONTROL_PLANE_QUEUE_NAME]:
|
||||
if self.instance.policy_instance_list != value:
|
||||
raise serializers.ValidationError(_('%s instance group policy_instance_list may not be changed.' % self.instance.name))
|
||||
for instance_name in value:
|
||||
if value.count(instance_name) > 1:
|
||||
raise serializers.ValidationError(_('Duplicate entry {}.').format(instance_name))
|
||||
@ -4918,6 +4937,11 @@ class InstanceGroupSerializer(BaseSerializer):
|
||||
return value
|
||||
|
||||
def validate_policy_instance_percentage(self, value):
|
||||
if self.instance and self.instance.name in [settings.DEFAULT_EXECUTION_QUEUE_NAME, settings.DEFAULT_CONTROL_PLANE_QUEUE_NAME]:
|
||||
if value != self.instance.policy_instance_percentage:
|
||||
raise serializers.ValidationError(
|
||||
_('%s instance group policy_instance_percentage may not be changed from the initial value set by the installer.' % self.instance.name)
|
||||
)
|
||||
if value and self.instance and self.instance.is_container_group:
|
||||
raise serializers.ValidationError(_('Containerized instances may not be managed via the API'))
|
||||
return value
|
||||
@ -4936,6 +4960,13 @@ class InstanceGroupSerializer(BaseSerializer):
|
||||
|
||||
return value
|
||||
|
||||
def validate_is_container_group(self, value):
|
||||
if self.instance and self.instance.name in [settings.DEFAULT_EXECUTION_QUEUE_NAME, settings.DEFAULT_CONTROL_PLANE_QUEUE_NAME]:
|
||||
if value != self.instance.is_container_group:
|
||||
raise serializers.ValidationError(_('%s instance group is_container_group may not be changed.' % self.instance.name))
|
||||
|
||||
return value
|
||||
|
||||
def validate_credential(self, value):
|
||||
if value and not value.kubernetes:
|
||||
raise serializers.ValidationError(_('Only Kubernetes credentials can be associated with an Instance Group'))
|
||||
@ -4949,30 +4980,29 @@ class InstanceGroupSerializer(BaseSerializer):
|
||||
|
||||
return attrs
|
||||
|
||||
def get_capacity_dict(self):
|
||||
def get_ig_mgr(self):
|
||||
# Store capacity values (globally computed) in the context
|
||||
if 'capacity_map' not in self.context:
|
||||
ig_qs = None
|
||||
if 'task_manager_igs' not in self.context:
|
||||
instance_groups_queryset = None
|
||||
jobs_qs = UnifiedJob.objects.filter(status__in=('running', 'waiting'))
|
||||
if self.parent: # Is ListView:
|
||||
ig_qs = self.parent.instance
|
||||
self.context['capacity_map'] = InstanceGroup.objects.capacity_values(qs=ig_qs, tasks=jobs_qs, breakdown=True)
|
||||
return self.context['capacity_map']
|
||||
instance_groups_queryset = self.parent.instance
|
||||
|
||||
instances = TaskManagerInstances(jobs_qs)
|
||||
instance_groups = TaskManagerInstanceGroups(instances_by_hostname=instances, instance_groups_queryset=instance_groups_queryset)
|
||||
|
||||
self.context['task_manager_igs'] = instance_groups
|
||||
return self.context['task_manager_igs']
|
||||
|
||||
def get_consumed_capacity(self, obj):
|
||||
return self.get_capacity_dict()[obj.name]['running_capacity']
|
||||
|
||||
def get_committed_capacity(self, obj):
|
||||
return self.get_capacity_dict()[obj.name]['committed_capacity']
|
||||
ig_mgr = self.get_ig_mgr()
|
||||
return ig_mgr.get_consumed_capacity(obj.name)
|
||||
|
||||
def get_percent_capacity_remaining(self, obj):
|
||||
if not obj.capacity:
|
||||
return 0.0
|
||||
consumed = self.get_consumed_capacity(obj)
|
||||
if consumed >= obj.capacity:
|
||||
return 0.0
|
||||
else:
|
||||
return float("{0:.2f}".format(((float(obj.capacity) - float(consumed)) / (float(obj.capacity))) * 100))
|
||||
ig_mgr = self.get_ig_mgr()
|
||||
return float("{0:.2f}".format((float(ig_mgr.get_remaining_capacity(obj.name)) / (float(obj.capacity))) * 100))
|
||||
|
||||
def get_instances(self, obj):
|
||||
return obj.instances.count()
|
||||
@ -5050,7 +5080,7 @@ class ActivityStreamSerializer(BaseSerializer):
|
||||
try:
|
||||
return json.loads(obj.changes)
|
||||
except Exception:
|
||||
logger.warn("Error deserializing activity stream json changes")
|
||||
logger.warning("Error deserializing activity stream json changes")
|
||||
return {}
|
||||
|
||||
def get_object_association(self, obj):
|
||||
|
||||
102
awx/api/templates/api/job_job_events_children_summary.md
Normal file
102
awx/api/templates/api/job_job_events_children_summary.md
Normal file
@ -0,0 +1,102 @@
|
||||
# View a summary of children events

Special view to facilitate processing job output in the UI.
In order to collapse events and their children, the UI needs to know how
many children exist for a given event.
The UI also needs to know the order of the event (0 based index), which
usually matches the counter, but not always.
This view returns a JSON object where the key is the event counter, and the value
includes the number of children (and grandchildren) events.
Only events with children are included in the output.

## Example

e.g. Demo Job Template job
tuple(event counter, uuid, parent_uuid)

```
(1, '4598d19e-93b4-4e33-a0ae-b387a7348964', '')
(2, 'aae0d189-e3cb-102a-9f00-000000000006', '4598d19e-93b4-4e33-a0ae-b387a7348964')
(3, 'aae0d189-e3cb-102a-9f00-00000000000c', 'aae0d189-e3cb-102a-9f00-000000000006')
(4, 'f4194f14-e406-4124-8519-0fdb08b18f4b', 'aae0d189-e3cb-102a-9f00-00000000000c')
(5, '39f7ad99-dbf3-41e0-93f8-9999db4004f2', 'aae0d189-e3cb-102a-9f00-00000000000c')
(6, 'aae0d189-e3cb-102a-9f00-000000000008', 'aae0d189-e3cb-102a-9f00-000000000006')
(7, '39a49992-5ca4-4b6c-b178-e56d0b0333da', 'aae0d189-e3cb-102a-9f00-000000000008')
(8, '504f3b28-3ea8-4f6f-bd82-60cf8e807cc0', 'aae0d189-e3cb-102a-9f00-000000000008')
(9, 'a242be54-ebe6-4021-afab-f2878bff2e9f', '4598d19e-93b4-4e33-a0ae-b387a7348964')
```

output

```
{
    "1": {
        "rowNumber": 0,
        "numChildren": 8
    },
    "2": {
        "rowNumber": 1,
        "numChildren": 6
    },
    "3": {
        "rowNumber": 2,
        "numChildren": 2
    },
    "6": {
        "rowNumber": 5,
        "numChildren": 2
    }
}
"meta_event_nested_parent_uuid": {}
}
```

counter 1 is event 0, and has 8 children
counter 2 is event 1, and has 6 children
etc.

The UI also needs to be able to collapse over "meta" events -- events that
show up due to verbosity or warnings from the system while the play is running.
These events have a 0 level event, with no parent uuid.

```
playbook_on_start
verbose
playbook_on_play_start
playbook_on_task_start
runner_on_start <- level 3
verbose <- jump to level 0
verbose
runner_on_ok <- jump back to level 3
playbook_on_task_start
runner_on_start
runner_on_ok
verbose
verbose
playbook_on_stats
```

These verbose statements that fall in the middle of a series of children events
are problematic for the UI.
To help, this view will attempt to place the events into the hierarchy, without
the event level jumps.

```
playbook_on_start
verbose
playbook_on_play_start
playbook_on_task_start
runner_on_start <- A
verbose <- this maps to the uuid of A
verbose
runner_on_ok
playbook_on_task_start <- B
runner_on_start
runner_on_ok
verbose <- this maps to the uuid of B
verbose
playbook_on_stats
```

The output will include a JSON object where the key is the event counter,
and the value is the assigned nested uuid.
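As a rough illustration of the counting rule this new template describes, the example mapping above can be reproduced from the `(counter, uuid, parent_uuid)` tuples with a short Python sketch. This is editorial and not part of the commit: it is not the view's implementation, and all variable names below are ours; only the example data and the "children and grandchildren" rule come from the template.

```
# Sketch only: derive counter -> {rowNumber, numChildren} from the example tuples.
events = [
    (1, '4598d19e-93b4-4e33-a0ae-b387a7348964', ''),
    (2, 'aae0d189-e3cb-102a-9f00-000000000006', '4598d19e-93b4-4e33-a0ae-b387a7348964'),
    (3, 'aae0d189-e3cb-102a-9f00-00000000000c', 'aae0d189-e3cb-102a-9f00-000000000006'),
    (4, 'f4194f14-e406-4124-8519-0fdb08b18f4b', 'aae0d189-e3cb-102a-9f00-00000000000c'),
    (5, '39f7ad99-dbf3-41e0-93f8-9999db4004f2', 'aae0d189-e3cb-102a-9f00-00000000000c'),
    (6, 'aae0d189-e3cb-102a-9f00-000000000008', 'aae0d189-e3cb-102a-9f00-000000000006'),
    (7, '39a49992-5ca4-4b6c-b178-e56d0b0333da', 'aae0d189-e3cb-102a-9f00-000000000008'),
    (8, '504f3b28-3ea8-4f6f-bd82-60cf8e807cc0', 'aae0d189-e3cb-102a-9f00-000000000008'),
    (9, 'a242be54-ebe6-4021-afab-f2878bff2e9f', '4598d19e-93b4-4e33-a0ae-b387a7348964'),
]

counter_by_uuid = {uuid: counter for counter, uuid, _parent in events}
parent_by_counter = {counter: counter_by_uuid.get(parent) for counter, _uuid, parent in events}

summary = {}
for row_number, (counter, _uuid, _parent) in enumerate(events):
    # rowNumber is the 0-based position of the event in counter order
    summary.setdefault(counter, {'rowNumber': 0, 'numChildren': 0})['rowNumber'] = row_number
    # credit every ancestor of this event with one child/grandchild
    ancestor = parent_by_counter.get(counter)
    while ancestor is not None:
        summary.setdefault(ancestor, {'rowNumber': 0, 'numChildren': 0})['numChildren'] += 1
        ancestor = parent_by_counter.get(ancestor)

# only events that actually have children appear in the template's example output
print({c: v for c, v in summary.items() if v['numChildren']})
# {1: {'rowNumber': 0, 'numChildren': 8}, 2: {'rowNumber': 1, 'numChildren': 6},
#  3: {'rowNumber': 2, 'numChildren': 2}, 6: {'rowNumber': 5, 'numChildren': 2}}
```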
1
awx/api/templates/api/mesh_visualizer.md
Normal file
1
awx/api/templates/api/mesh_visualizer.md
Normal file
@ -0,0 +1 @@
|
||||
Make a GET request to this resource to obtain a list of all Receptor Nodes and their links.
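A minimal client-side sketch of such a request is shown below. It is illustrative only: the `/api/v2/mesh_visualizer/` path, the token, and the `nodes`/`links` keys are assumptions inferred from the instance node and link serializers added in this commit, not something this template documents.

```
import requests

# Sketch only; the endpoint path and the token are assumed for the example.
resp = requests.get(
    "https://awx.example.com/api/v2/mesh_visualizer/",
    headers={"Authorization": "Bearer <oauth2-token>"},
)
resp.raise_for_status()
data = resp.json()
# Expected to describe Receptor nodes and the links between them;
# treat the exact keys as an assumption.
print(data.get("nodes"), data.get("links"))
```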
@ -1,14 +1,14 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import ActivityStreamList, ActivityStreamDetail
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', ActivityStreamList.as_view(), name='activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', ActivityStreamDetail.as_view(), name='activity_stream_detail'),
|
||||
re_path(r'^$', ActivityStreamList.as_view(), name='activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', ActivityStreamDetail.as_view(), name='activity_stream_detail'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
AdHocCommandList,
|
||||
@ -16,14 +16,14 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', AdHocCommandList.as_view(), name='ad_hoc_command_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', AdHocCommandDetail.as_view(), name='ad_hoc_command_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/cancel/$', AdHocCommandCancel.as_view(), name='ad_hoc_command_cancel'),
|
||||
url(r'^(?P<pk>[0-9]+)/relaunch/$', AdHocCommandRelaunch.as_view(), name='ad_hoc_command_relaunch'),
|
||||
url(r'^(?P<pk>[0-9]+)/events/$', AdHocCommandAdHocCommandEventsList.as_view(), name='ad_hoc_command_ad_hoc_command_events_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', AdHocCommandActivityStreamList.as_view(), name='ad_hoc_command_activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/notifications/$', AdHocCommandNotificationsList.as_view(), name='ad_hoc_command_notifications_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/stdout/$', AdHocCommandStdout.as_view(), name='ad_hoc_command_stdout'),
|
||||
re_path(r'^$', AdHocCommandList.as_view(), name='ad_hoc_command_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', AdHocCommandDetail.as_view(), name='ad_hoc_command_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/cancel/$', AdHocCommandCancel.as_view(), name='ad_hoc_command_cancel'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/relaunch/$', AdHocCommandRelaunch.as_view(), name='ad_hoc_command_relaunch'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/events/$', AdHocCommandAdHocCommandEventsList.as_view(), name='ad_hoc_command_ad_hoc_command_events_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', AdHocCommandActivityStreamList.as_view(), name='ad_hoc_command_activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/notifications/$', AdHocCommandNotificationsList.as_view(), name='ad_hoc_command_notifications_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/stdout/$', AdHocCommandStdout.as_view(), name='ad_hoc_command_stdout'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,13 +1,13 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import AdHocCommandEventDetail
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^(?P<pk>[0-9]+)/$', AdHocCommandEventDetail.as_view(), name='ad_hoc_command_event_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', AdHocCommandEventDetail.as_view(), name='ad_hoc_command_event_detail'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
CredentialList,
|
||||
@ -18,16 +18,16 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', CredentialList.as_view(), name='credential_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', CredentialActivityStreamList.as_view(), name='credential_activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', CredentialDetail.as_view(), name='credential_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/access_list/$', CredentialAccessList.as_view(), name='credential_access_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/object_roles/$', CredentialObjectRolesList.as_view(), name='credential_object_roles_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/owner_users/$', CredentialOwnerUsersList.as_view(), name='credential_owner_users_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/owner_teams/$', CredentialOwnerTeamsList.as_view(), name='credential_owner_teams_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/copy/$', CredentialCopy.as_view(), name='credential_copy'),
|
||||
url(r'^(?P<pk>[0-9]+)/input_sources/$', CredentialInputSourceSubList.as_view(), name='credential_input_source_sublist'),
|
||||
url(r'^(?P<pk>[0-9]+)/test/$', CredentialExternalTest.as_view(), name='credential_external_test'),
|
||||
re_path(r'^$', CredentialList.as_view(), name='credential_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', CredentialActivityStreamList.as_view(), name='credential_activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', CredentialDetail.as_view(), name='credential_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/access_list/$', CredentialAccessList.as_view(), name='credential_access_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/object_roles/$', CredentialObjectRolesList.as_view(), name='credential_object_roles_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/owner_users/$', CredentialOwnerUsersList.as_view(), name='credential_owner_users_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/owner_teams/$', CredentialOwnerTeamsList.as_view(), name='credential_owner_teams_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/copy/$', CredentialCopy.as_view(), name='credential_copy'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/input_sources/$', CredentialInputSourceSubList.as_view(), name='credential_input_source_sublist'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/test/$', CredentialExternalTest.as_view(), name='credential_external_test'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,14 +1,14 @@
|
||||
# Copyright (c) 2019 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import CredentialInputSourceDetail, CredentialInputSourceList
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', CredentialInputSourceList.as_view(), name='credential_input_source_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', CredentialInputSourceDetail.as_view(), name='credential_input_source_detail'),
|
||||
re_path(r'^$', CredentialInputSourceList.as_view(), name='credential_input_source_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', CredentialInputSourceDetail.as_view(), name='credential_input_source_detail'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,17 +1,17 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import CredentialTypeList, CredentialTypeDetail, CredentialTypeCredentialList, CredentialTypeActivityStreamList, CredentialTypeExternalTest
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', CredentialTypeList.as_view(), name='credential_type_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', CredentialTypeDetail.as_view(), name='credential_type_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/credentials/$', CredentialTypeCredentialList.as_view(), name='credential_type_credential_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', CredentialTypeActivityStreamList.as_view(), name='credential_type_activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/test/$', CredentialTypeExternalTest.as_view(), name='credential_type_external_test'),
|
||||
re_path(r'^$', CredentialTypeList.as_view(), name='credential_type_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', CredentialTypeDetail.as_view(), name='credential_type_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/credentials/$', CredentialTypeCredentialList.as_view(), name='credential_type_credential_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', CredentialTypeActivityStreamList.as_view(), name='credential_type_activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/test/$', CredentialTypeExternalTest.as_view(), name='credential_type_external_test'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,4 +1,4 @@
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
ExecutionEnvironmentList,
|
||||
@ -10,11 +10,11 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', ExecutionEnvironmentList.as_view(), name='execution_environment_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', ExecutionEnvironmentDetail.as_view(), name='execution_environment_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/unified_job_templates/$', ExecutionEnvironmentJobTemplateList.as_view(), name='execution_environment_job_template_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/copy/$', ExecutionEnvironmentCopy.as_view(), name='execution_environment_copy'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', ExecutionEnvironmentActivityStreamList.as_view(), name='execution_environment_activity_stream_list'),
|
||||
re_path(r'^$', ExecutionEnvironmentList.as_view(), name='execution_environment_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', ExecutionEnvironmentDetail.as_view(), name='execution_environment_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/unified_job_templates/$', ExecutionEnvironmentJobTemplateList.as_view(), name='execution_environment_job_template_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/copy/$', ExecutionEnvironmentCopy.as_view(), name='execution_environment_copy'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', ExecutionEnvironmentActivityStreamList.as_view(), name='execution_environment_activity_stream_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    GroupList,
@ -20,18 +20,18 @@ from awx.api.views import (


urls = [
    url(r'^$', GroupList.as_view(), name='group_list'),
    url(r'^(?P<pk>[0-9]+)/$', GroupDetail.as_view(), name='group_detail'),
    url(r'^(?P<pk>[0-9]+)/children/$', GroupChildrenList.as_view(), name='group_children_list'),
    url(r'^(?P<pk>[0-9]+)/hosts/$', GroupHostsList.as_view(), name='group_hosts_list'),
    url(r'^(?P<pk>[0-9]+)/all_hosts/$', GroupAllHostsList.as_view(), name='group_all_hosts_list'),
    url(r'^(?P<pk>[0-9]+)/variable_data/$', GroupVariableData.as_view(), name='group_variable_data'),
    url(r'^(?P<pk>[0-9]+)/job_events/$', GroupJobEventsList.as_view(), name='group_job_events_list'),
    url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', GroupJobHostSummariesList.as_view(), name='group_job_host_summaries_list'),
    url(r'^(?P<pk>[0-9]+)/potential_children/$', GroupPotentialChildrenList.as_view(), name='group_potential_children_list'),
    url(r'^(?P<pk>[0-9]+)/activity_stream/$', GroupActivityStreamList.as_view(), name='group_activity_stream_list'),
    url(r'^(?P<pk>[0-9]+)/inventory_sources/$', GroupInventorySourcesList.as_view(), name='group_inventory_sources_list'),
    url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', GroupAdHocCommandsList.as_view(), name='group_ad_hoc_commands_list'),
    re_path(r'^$', GroupList.as_view(), name='group_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', GroupDetail.as_view(), name='group_detail'),
    re_path(r'^(?P<pk>[0-9]+)/children/$', GroupChildrenList.as_view(), name='group_children_list'),
    re_path(r'^(?P<pk>[0-9]+)/hosts/$', GroupHostsList.as_view(), name='group_hosts_list'),
    re_path(r'^(?P<pk>[0-9]+)/all_hosts/$', GroupAllHostsList.as_view(), name='group_all_hosts_list'),
    re_path(r'^(?P<pk>[0-9]+)/variable_data/$', GroupVariableData.as_view(), name='group_variable_data'),
    re_path(r'^(?P<pk>[0-9]+)/job_events/$', GroupJobEventsList.as_view(), name='group_job_events_list'),
    re_path(r'^(?P<pk>[0-9]+)/job_host_summaries/$', GroupJobHostSummariesList.as_view(), name='group_job_host_summaries_list'),
    re_path(r'^(?P<pk>[0-9]+)/potential_children/$', GroupPotentialChildrenList.as_view(), name='group_potential_children_list'),
    re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', GroupActivityStreamList.as_view(), name='group_activity_stream_list'),
    re_path(r'^(?P<pk>[0-9]+)/inventory_sources/$', GroupInventorySourcesList.as_view(), name='group_inventory_sources_list'),
    re_path(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', GroupAdHocCommandsList.as_view(), name='group_ad_hoc_commands_list'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    HostList,
@ -20,18 +20,18 @@ from awx.api.views import (


urls = [
    url(r'^$', HostList.as_view(), name='host_list'),
    url(r'^(?P<pk>[0-9]+)/$', HostDetail.as_view(), name='host_detail'),
    url(r'^(?P<pk>[0-9]+)/variable_data/$', HostVariableData.as_view(), name='host_variable_data'),
    url(r'^(?P<pk>[0-9]+)/groups/$', HostGroupsList.as_view(), name='host_groups_list'),
    url(r'^(?P<pk>[0-9]+)/all_groups/$', HostAllGroupsList.as_view(), name='host_all_groups_list'),
    url(r'^(?P<pk>[0-9]+)/job_events/', HostJobEventsList.as_view(), name='host_job_events_list'),
    url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', HostJobHostSummariesList.as_view(), name='host_job_host_summaries_list'),
    url(r'^(?P<pk>[0-9]+)/activity_stream/$', HostActivityStreamList.as_view(), name='host_activity_stream_list'),
    url(r'^(?P<pk>[0-9]+)/inventory_sources/$', HostInventorySourcesList.as_view(), name='host_inventory_sources_list'),
    url(r'^(?P<pk>[0-9]+)/smart_inventories/$', HostSmartInventoriesList.as_view(), name='host_smart_inventories_list'),
    url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', HostAdHocCommandsList.as_view(), name='host_ad_hoc_commands_list'),
    url(r'^(?P<pk>[0-9]+)/ad_hoc_command_events/$', HostAdHocCommandEventsList.as_view(), name='host_ad_hoc_command_events_list'),
    re_path(r'^$', HostList.as_view(), name='host_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', HostDetail.as_view(), name='host_detail'),
    re_path(r'^(?P<pk>[0-9]+)/variable_data/$', HostVariableData.as_view(), name='host_variable_data'),
    re_path(r'^(?P<pk>[0-9]+)/groups/$', HostGroupsList.as_view(), name='host_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/all_groups/$', HostAllGroupsList.as_view(), name='host_all_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/job_events/', HostJobEventsList.as_view(), name='host_job_events_list'),
    re_path(r'^(?P<pk>[0-9]+)/job_host_summaries/$', HostJobHostSummariesList.as_view(), name='host_job_host_summaries_list'),
    re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', HostActivityStreamList.as_view(), name='host_activity_stream_list'),
    re_path(r'^(?P<pk>[0-9]+)/inventory_sources/$', HostInventorySourcesList.as_view(), name='host_inventory_sources_list'),
    re_path(r'^(?P<pk>[0-9]+)/smart_inventories/$', HostSmartInventoriesList.as_view(), name='host_smart_inventories_list'),
    re_path(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', HostAdHocCommandsList.as_view(), name='host_ad_hoc_commands_list'),
    re_path(r'^(?P<pk>[0-9]+)/ad_hoc_command_events/$', HostAdHocCommandEventsList.as_view(), name='host_ad_hoc_command_events_list'),
]

__all__ = ['urls']
@ -1,17 +1,17 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import InstanceList, InstanceDetail, InstanceUnifiedJobsList, InstanceInstanceGroupsList, InstanceHealthCheck


urls = [
    url(r'^$', InstanceList.as_view(), name='instance_list'),
    url(r'^(?P<pk>[0-9]+)/$', InstanceDetail.as_view(), name='instance_detail'),
    url(r'^(?P<pk>[0-9]+)/jobs/$', InstanceUnifiedJobsList.as_view(), name='instance_unified_jobs_list'),
    url(r'^(?P<pk>[0-9]+)/instance_groups/$', InstanceInstanceGroupsList.as_view(), name='instance_instance_groups_list'),
    url(r'^(?P<pk>[0-9]+)/health_check/$', InstanceHealthCheck.as_view(), name='instance_health_check'),
    re_path(r'^$', InstanceList.as_view(), name='instance_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', InstanceDetail.as_view(), name='instance_detail'),
    re_path(r'^(?P<pk>[0-9]+)/jobs/$', InstanceUnifiedJobsList.as_view(), name='instance_unified_jobs_list'),
    re_path(r'^(?P<pk>[0-9]+)/instance_groups/$', InstanceInstanceGroupsList.as_view(), name='instance_instance_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/health_check/$', InstanceHealthCheck.as_view(), name='instance_health_check'),
]

__all__ = ['urls']
@ -1,16 +1,16 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import InstanceGroupList, InstanceGroupDetail, InstanceGroupUnifiedJobsList, InstanceGroupInstanceList


urls = [
    url(r'^$', InstanceGroupList.as_view(), name='instance_group_list'),
    url(r'^(?P<pk>[0-9]+)/$', InstanceGroupDetail.as_view(), name='instance_group_detail'),
    url(r'^(?P<pk>[0-9]+)/jobs/$', InstanceGroupUnifiedJobsList.as_view(), name='instance_group_unified_jobs_list'),
    url(r'^(?P<pk>[0-9]+)/instances/$', InstanceGroupInstanceList.as_view(), name='instance_group_instance_list'),
    re_path(r'^$', InstanceGroupList.as_view(), name='instance_group_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', InstanceGroupDetail.as_view(), name='instance_group_detail'),
    re_path(r'^(?P<pk>[0-9]+)/jobs/$', InstanceGroupUnifiedJobsList.as_view(), name='instance_group_unified_jobs_list'),
    re_path(r'^(?P<pk>[0-9]+)/instances/$', InstanceGroupInstanceList.as_view(), name='instance_group_instance_list'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    InventoryList,
@ -20,28 +20,30 @@ from awx.api.views import (
    InventoryAccessList,
    InventoryObjectRolesList,
    InventoryInstanceGroupsList,
    InventoryLabelList,
    InventoryCopy,
)


urls = [
    url(r'^$', InventoryList.as_view(), name='inventory_list'),
    url(r'^(?P<pk>[0-9]+)/$', InventoryDetail.as_view(), name='inventory_detail'),
    url(r'^(?P<pk>[0-9]+)/hosts/$', InventoryHostsList.as_view(), name='inventory_hosts_list'),
    url(r'^(?P<pk>[0-9]+)/groups/$', InventoryGroupsList.as_view(), name='inventory_groups_list'),
    url(r'^(?P<pk>[0-9]+)/root_groups/$', InventoryRootGroupsList.as_view(), name='inventory_root_groups_list'),
    url(r'^(?P<pk>[0-9]+)/variable_data/$', InventoryVariableData.as_view(), name='inventory_variable_data'),
    url(r'^(?P<pk>[0-9]+)/script/$', InventoryScriptView.as_view(), name='inventory_script_view'),
    url(r'^(?P<pk>[0-9]+)/tree/$', InventoryTreeView.as_view(), name='inventory_tree_view'),
    url(r'^(?P<pk>[0-9]+)/inventory_sources/$', InventoryInventorySourcesList.as_view(), name='inventory_inventory_sources_list'),
    url(r'^(?P<pk>[0-9]+)/update_inventory_sources/$', InventoryInventorySourcesUpdate.as_view(), name='inventory_inventory_sources_update'),
    url(r'^(?P<pk>[0-9]+)/activity_stream/$', InventoryActivityStreamList.as_view(), name='inventory_activity_stream_list'),
    url(r'^(?P<pk>[0-9]+)/job_templates/$', InventoryJobTemplateList.as_view(), name='inventory_job_template_list'),
    url(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', InventoryAdHocCommandsList.as_view(), name='inventory_ad_hoc_commands_list'),
    url(r'^(?P<pk>[0-9]+)/access_list/$', InventoryAccessList.as_view(), name='inventory_access_list'),
    url(r'^(?P<pk>[0-9]+)/object_roles/$', InventoryObjectRolesList.as_view(), name='inventory_object_roles_list'),
    url(r'^(?P<pk>[0-9]+)/instance_groups/$', InventoryInstanceGroupsList.as_view(), name='inventory_instance_groups_list'),
    url(r'^(?P<pk>[0-9]+)/copy/$', InventoryCopy.as_view(), name='inventory_copy'),
    re_path(r'^$', InventoryList.as_view(), name='inventory_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', InventoryDetail.as_view(), name='inventory_detail'),
    re_path(r'^(?P<pk>[0-9]+)/hosts/$', InventoryHostsList.as_view(), name='inventory_hosts_list'),
    re_path(r'^(?P<pk>[0-9]+)/groups/$', InventoryGroupsList.as_view(), name='inventory_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/root_groups/$', InventoryRootGroupsList.as_view(), name='inventory_root_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/variable_data/$', InventoryVariableData.as_view(), name='inventory_variable_data'),
    re_path(r'^(?P<pk>[0-9]+)/script/$', InventoryScriptView.as_view(), name='inventory_script_view'),
    re_path(r'^(?P<pk>[0-9]+)/tree/$', InventoryTreeView.as_view(), name='inventory_tree_view'),
    re_path(r'^(?P<pk>[0-9]+)/inventory_sources/$', InventoryInventorySourcesList.as_view(), name='inventory_inventory_sources_list'),
    re_path(r'^(?P<pk>[0-9]+)/update_inventory_sources/$', InventoryInventorySourcesUpdate.as_view(), name='inventory_inventory_sources_update'),
    re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', InventoryActivityStreamList.as_view(), name='inventory_activity_stream_list'),
    re_path(r'^(?P<pk>[0-9]+)/job_templates/$', InventoryJobTemplateList.as_view(), name='inventory_job_template_list'),
    re_path(r'^(?P<pk>[0-9]+)/ad_hoc_commands/$', InventoryAdHocCommandsList.as_view(), name='inventory_ad_hoc_commands_list'),
    re_path(r'^(?P<pk>[0-9]+)/access_list/$', InventoryAccessList.as_view(), name='inventory_access_list'),
    re_path(r'^(?P<pk>[0-9]+)/object_roles/$', InventoryObjectRolesList.as_view(), name='inventory_object_roles_list'),
    re_path(r'^(?P<pk>[0-9]+)/instance_groups/$', InventoryInstanceGroupsList.as_view(), name='inventory_instance_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/labels/$', InventoryLabelList.as_view(), name='inventory_label_list'),
    re_path(r'^(?P<pk>[0-9]+)/copy/$', InventoryCopy.as_view(), name='inventory_copy'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    InventorySourceList,
@ -20,26 +20,26 @@ from awx.api.views import (


urls = [
    url(r'^$', InventorySourceList.as_view(), name='inventory_source_list'),
    url(r'^(?P<pk>[0-9]+)/$', InventorySourceDetail.as_view(), name='inventory_source_detail'),
    url(r'^(?P<pk>[0-9]+)/update/$', InventorySourceUpdateView.as_view(), name='inventory_source_update_view'),
    url(r'^(?P<pk>[0-9]+)/inventory_updates/$', InventorySourceUpdatesList.as_view(), name='inventory_source_updates_list'),
    url(r'^(?P<pk>[0-9]+)/activity_stream/$', InventorySourceActivityStreamList.as_view(), name='inventory_source_activity_stream_list'),
    url(r'^(?P<pk>[0-9]+)/schedules/$', InventorySourceSchedulesList.as_view(), name='inventory_source_schedules_list'),
    url(r'^(?P<pk>[0-9]+)/credentials/$', InventorySourceCredentialsList.as_view(), name='inventory_source_credentials_list'),
    url(r'^(?P<pk>[0-9]+)/groups/$', InventorySourceGroupsList.as_view(), name='inventory_source_groups_list'),
    url(r'^(?P<pk>[0-9]+)/hosts/$', InventorySourceHostsList.as_view(), name='inventory_source_hosts_list'),
    url(
    re_path(r'^$', InventorySourceList.as_view(), name='inventory_source_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', InventorySourceDetail.as_view(), name='inventory_source_detail'),
    re_path(r'^(?P<pk>[0-9]+)/update/$', InventorySourceUpdateView.as_view(), name='inventory_source_update_view'),
    re_path(r'^(?P<pk>[0-9]+)/inventory_updates/$', InventorySourceUpdatesList.as_view(), name='inventory_source_updates_list'),
    re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', InventorySourceActivityStreamList.as_view(), name='inventory_source_activity_stream_list'),
    re_path(r'^(?P<pk>[0-9]+)/schedules/$', InventorySourceSchedulesList.as_view(), name='inventory_source_schedules_list'),
    re_path(r'^(?P<pk>[0-9]+)/credentials/$', InventorySourceCredentialsList.as_view(), name='inventory_source_credentials_list'),
    re_path(r'^(?P<pk>[0-9]+)/groups/$', InventorySourceGroupsList.as_view(), name='inventory_source_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/hosts/$', InventorySourceHostsList.as_view(), name='inventory_source_hosts_list'),
    re_path(
        r'^(?P<pk>[0-9]+)/notification_templates_started/$',
        InventorySourceNotificationTemplatesStartedList.as_view(),
        name='inventory_source_notification_templates_started_list',
    ),
    url(
    re_path(
        r'^(?P<pk>[0-9]+)/notification_templates_error/$',
        InventorySourceNotificationTemplatesErrorList.as_view(),
        name='inventory_source_notification_templates_error_list',
    ),
    url(
    re_path(
        r'^(?P<pk>[0-9]+)/notification_templates_success/$',
        InventorySourceNotificationTemplatesSuccessList.as_view(),
        name='inventory_source_notification_templates_success_list',
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    InventoryUpdateList,
@ -15,13 +15,13 @@ from awx.api.views import (


urls = [
    url(r'^$', InventoryUpdateList.as_view(), name='inventory_update_list'),
    url(r'^(?P<pk>[0-9]+)/$', InventoryUpdateDetail.as_view(), name='inventory_update_detail'),
    url(r'^(?P<pk>[0-9]+)/cancel/$', InventoryUpdateCancel.as_view(), name='inventory_update_cancel'),
    url(r'^(?P<pk>[0-9]+)/stdout/$', InventoryUpdateStdout.as_view(), name='inventory_update_stdout'),
    url(r'^(?P<pk>[0-9]+)/notifications/$', InventoryUpdateNotificationsList.as_view(), name='inventory_update_notifications_list'),
    url(r'^(?P<pk>[0-9]+)/credentials/$', InventoryUpdateCredentialsList.as_view(), name='inventory_update_credentials_list'),
    url(r'^(?P<pk>[0-9]+)/events/$', InventoryUpdateEventsList.as_view(), name='inventory_update_events_list'),
    re_path(r'^$', InventoryUpdateList.as_view(), name='inventory_update_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', InventoryUpdateDetail.as_view(), name='inventory_update_detail'),
    re_path(r'^(?P<pk>[0-9]+)/cancel/$', InventoryUpdateCancel.as_view(), name='inventory_update_cancel'),
    re_path(r'^(?P<pk>[0-9]+)/stdout/$', InventoryUpdateStdout.as_view(), name='inventory_update_stdout'),
    re_path(r'^(?P<pk>[0-9]+)/notifications/$', InventoryUpdateNotificationsList.as_view(), name='inventory_update_notifications_list'),
    re_path(r'^(?P<pk>[0-9]+)/credentials/$', InventoryUpdateCredentialsList.as_view(), name='inventory_update_credentials_list'),
    re_path(r'^(?P<pk>[0-9]+)/events/$', InventoryUpdateEventsList.as_view(), name='inventory_update_events_list'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    JobList,
@ -10,6 +10,7 @@ from awx.api.views import (
    JobRelaunch,
    JobCreateSchedule,
    JobJobHostSummariesList,
    JobJobEventsChildrenSummary,
    JobJobEventsList,
    JobActivityStreamList,
    JobStdout,
@ -20,18 +21,19 @@ from awx.api.views import (


urls = [
    url(r'^$', JobList.as_view(), name='job_list'),
    url(r'^(?P<pk>[0-9]+)/$', JobDetail.as_view(), name='job_detail'),
    url(r'^(?P<pk>[0-9]+)/cancel/$', JobCancel.as_view(), name='job_cancel'),
    url(r'^(?P<pk>[0-9]+)/relaunch/$', JobRelaunch.as_view(), name='job_relaunch'),
    url(r'^(?P<pk>[0-9]+)/create_schedule/$', JobCreateSchedule.as_view(), name='job_create_schedule'),
    url(r'^(?P<pk>[0-9]+)/job_host_summaries/$', JobJobHostSummariesList.as_view(), name='job_job_host_summaries_list'),
    url(r'^(?P<pk>[0-9]+)/job_events/$', JobJobEventsList.as_view(), name='job_job_events_list'),
    url(r'^(?P<pk>[0-9]+)/activity_stream/$', JobActivityStreamList.as_view(), name='job_activity_stream_list'),
    url(r'^(?P<pk>[0-9]+)/stdout/$', JobStdout.as_view(), name='job_stdout'),
    url(r'^(?P<pk>[0-9]+)/notifications/$', JobNotificationsList.as_view(), name='job_notifications_list'),
    url(r'^(?P<pk>[0-9]+)/labels/$', JobLabelList.as_view(), name='job_label_list'),
    url(r'^(?P<pk>[0-9]+)/$', JobHostSummaryDetail.as_view(), name='job_host_summary_detail'),
    re_path(r'^$', JobList.as_view(), name='job_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', JobDetail.as_view(), name='job_detail'),
    re_path(r'^(?P<pk>[0-9]+)/cancel/$', JobCancel.as_view(), name='job_cancel'),
    re_path(r'^(?P<pk>[0-9]+)/relaunch/$', JobRelaunch.as_view(), name='job_relaunch'),
    re_path(r'^(?P<pk>[0-9]+)/create_schedule/$', JobCreateSchedule.as_view(), name='job_create_schedule'),
    re_path(r'^(?P<pk>[0-9]+)/job_host_summaries/$', JobJobHostSummariesList.as_view(), name='job_job_host_summaries_list'),
    re_path(r'^(?P<pk>[0-9]+)/job_events/$', JobJobEventsList.as_view(), name='job_job_events_list'),
    re_path(r'^(?P<pk>[0-9]+)/job_events/children_summary/$', JobJobEventsChildrenSummary.as_view(), name='job_job_events_children_summary'),
    re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', JobActivityStreamList.as_view(), name='job_activity_stream_list'),
    re_path(r'^(?P<pk>[0-9]+)/stdout/$', JobStdout.as_view(), name='job_stdout'),
    re_path(r'^(?P<pk>[0-9]+)/notifications/$', JobNotificationsList.as_view(), name='job_notifications_list'),
    re_path(r'^(?P<pk>[0-9]+)/labels/$', JobLabelList.as_view(), name='job_label_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', JobHostSummaryDetail.as_view(), name='job_host_summary_detail'),
]

__all__ = ['urls']
@ -1,13 +1,13 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import JobEventDetail, JobEventChildrenList

urls = [
    url(r'^(?P<pk>[0-9]+)/$', JobEventDetail.as_view(), name='job_event_detail'),
    url(r'^(?P<pk>[0-9]+)/children/$', JobEventChildrenList.as_view(), name='job_event_children_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', JobEventDetail.as_view(), name='job_event_detail'),
    re_path(r'^(?P<pk>[0-9]+)/children/$', JobEventChildrenList.as_view(), name='job_event_children_list'),
]

__all__ = ['urls']
@ -1,11 +1,11 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import JobHostSummaryDetail


urls = [url(r'^(?P<pk>[0-9]+)/$', JobHostSummaryDetail.as_view(), name='job_host_summary_detail')]
urls = [re_path(r'^(?P<pk>[0-9]+)/$', JobHostSummaryDetail.as_view(), name='job_host_summary_detail')]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import include, url
from django.urls import include, re_path

from awx.api.views import (
    JobTemplateList,
@ -25,36 +25,36 @@ from awx.api.views import (


urls = [
    url(r'^$', JobTemplateList.as_view(), name='job_template_list'),
    url(r'^(?P<pk>[0-9]+)/$', JobTemplateDetail.as_view(), name='job_template_detail'),
    url(r'^(?P<pk>[0-9]+)/launch/$', JobTemplateLaunch.as_view(), name='job_template_launch'),
    url(r'^(?P<pk>[0-9]+)/jobs/$', JobTemplateJobsList.as_view(), name='job_template_jobs_list'),
    url(r'^(?P<pk>[0-9]+)/slice_workflow_jobs/$', JobTemplateSliceWorkflowJobsList.as_view(), name='job_template_slice_workflow_jobs_list'),
    url(r'^(?P<pk>[0-9]+)/callback/$', JobTemplateCallback.as_view(), name='job_template_callback'),
    url(r'^(?P<pk>[0-9]+)/schedules/$', JobTemplateSchedulesList.as_view(), name='job_template_schedules_list'),
    url(r'^(?P<pk>[0-9]+)/survey_spec/$', JobTemplateSurveySpec.as_view(), name='job_template_survey_spec'),
    url(r'^(?P<pk>[0-9]+)/activity_stream/$', JobTemplateActivityStreamList.as_view(), name='job_template_activity_stream_list'),
    url(
    re_path(r'^$', JobTemplateList.as_view(), name='job_template_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', JobTemplateDetail.as_view(), name='job_template_detail'),
    re_path(r'^(?P<pk>[0-9]+)/launch/$', JobTemplateLaunch.as_view(), name='job_template_launch'),
    re_path(r'^(?P<pk>[0-9]+)/jobs/$', JobTemplateJobsList.as_view(), name='job_template_jobs_list'),
    re_path(r'^(?P<pk>[0-9]+)/slice_workflow_jobs/$', JobTemplateSliceWorkflowJobsList.as_view(), name='job_template_slice_workflow_jobs_list'),
    re_path(r'^(?P<pk>[0-9]+)/callback/$', JobTemplateCallback.as_view(), name='job_template_callback'),
    re_path(r'^(?P<pk>[0-9]+)/schedules/$', JobTemplateSchedulesList.as_view(), name='job_template_schedules_list'),
    re_path(r'^(?P<pk>[0-9]+)/survey_spec/$', JobTemplateSurveySpec.as_view(), name='job_template_survey_spec'),
    re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', JobTemplateActivityStreamList.as_view(), name='job_template_activity_stream_list'),
    re_path(
        r'^(?P<pk>[0-9]+)/notification_templates_started/$',
        JobTemplateNotificationTemplatesStartedList.as_view(),
        name='job_template_notification_templates_started_list',
    ),
    url(
    re_path(
        r'^(?P<pk>[0-9]+)/notification_templates_error/$',
        JobTemplateNotificationTemplatesErrorList.as_view(),
        name='job_template_notification_templates_error_list',
    ),
    url(
    re_path(
        r'^(?P<pk>[0-9]+)/notification_templates_success/$',
        JobTemplateNotificationTemplatesSuccessList.as_view(),
        name='job_template_notification_templates_success_list',
    ),
    url(r'^(?P<pk>[0-9]+)/instance_groups/$', JobTemplateInstanceGroupsList.as_view(), name='job_template_instance_groups_list'),
    url(r'^(?P<pk>[0-9]+)/access_list/$', JobTemplateAccessList.as_view(), name='job_template_access_list'),
    url(r'^(?P<pk>[0-9]+)/object_roles/$', JobTemplateObjectRolesList.as_view(), name='job_template_object_roles_list'),
    url(r'^(?P<pk>[0-9]+)/labels/$', JobTemplateLabelList.as_view(), name='job_template_label_list'),
    url(r'^(?P<pk>[0-9]+)/copy/$', JobTemplateCopy.as_view(), name='job_template_copy'),
    url(r'^(?P<pk>[0-9]+)/', include('awx.api.urls.webhooks'), {'model_kwarg': 'job_templates'}),
    re_path(r'^(?P<pk>[0-9]+)/instance_groups/$', JobTemplateInstanceGroupsList.as_view(), name='job_template_instance_groups_list'),
    re_path(r'^(?P<pk>[0-9]+)/access_list/$', JobTemplateAccessList.as_view(), name='job_template_access_list'),
    re_path(r'^(?P<pk>[0-9]+)/object_roles/$', JobTemplateObjectRolesList.as_view(), name='job_template_object_roles_list'),
    re_path(r'^(?P<pk>[0-9]+)/labels/$', JobTemplateLabelList.as_view(), name='job_template_label_list'),
    re_path(r'^(?P<pk>[0-9]+)/copy/$', JobTemplateCopy.as_view(), name='job_template_copy'),
    re_path(r'^(?P<pk>[0-9]+)/', include('awx.api.urls.webhooks'), {'model_kwarg': 'job_templates'}),
]

__all__ = ['urls']
@ -1,11 +1,11 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import LabelList, LabelDetail


urls = [url(r'^$', LabelList.as_view(), name='label_list'), url(r'^(?P<pk>[0-9]+)/$', LabelDetail.as_view(), name='label_detail')]
urls = [re_path(r'^$', LabelList.as_view(), name='label_list'), re_path(r'^(?P<pk>[0-9]+)/$', LabelDetail.as_view(), name='label_detail')]

__all__ = ['urls']
@ -1,11 +1,14 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import NotificationList, NotificationDetail


urls = [url(r'^$', NotificationList.as_view(), name='notification_list'), url(r'^(?P<pk>[0-9]+)/$', NotificationDetail.as_view(), name='notification_detail')]
urls = [
    re_path(r'^$', NotificationList.as_view(), name='notification_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', NotificationDetail.as_view(), name='notification_detail'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    NotificationTemplateList,
@ -13,11 +13,11 @@ from awx.api.views import (


urls = [
    url(r'^$', NotificationTemplateList.as_view(), name='notification_template_list'),
    url(r'^(?P<pk>[0-9]+)/$', NotificationTemplateDetail.as_view(), name='notification_template_detail'),
    url(r'^(?P<pk>[0-9]+)/test/$', NotificationTemplateTest.as_view(), name='notification_template_test'),
    url(r'^(?P<pk>[0-9]+)/notifications/$', NotificationTemplateNotificationList.as_view(), name='notification_template_notification_list'),
    url(r'^(?P<pk>[0-9]+)/copy/$', NotificationTemplateCopy.as_view(), name='notification_template_copy'),
    re_path(r'^$', NotificationTemplateList.as_view(), name='notification_template_list'),
    re_path(r'^(?P<pk>[0-9]+)/$', NotificationTemplateDetail.as_view(), name='notification_template_detail'),
    re_path(r'^(?P<pk>[0-9]+)/test/$', NotificationTemplateTest.as_view(), name='notification_template_test'),
    re_path(r'^(?P<pk>[0-9]+)/notifications/$', NotificationTemplateNotificationList.as_view(), name='notification_template_notification_list'),
    re_path(r'^(?P<pk>[0-9]+)/copy/$', NotificationTemplateCopy.as_view(), name='notification_template_copy'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
    OAuth2ApplicationList,
@ -15,13 +15,13 @@ from awx.api.views import (


urls = [
    url(r'^applications/$', OAuth2ApplicationList.as_view(), name='o_auth2_application_list'),
    url(r'^applications/(?P<pk>[0-9]+)/$', OAuth2ApplicationDetail.as_view(), name='o_auth2_application_detail'),
    url(r'^applications/(?P<pk>[0-9]+)/tokens/$', ApplicationOAuth2TokenList.as_view(), name='o_auth2_application_token_list'),
    url(r'^applications/(?P<pk>[0-9]+)/activity_stream/$', OAuth2ApplicationActivityStreamList.as_view(), name='o_auth2_application_activity_stream_list'),
    url(r'^tokens/$', OAuth2TokenList.as_view(), name='o_auth2_token_list'),
    url(r'^tokens/(?P<pk>[0-9]+)/$', OAuth2TokenDetail.as_view(), name='o_auth2_token_detail'),
    url(r'^tokens/(?P<pk>[0-9]+)/activity_stream/$', OAuth2TokenActivityStreamList.as_view(), name='o_auth2_token_activity_stream_list'),
    re_path(r'^applications/$', OAuth2ApplicationList.as_view(), name='o_auth2_application_list'),
    re_path(r'^applications/(?P<pk>[0-9]+)/$', OAuth2ApplicationDetail.as_view(), name='o_auth2_application_detail'),
    re_path(r'^applications/(?P<pk>[0-9]+)/tokens/$', ApplicationOAuth2TokenList.as_view(), name='o_auth2_application_token_list'),
    re_path(r'^applications/(?P<pk>[0-9]+)/activity_stream/$', OAuth2ApplicationActivityStreamList.as_view(), name='o_auth2_application_activity_stream_list'),
    re_path(r'^tokens/$', OAuth2TokenList.as_view(), name='o_auth2_token_list'),
    re_path(r'^tokens/(?P<pk>[0-9]+)/$', OAuth2TokenDetail.as_view(), name='o_auth2_token_detail'),
    re_path(r'^tokens/(?P<pk>[0-9]+)/activity_stream/$', OAuth2TokenActivityStreamList.as_view(), name='o_auth2_token_activity_stream_list'),
]

__all__ = ['urls']
@ -4,7 +4,7 @@ from datetime import timedelta

from django.utils.timezone import now
from django.conf import settings
from django.conf.urls import url
from django.urls import re_path

from oauthlib import oauth2
from oauth2_provider import views
@ -35,10 +35,10 @@ class TokenView(views.TokenView):


urls = [
    url(r'^$', ApiOAuthAuthorizationRootView.as_view(), name='oauth_authorization_root_view'),
    url(r"^authorize/$", views.AuthorizationView.as_view(), name="authorize"),
    url(r"^token/$", TokenView.as_view(), name="token"),
    url(r"^revoke_token/$", views.RevokeTokenView.as_view(), name="revoke-token"),
    re_path(r'^$', ApiOAuthAuthorizationRootView.as_view(), name='oauth_authorization_root_view'),
    re_path(r"^authorize/$", views.AuthorizationView.as_view(), name="authorize"),
    re_path(r"^token/$", TokenView.as_view(), name="token"),
    re_path(r"^revoke_token/$", views.RevokeTokenView.as_view(), name="revoke-token"),
]
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
OrganizationList,
|
||||
@ -30,44 +30,44 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', OrganizationList.as_view(), name='organization_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', OrganizationDetail.as_view(), name='organization_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/users/$', OrganizationUsersList.as_view(), name='organization_users_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/admins/$', OrganizationAdminsList.as_view(), name='organization_admins_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/inventories/$', OrganizationInventoriesList.as_view(), name='organization_inventories_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/execution_environments/$', OrganizationExecutionEnvironmentsList.as_view(), name='organization_execution_environments_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/projects/$', OrganizationProjectsList.as_view(), name='organization_projects_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/job_templates/$', OrganizationJobTemplatesList.as_view(), name='organization_job_templates_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/workflow_job_templates/$', OrganizationWorkflowJobTemplatesList.as_view(), name='organization_workflow_job_templates_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/teams/$', OrganizationTeamsList.as_view(), name='organization_teams_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/credentials/$', OrganizationCredentialList.as_view(), name='organization_credential_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', OrganizationActivityStreamList.as_view(), name='organization_activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/notification_templates/$', OrganizationNotificationTemplatesList.as_view(), name='organization_notification_templates_list'),
|
||||
url(
|
||||
re_path(r'^$', OrganizationList.as_view(), name='organization_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', OrganizationDetail.as_view(), name='organization_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/users/$', OrganizationUsersList.as_view(), name='organization_users_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/admins/$', OrganizationAdminsList.as_view(), name='organization_admins_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/inventories/$', OrganizationInventoriesList.as_view(), name='organization_inventories_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/execution_environments/$', OrganizationExecutionEnvironmentsList.as_view(), name='organization_execution_environments_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/projects/$', OrganizationProjectsList.as_view(), name='organization_projects_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/job_templates/$', OrganizationJobTemplatesList.as_view(), name='organization_job_templates_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/workflow_job_templates/$', OrganizationWorkflowJobTemplatesList.as_view(), name='organization_workflow_job_templates_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/teams/$', OrganizationTeamsList.as_view(), name='organization_teams_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/credentials/$', OrganizationCredentialList.as_view(), name='organization_credential_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', OrganizationActivityStreamList.as_view(), name='organization_activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/notification_templates/$', OrganizationNotificationTemplatesList.as_view(), name='organization_notification_templates_list'),
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_started/$',
|
||||
OrganizationNotificationTemplatesStartedList.as_view(),
|
||||
name='organization_notification_templates_started_list',
|
||||
),
|
||||
url(
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_error/$',
|
||||
OrganizationNotificationTemplatesErrorList.as_view(),
|
||||
name='organization_notification_templates_error_list',
|
||||
),
|
||||
url(
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_success/$',
|
||||
OrganizationNotificationTemplatesSuccessList.as_view(),
|
||||
name='organization_notification_templates_success_list',
|
||||
),
|
||||
url(
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_approvals/$',
|
||||
OrganizationNotificationTemplatesApprovalList.as_view(),
|
||||
name='organization_notification_templates_approvals_list',
|
||||
),
|
||||
url(r'^(?P<pk>[0-9]+)/instance_groups/$', OrganizationInstanceGroupsList.as_view(), name='organization_instance_groups_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/galaxy_credentials/$', OrganizationGalaxyCredentialsList.as_view(), name='organization_galaxy_credentials_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/object_roles/$', OrganizationObjectRolesList.as_view(), name='organization_object_roles_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/access_list/$', OrganizationAccessList.as_view(), name='organization_access_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/applications/$', OrganizationApplicationList.as_view(), name='organization_applications_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/instance_groups/$', OrganizationInstanceGroupsList.as_view(), name='organization_instance_groups_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/galaxy_credentials/$', OrganizationGalaxyCredentialsList.as_view(), name='organization_galaxy_credentials_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/object_roles/$', OrganizationObjectRolesList.as_view(), name='organization_object_roles_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/access_list/$', OrganizationAccessList.as_view(), name='organization_access_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/applications/$', OrganizationApplicationList.as_view(), name='organization_applications_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
ProjectList,
|
||||
@ -24,30 +24,32 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', ProjectList.as_view(), name='project_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', ProjectDetail.as_view(), name='project_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/playbooks/$', ProjectPlaybooks.as_view(), name='project_playbooks'),
|
||||
url(r'^(?P<pk>[0-9]+)/inventories/$', ProjectInventories.as_view(), name='project_inventories'),
|
||||
url(r'^(?P<pk>[0-9]+)/scm_inventory_sources/$', ProjectScmInventorySources.as_view(), name='project_scm_inventory_sources'),
|
||||
url(r'^(?P<pk>[0-9]+)/teams/$', ProjectTeamsList.as_view(), name='project_teams_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/update/$', ProjectUpdateView.as_view(), name='project_update_view'),
|
||||
url(r'^(?P<pk>[0-9]+)/project_updates/$', ProjectUpdatesList.as_view(), name='project_updates_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', ProjectActivityStreamList.as_view(), name='project_activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/schedules/$', ProjectSchedulesList.as_view(), name='project_schedules_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/notification_templates_error/$', ProjectNotificationTemplatesErrorList.as_view(), name='project_notification_templates_error_list'),
|
||||
url(
|
||||
re_path(r'^$', ProjectList.as_view(), name='project_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', ProjectDetail.as_view(), name='project_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/playbooks/$', ProjectPlaybooks.as_view(), name='project_playbooks'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/inventories/$', ProjectInventories.as_view(), name='project_inventories'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/scm_inventory_sources/$', ProjectScmInventorySources.as_view(), name='project_scm_inventory_sources'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/teams/$', ProjectTeamsList.as_view(), name='project_teams_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/update/$', ProjectUpdateView.as_view(), name='project_update_view'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/project_updates/$', ProjectUpdatesList.as_view(), name='project_updates_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', ProjectActivityStreamList.as_view(), name='project_activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/schedules/$', ProjectSchedulesList.as_view(), name='project_schedules_list'),
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_error/$', ProjectNotificationTemplatesErrorList.as_view(), name='project_notification_templates_error_list'
|
||||
),
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_success/$',
|
||||
ProjectNotificationTemplatesSuccessList.as_view(),
|
||||
name='project_notification_templates_success_list',
|
||||
),
|
||||
url(
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_started/$',
|
||||
ProjectNotificationTemplatesStartedList.as_view(),
|
||||
name='project_notification_templates_started_list',
|
||||
),
|
||||
url(r'^(?P<pk>[0-9]+)/object_roles/$', ProjectObjectRolesList.as_view(), name='project_object_roles_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/access_list/$', ProjectAccessList.as_view(), name='project_access_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/copy/$', ProjectCopy.as_view(), name='project_copy'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/object_roles/$', ProjectObjectRolesList.as_view(), name='project_object_roles_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/access_list/$', ProjectAccessList.as_view(), name='project_access_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/copy/$', ProjectCopy.as_view(), name='project_copy'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
ProjectUpdateList,
|
||||
@ -15,13 +15,13 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', ProjectUpdateList.as_view(), name='project_update_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', ProjectUpdateDetail.as_view(), name='project_update_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/cancel/$', ProjectUpdateCancel.as_view(), name='project_update_cancel'),
|
||||
url(r'^(?P<pk>[0-9]+)/stdout/$', ProjectUpdateStdout.as_view(), name='project_update_stdout'),
|
||||
url(r'^(?P<pk>[0-9]+)/scm_inventory_updates/$', ProjectUpdateScmInventoryUpdates.as_view(), name='project_update_scm_inventory_updates'),
|
||||
url(r'^(?P<pk>[0-9]+)/notifications/$', ProjectUpdateNotificationsList.as_view(), name='project_update_notifications_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/events/$', ProjectUpdateEventsList.as_view(), name='project_update_events_list'),
|
||||
re_path(r'^$', ProjectUpdateList.as_view(), name='project_update_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', ProjectUpdateDetail.as_view(), name='project_update_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/cancel/$', ProjectUpdateCancel.as_view(), name='project_update_cancel'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/stdout/$', ProjectUpdateStdout.as_view(), name='project_update_stdout'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/scm_inventory_updates/$', ProjectUpdateScmInventoryUpdates.as_view(), name='project_update_scm_inventory_updates'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/notifications/$', ProjectUpdateNotificationsList.as_view(), name='project_update_notifications_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/events/$', ProjectUpdateEventsList.as_view(), name='project_update_events_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,18 +1,18 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import RoleList, RoleDetail, RoleUsersList, RoleTeamsList, RoleParentsList, RoleChildrenList
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', RoleList.as_view(), name='role_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', RoleDetail.as_view(), name='role_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/users/$', RoleUsersList.as_view(), name='role_users_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/teams/$', RoleTeamsList.as_view(), name='role_teams_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/parents/$', RoleParentsList.as_view(), name='role_parents_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/children/$', RoleChildrenList.as_view(), name='role_children_list'),
|
||||
re_path(r'^$', RoleList.as_view(), name='role_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', RoleDetail.as_view(), name='role_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/users/$', RoleUsersList.as_view(), name='role_users_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/teams/$', RoleTeamsList.as_view(), name='role_teams_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/parents/$', RoleParentsList.as_view(), name='role_parents_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/children/$', RoleChildrenList.as_view(), name='role_children_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,16 +1,16 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import ScheduleList, ScheduleDetail, ScheduleUnifiedJobsList, ScheduleCredentialsList
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', ScheduleList.as_view(), name='schedule_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', ScheduleDetail.as_view(), name='schedule_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/jobs/$', ScheduleUnifiedJobsList.as_view(), name='schedule_unified_jobs_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/credentials/$', ScheduleCredentialsList.as_view(), name='schedule_credentials_list'),
|
||||
re_path(r'^$', ScheduleList.as_view(), name='schedule_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', ScheduleDetail.as_view(), name='schedule_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/jobs/$', ScheduleUnifiedJobsList.as_view(), name='schedule_unified_jobs_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/credentials/$', ScheduleCredentialsList.as_view(), name='schedule_credentials_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,17 +1,17 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import SystemJobList, SystemJobDetail, SystemJobCancel, SystemJobNotificationsList, SystemJobEventsList
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', SystemJobList.as_view(), name='system_job_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', SystemJobDetail.as_view(), name='system_job_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/cancel/$', SystemJobCancel.as_view(), name='system_job_cancel'),
|
||||
url(r'^(?P<pk>[0-9]+)/notifications/$', SystemJobNotificationsList.as_view(), name='system_job_notifications_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/events/$', SystemJobEventsList.as_view(), name='system_job_events_list'),
|
||||
re_path(r'^$', SystemJobList.as_view(), name='system_job_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', SystemJobDetail.as_view(), name='system_job_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/cancel/$', SystemJobCancel.as_view(), name='system_job_cancel'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/notifications/$', SystemJobNotificationsList.as_view(), name='system_job_notifications_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/events/$', SystemJobEventsList.as_view(), name='system_job_events_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
SystemJobTemplateList,
|
||||
@ -16,22 +16,22 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', SystemJobTemplateList.as_view(), name='system_job_template_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', SystemJobTemplateDetail.as_view(), name='system_job_template_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/launch/$', SystemJobTemplateLaunch.as_view(), name='system_job_template_launch'),
|
||||
url(r'^(?P<pk>[0-9]+)/jobs/$', SystemJobTemplateJobsList.as_view(), name='system_job_template_jobs_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/schedules/$', SystemJobTemplateSchedulesList.as_view(), name='system_job_template_schedules_list'),
|
||||
url(
|
||||
re_path(r'^$', SystemJobTemplateList.as_view(), name='system_job_template_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', SystemJobTemplateDetail.as_view(), name='system_job_template_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/launch/$', SystemJobTemplateLaunch.as_view(), name='system_job_template_launch'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/jobs/$', SystemJobTemplateJobsList.as_view(), name='system_job_template_jobs_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/schedules/$', SystemJobTemplateSchedulesList.as_view(), name='system_job_template_schedules_list'),
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_started/$',
|
||||
SystemJobTemplateNotificationTemplatesStartedList.as_view(),
|
||||
name='system_job_template_notification_templates_started_list',
|
||||
),
|
||||
url(
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_error/$',
|
||||
SystemJobTemplateNotificationTemplatesErrorList.as_view(),
|
||||
name='system_job_template_notification_templates_error_list',
|
||||
),
|
||||
url(
|
||||
re_path(
|
||||
r'^(?P<pk>[0-9]+)/notification_templates_success/$',
|
||||
SystemJobTemplateNotificationTemplatesSuccessList.as_view(),
|
||||
name='system_job_template_notification_templates_success_list',
|
||||
|
||||
@ -1,7 +1,7 @@
|
||||
# Copyright (c) 2017 Ansible, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.conf.urls import url
|
||||
from django.urls import re_path
|
||||
|
||||
from awx.api.views import (
|
||||
TeamList,
|
||||
@ -17,15 +17,15 @@ from awx.api.views import (
|
||||
|
||||
|
||||
urls = [
|
||||
url(r'^$', TeamList.as_view(), name='team_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/$', TeamDetail.as_view(), name='team_detail'),
|
||||
url(r'^(?P<pk>[0-9]+)/projects/$', TeamProjectsList.as_view(), name='team_projects_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/users/$', TeamUsersList.as_view(), name='team_users_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/credentials/$', TeamCredentialsList.as_view(), name='team_credentials_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/roles/$', TeamRolesList.as_view(), name='team_roles_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/object_roles/$', TeamObjectRolesList.as_view(), name='team_object_roles_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/activity_stream/$', TeamActivityStreamList.as_view(), name='team_activity_stream_list'),
|
||||
url(r'^(?P<pk>[0-9]+)/access_list/$', TeamAccessList.as_view(), name='team_access_list'),
|
||||
re_path(r'^$', TeamList.as_view(), name='team_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/$', TeamDetail.as_view(), name='team_detail'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/projects/$', TeamProjectsList.as_view(), name='team_projects_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/users/$', TeamUsersList.as_view(), name='team_users_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/credentials/$', TeamCredentialsList.as_view(), name='team_credentials_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/roles/$', TeamRolesList.as_view(), name='team_roles_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/object_roles/$', TeamObjectRolesList.as_view(), name='team_object_roles_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', TeamActivityStreamList.as_view(), name='team_activity_stream_list'),
|
||||
re_path(r'^(?P<pk>[0-9]+)/access_list/$', TeamAccessList.as_view(), name='team_access_list'),
|
||||
]
|
||||
|
||||
__all__ = ['urls']
|
||||
|
||||
@ -3,7 +3,7 @@
|
||||
|
||||
from __future__ import absolute_import, unicode_literals
|
||||
from django.conf import settings
|
||||
from django.conf.urls import include, url
|
||||
from django.urls import include, re_path
|
||||
|
||||
from awx.api.generics import LoggedLoginView, LoggedLogoutView
|
||||
from awx.api.views import (
|
||||
@ -28,6 +28,7 @@ from awx.api.views import (
|
||||
OAuth2TokenList,
|
||||
ApplicationOAuth2TokenList,
|
||||
OAuth2ApplicationDetail,
|
||||
MeshVisualizer,
|
||||
)
|
||||
|
||||
from awx.api.views.metrics import MetricsView
|
||||
@ -73,77 +74,78 @@ from .workflow_approval import urls as workflow_approval_urls
|
||||
|
||||
|
||||
v2_urls = [
|
||||
url(r'^$', ApiV2RootView.as_view(), name='api_v2_root_view'),
|
||||
url(r'^credential_types/', include(credential_type_urls)),
|
||||
url(r'^credential_input_sources/', include(credential_input_source_urls)),
|
||||
url(r'^hosts/(?P<pk>[0-9]+)/ansible_facts/$', HostAnsibleFactsDetail.as_view(), name='host_ansible_facts_detail'),
|
||||
url(r'^jobs/(?P<pk>[0-9]+)/credentials/$', JobCredentialsList.as_view(), name='job_credentials_list'),
|
||||
url(r'^job_templates/(?P<pk>[0-9]+)/credentials/$', JobTemplateCredentialsList.as_view(), name='job_template_credentials_list'),
|
||||
url(r'^schedules/preview/$', SchedulePreview.as_view(), name='schedule_rrule'),
|
||||
url(r'^schedules/zoneinfo/$', ScheduleZoneInfo.as_view(), name='schedule_zoneinfo'),
|
||||
url(r'^applications/$', OAuth2ApplicationList.as_view(), name='o_auth2_application_list'),
url(r'^applications/(?P<pk>[0-9]+)/$', OAuth2ApplicationDetail.as_view(), name='o_auth2_application_detail'),
url(r'^applications/(?P<pk>[0-9]+)/tokens/$', ApplicationOAuth2TokenList.as_view(), name='application_o_auth2_token_list'),
url(r'^tokens/$', OAuth2TokenList.as_view(), name='o_auth2_token_list'),
url(r'^', include(oauth2_urls)),
url(r'^metrics/$', MetricsView.as_view(), name='metrics_view'),
url(r'^ping/$', ApiV2PingView.as_view(), name='api_v2_ping_view'),
url(r'^config/$', ApiV2ConfigView.as_view(), name='api_v2_config_view'),
url(r'^config/subscriptions/$', ApiV2SubscriptionView.as_view(), name='api_v2_subscription_view'),
url(r'^config/attach/$', ApiV2AttachView.as_view(), name='api_v2_attach_view'),
url(r'^auth/$', AuthView.as_view()),
url(r'^me/$', UserMeList.as_view(), name='user_me_list'),
url(r'^dashboard/$', DashboardView.as_view(), name='dashboard_view'),
url(r'^dashboard/graphs/jobs/$', DashboardJobsGraphView.as_view(), name='dashboard_jobs_graph_view'),
url(r'^settings/', include('awx.conf.urls')),
url(r'^instances/', include(instance_urls)),
url(r'^instance_groups/', include(instance_group_urls)),
url(r'^schedules/', include(schedule_urls)),
url(r'^organizations/', include(organization_urls)),
url(r'^users/', include(user_urls)),
url(r'^execution_environments/', include(execution_environment_urls)),
url(r'^projects/', include(project_urls)),
url(r'^project_updates/', include(project_update_urls)),
url(r'^teams/', include(team_urls)),
url(r'^inventories/', include(inventory_urls)),
url(r'^hosts/', include(host_urls)),
url(r'^groups/', include(group_urls)),
url(r'^inventory_sources/', include(inventory_source_urls)),
url(r'^inventory_updates/', include(inventory_update_urls)),
url(r'^credentials/', include(credential_urls)),
url(r'^roles/', include(role_urls)),
url(r'^job_templates/', include(job_template_urls)),
url(r'^jobs/', include(job_urls)),
url(r'^job_host_summaries/', include(job_host_summary_urls)),
url(r'^job_events/', include(job_event_urls)),
url(r'^ad_hoc_commands/', include(ad_hoc_command_urls)),
url(r'^ad_hoc_command_events/', include(ad_hoc_command_event_urls)),
url(r'^system_job_templates/', include(system_job_template_urls)),
url(r'^system_jobs/', include(system_job_urls)),
url(r'^notification_templates/', include(notification_template_urls)),
url(r'^notifications/', include(notification_urls)),
url(r'^workflow_job_templates/', include(workflow_job_template_urls)),
url(r'^workflow_jobs/', include(workflow_job_urls)),
url(r'^labels/', include(label_urls)),
url(r'^workflow_job_template_nodes/', include(workflow_job_template_node_urls)),
url(r'^workflow_job_nodes/', include(workflow_job_node_urls)),
url(r'^unified_job_templates/$', UnifiedJobTemplateList.as_view(), name='unified_job_template_list'),
url(r'^unified_jobs/$', UnifiedJobList.as_view(), name='unified_job_list'),
url(r'^activity_stream/', include(activity_stream_urls)),
url(r'^workflow_approval_templates/', include(workflow_approval_template_urls)),
url(r'^workflow_approvals/', include(workflow_approval_urls)),
re_path(r'^$', ApiV2RootView.as_view(), name='api_v2_root_view'),
re_path(r'^credential_types/', include(credential_type_urls)),
re_path(r'^credential_input_sources/', include(credential_input_source_urls)),
re_path(r'^hosts/(?P<pk>[0-9]+)/ansible_facts/$', HostAnsibleFactsDetail.as_view(), name='host_ansible_facts_detail'),
re_path(r'^jobs/(?P<pk>[0-9]+)/credentials/$', JobCredentialsList.as_view(), name='job_credentials_list'),
re_path(r'^job_templates/(?P<pk>[0-9]+)/credentials/$', JobTemplateCredentialsList.as_view(), name='job_template_credentials_list'),
re_path(r'^schedules/preview/$', SchedulePreview.as_view(), name='schedule_rrule'),
re_path(r'^schedules/zoneinfo/$', ScheduleZoneInfo.as_view(), name='schedule_zoneinfo'),
re_path(r'^applications/$', OAuth2ApplicationList.as_view(), name='o_auth2_application_list'),
re_path(r'^applications/(?P<pk>[0-9]+)/$', OAuth2ApplicationDetail.as_view(), name='o_auth2_application_detail'),
re_path(r'^applications/(?P<pk>[0-9]+)/tokens/$', ApplicationOAuth2TokenList.as_view(), name='application_o_auth2_token_list'),
re_path(r'^tokens/$', OAuth2TokenList.as_view(), name='o_auth2_token_list'),
re_path(r'^', include(oauth2_urls)),
re_path(r'^metrics/$', MetricsView.as_view(), name='metrics_view'),
re_path(r'^ping/$', ApiV2PingView.as_view(), name='api_v2_ping_view'),
re_path(r'^config/$', ApiV2ConfigView.as_view(), name='api_v2_config_view'),
re_path(r'^config/subscriptions/$', ApiV2SubscriptionView.as_view(), name='api_v2_subscription_view'),
re_path(r'^config/attach/$', ApiV2AttachView.as_view(), name='api_v2_attach_view'),
re_path(r'^auth/$', AuthView.as_view()),
re_path(r'^me/$', UserMeList.as_view(), name='user_me_list'),
re_path(r'^dashboard/$', DashboardView.as_view(), name='dashboard_view'),
re_path(r'^dashboard/graphs/jobs/$', DashboardJobsGraphView.as_view(), name='dashboard_jobs_graph_view'),
re_path(r'^mesh_visualizer/', MeshVisualizer.as_view(), name='mesh_visualizer_view'),
re_path(r'^settings/', include('awx.conf.urls')),
re_path(r'^instances/', include(instance_urls)),
re_path(r'^instance_groups/', include(instance_group_urls)),
re_path(r'^schedules/', include(schedule_urls)),
re_path(r'^organizations/', include(organization_urls)),
re_path(r'^users/', include(user_urls)),
re_path(r'^execution_environments/', include(execution_environment_urls)),
re_path(r'^projects/', include(project_urls)),
re_path(r'^project_updates/', include(project_update_urls)),
re_path(r'^teams/', include(team_urls)),
re_path(r'^inventories/', include(inventory_urls)),
re_path(r'^hosts/', include(host_urls)),
re_path(r'^groups/', include(group_urls)),
re_path(r'^inventory_sources/', include(inventory_source_urls)),
re_path(r'^inventory_updates/', include(inventory_update_urls)),
re_path(r'^credentials/', include(credential_urls)),
re_path(r'^roles/', include(role_urls)),
re_path(r'^job_templates/', include(job_template_urls)),
re_path(r'^jobs/', include(job_urls)),
re_path(r'^job_host_summaries/', include(job_host_summary_urls)),
re_path(r'^job_events/', include(job_event_urls)),
re_path(r'^ad_hoc_commands/', include(ad_hoc_command_urls)),
re_path(r'^ad_hoc_command_events/', include(ad_hoc_command_event_urls)),
re_path(r'^system_job_templates/', include(system_job_template_urls)),
re_path(r'^system_jobs/', include(system_job_urls)),
re_path(r'^notification_templates/', include(notification_template_urls)),
re_path(r'^notifications/', include(notification_urls)),
re_path(r'^workflow_job_templates/', include(workflow_job_template_urls)),
re_path(r'^workflow_jobs/', include(workflow_job_urls)),
re_path(r'^labels/', include(label_urls)),
re_path(r'^workflow_job_template_nodes/', include(workflow_job_template_node_urls)),
re_path(r'^workflow_job_nodes/', include(workflow_job_node_urls)),
re_path(r'^unified_job_templates/$', UnifiedJobTemplateList.as_view(), name='unified_job_template_list'),
re_path(r'^unified_jobs/$', UnifiedJobList.as_view(), name='unified_job_list'),
re_path(r'^activity_stream/', include(activity_stream_urls)),
re_path(r'^workflow_approval_templates/', include(workflow_approval_template_urls)),
re_path(r'^workflow_approvals/', include(workflow_approval_urls)),
]

app_name = 'api'

urlpatterns = [
url(r'^$', ApiRootView.as_view(), name='api_root_view'),
url(r'^(?P<version>(v2))/', include(v2_urls)),
url(r'^login/$', LoggedLoginView.as_view(template_name='rest_framework/login.html', extra_context={'inside_login_context': True}), name='login'),
url(r'^logout/$', LoggedLogoutView.as_view(next_page='/api/', redirect_field_name='next'), name='logout'),
url(r'^o/', include(oauth2_root_urls)),
re_path(r'^$', ApiRootView.as_view(), name='api_root_view'),
re_path(r'^(?P<version>(v2))/', include(v2_urls)),
re_path(r'^login/$', LoggedLoginView.as_view(template_name='rest_framework/login.html', extra_context={'inside_login_context': True}), name='login'),
re_path(r'^logout/$', LoggedLogoutView.as_view(next_page='/api/', redirect_field_name='next'), name='logout'),
re_path(r'^o/', include(oauth2_root_urls)),
]

if settings.SETTINGS_MODULE == 'awx.settings.development':
from awx.api.swagger import SwaggerSchemaView

urlpatterns += [url(r'^swagger/$', SwaggerSchemaView.as_view(), name='swagger_view')]
urlpatterns += [re_path(r'^swagger/$', SwaggerSchemaView.as_view(), name='swagger_view')]
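The change repeated throughout this commit is mechanical: Django's deprecated django.conf.urls.url() alias is replaced by django.urls.re_path(), which accepts the same regex, view, and name arguments (the alias was removed in Django 4.0). A minimal sketch of the same migration outside AWX follows; PingView is a hypothetical placeholder view, not an AWX class.

# Hedged sketch of the url() -> re_path() migration; path() is shown as the
# converter-based alternative for simple patterns.
from django.http import JsonResponse
from django.urls import path, re_path
from django.views import View


class PingView(View):
    def get(self, request, **kwargs):
        return JsonResponse({'ping': 'pong'})


urlpatterns = [
    # before: url(r'^ping/$', PingView.as_view(), name='ping'),
    re_path(r'^ping/$', PingView.as_view(), name='ping'),
    path('ping/<int:pk>/', PingView.as_view(), name='ping_detail'),
]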
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
UserList,
@ -21,20 +21,20 @@ from awx.api.views import (
)

urls = [
url(r'^$', UserList.as_view(), name='user_list'),
url(r'^(?P<pk>[0-9]+)/$', UserDetail.as_view(), name='user_detail'),
url(r'^(?P<pk>[0-9]+)/teams/$', UserTeamsList.as_view(), name='user_teams_list'),
url(r'^(?P<pk>[0-9]+)/organizations/$', UserOrganizationsList.as_view(), name='user_organizations_list'),
url(r'^(?P<pk>[0-9]+)/admin_of_organizations/$', UserAdminOfOrganizationsList.as_view(), name='user_admin_of_organizations_list'),
url(r'^(?P<pk>[0-9]+)/projects/$', UserProjectsList.as_view(), name='user_projects_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', UserCredentialsList.as_view(), name='user_credentials_list'),
url(r'^(?P<pk>[0-9]+)/roles/$', UserRolesList.as_view(), name='user_roles_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', UserActivityStreamList.as_view(), name='user_activity_stream_list'),
url(r'^(?P<pk>[0-9]+)/access_list/$', UserAccessList.as_view(), name='user_access_list'),
url(r'^(?P<pk>[0-9]+)/applications/$', OAuth2ApplicationList.as_view(), name='o_auth2_application_list'),
url(r'^(?P<pk>[0-9]+)/tokens/$', OAuth2UserTokenList.as_view(), name='o_auth2_token_list'),
url(r'^(?P<pk>[0-9]+)/authorized_tokens/$', UserAuthorizedTokenList.as_view(), name='user_authorized_token_list'),
url(r'^(?P<pk>[0-9]+)/personal_tokens/$', UserPersonalTokenList.as_view(), name='user_personal_token_list'),
re_path(r'^$', UserList.as_view(), name='user_list'),
re_path(r'^(?P<pk>[0-9]+)/$', UserDetail.as_view(), name='user_detail'),
re_path(r'^(?P<pk>[0-9]+)/teams/$', UserTeamsList.as_view(), name='user_teams_list'),
re_path(r'^(?P<pk>[0-9]+)/organizations/$', UserOrganizationsList.as_view(), name='user_organizations_list'),
re_path(r'^(?P<pk>[0-9]+)/admin_of_organizations/$', UserAdminOfOrganizationsList.as_view(), name='user_admin_of_organizations_list'),
re_path(r'^(?P<pk>[0-9]+)/projects/$', UserProjectsList.as_view(), name='user_projects_list'),
re_path(r'^(?P<pk>[0-9]+)/credentials/$', UserCredentialsList.as_view(), name='user_credentials_list'),
re_path(r'^(?P<pk>[0-9]+)/roles/$', UserRolesList.as_view(), name='user_roles_list'),
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', UserActivityStreamList.as_view(), name='user_activity_stream_list'),
re_path(r'^(?P<pk>[0-9]+)/access_list/$', UserAccessList.as_view(), name='user_access_list'),
re_path(r'^(?P<pk>[0-9]+)/applications/$', OAuth2ApplicationList.as_view(), name='o_auth2_application_list'),
re_path(r'^(?P<pk>[0-9]+)/tokens/$', OAuth2UserTokenList.as_view(), name='o_auth2_token_list'),
re_path(r'^(?P<pk>[0-9]+)/authorized_tokens/$', UserAuthorizedTokenList.as_view(), name='user_authorized_token_list'),
re_path(r'^(?P<pk>[0-9]+)/personal_tokens/$', UserPersonalTokenList.as_view(), name='user_personal_token_list'),
]

__all__ = ['urls']
@ -1,10 +1,10 @@
from django.conf.urls import url
from django.urls import re_path

from awx.api.views import WebhookKeyView, GithubWebhookReceiver, GitlabWebhookReceiver

urlpatterns = [
url(r'^webhook_key/$', WebhookKeyView.as_view(), name='webhook_key'),
url(r'^github/$', GithubWebhookReceiver.as_view(), name='webhook_receiver_github'),
url(r'^gitlab/$', GitlabWebhookReceiver.as_view(), name='webhook_receiver_gitlab'),
re_path(r'^webhook_key/$', WebhookKeyView.as_view(), name='webhook_key'),
re_path(r'^github/$', GithubWebhookReceiver.as_view(), name='webhook_receiver_github'),
re_path(r'^gitlab/$', GitlabWebhookReceiver.as_view(), name='webhook_receiver_gitlab'),
]
@ -1,16 +1,16 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import WorkflowApprovalList, WorkflowApprovalDetail, WorkflowApprovalApprove, WorkflowApprovalDeny

urls = [
url(r'^$', WorkflowApprovalList.as_view(), name='workflow_approval_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowApprovalDetail.as_view(), name='workflow_approval_detail'),
url(r'^(?P<pk>[0-9]+)/approve/$', WorkflowApprovalApprove.as_view(), name='workflow_approval_approve'),
url(r'^(?P<pk>[0-9]+)/deny/$', WorkflowApprovalDeny.as_view(), name='workflow_approval_deny'),
re_path(r'^$', WorkflowApprovalList.as_view(), name='workflow_approval_list'),
re_path(r'^(?P<pk>[0-9]+)/$', WorkflowApprovalDetail.as_view(), name='workflow_approval_detail'),
re_path(r'^(?P<pk>[0-9]+)/approve/$', WorkflowApprovalApprove.as_view(), name='workflow_approval_approve'),
re_path(r'^(?P<pk>[0-9]+)/deny/$', WorkflowApprovalDeny.as_view(), name='workflow_approval_deny'),
]

__all__ = ['urls']
@ -1,14 +1,14 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import WorkflowApprovalTemplateDetail, WorkflowApprovalTemplateJobsList

urls = [
url(r'^(?P<pk>[0-9]+)/$', WorkflowApprovalTemplateDetail.as_view(), name='workflow_approval_template_detail'),
url(r'^(?P<pk>[0-9]+)/approvals/$', WorkflowApprovalTemplateJobsList.as_view(), name='workflow_approval_template_jobs_list'),
re_path(r'^(?P<pk>[0-9]+)/$', WorkflowApprovalTemplateDetail.as_view(), name='workflow_approval_template_detail'),
re_path(r'^(?P<pk>[0-9]+)/approvals/$', WorkflowApprovalTemplateJobsList.as_view(), name='workflow_approval_template_jobs_list'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
WorkflowJobList,
@ -16,14 +16,14 @@ from awx.api.views import (

urls = [
url(r'^$', WorkflowJobList.as_view(), name='workflow_job_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobDetail.as_view(), name='workflow_job_detail'),
url(r'^(?P<pk>[0-9]+)/workflow_nodes/$', WorkflowJobWorkflowNodesList.as_view(), name='workflow_job_workflow_nodes_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', WorkflowJobLabelList.as_view(), name='workflow_job_label_list'),
url(r'^(?P<pk>[0-9]+)/cancel/$', WorkflowJobCancel.as_view(), name='workflow_job_cancel'),
url(r'^(?P<pk>[0-9]+)/relaunch/$', WorkflowJobRelaunch.as_view(), name='workflow_job_relaunch'),
url(r'^(?P<pk>[0-9]+)/notifications/$', WorkflowJobNotificationsList.as_view(), name='workflow_job_notifications_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', WorkflowJobActivityStreamList.as_view(), name='workflow_job_activity_stream_list'),
re_path(r'^$', WorkflowJobList.as_view(), name='workflow_job_list'),
re_path(r'^(?P<pk>[0-9]+)/$', WorkflowJobDetail.as_view(), name='workflow_job_detail'),
re_path(r'^(?P<pk>[0-9]+)/workflow_nodes/$', WorkflowJobWorkflowNodesList.as_view(), name='workflow_job_workflow_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/labels/$', WorkflowJobLabelList.as_view(), name='workflow_job_label_list'),
re_path(r'^(?P<pk>[0-9]+)/cancel/$', WorkflowJobCancel.as_view(), name='workflow_job_cancel'),
re_path(r'^(?P<pk>[0-9]+)/relaunch/$', WorkflowJobRelaunch.as_view(), name='workflow_job_relaunch'),
re_path(r'^(?P<pk>[0-9]+)/notifications/$', WorkflowJobNotificationsList.as_view(), name='workflow_job_notifications_list'),
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', WorkflowJobActivityStreamList.as_view(), name='workflow_job_activity_stream_list'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
WorkflowJobNodeList,
@ -14,12 +14,12 @@ from awx.api.views import (

urls = [
url(r'^$', WorkflowJobNodeList.as_view(), name='workflow_job_node_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobNodeDetail.as_view(), name='workflow_job_node_detail'),
url(r'^(?P<pk>[0-9]+)/success_nodes/$', WorkflowJobNodeSuccessNodesList.as_view(), name='workflow_job_node_success_nodes_list'),
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobNodeFailureNodesList.as_view(), name='workflow_job_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobNodeAlwaysNodesList.as_view(), name='workflow_job_node_always_nodes_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobNodeCredentialsList.as_view(), name='workflow_job_node_credentials_list'),
re_path(r'^$', WorkflowJobNodeList.as_view(), name='workflow_job_node_list'),
re_path(r'^(?P<pk>[0-9]+)/$', WorkflowJobNodeDetail.as_view(), name='workflow_job_node_detail'),
re_path(r'^(?P<pk>[0-9]+)/success_nodes/$', WorkflowJobNodeSuccessNodesList.as_view(), name='workflow_job_node_success_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobNodeFailureNodesList.as_view(), name='workflow_job_node_failure_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobNodeAlwaysNodesList.as_view(), name='workflow_job_node_always_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobNodeCredentialsList.as_view(), name='workflow_job_node_credentials_list'),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import include, url
from django.urls import include, re_path

from awx.api.views import (
WorkflowJobTemplateList,
@ -24,39 +24,39 @@ from awx.api.views import (

urls = [
url(r'^$', WorkflowJobTemplateList.as_view(), name='workflow_job_template_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobTemplateDetail.as_view(), name='workflow_job_template_detail'),
url(r'^(?P<pk>[0-9]+)/workflow_jobs/$', WorkflowJobTemplateJobsList.as_view(), name='workflow_job_template_jobs_list'),
url(r'^(?P<pk>[0-9]+)/launch/$', WorkflowJobTemplateLaunch.as_view(), name='workflow_job_template_launch'),
url(r'^(?P<pk>[0-9]+)/copy/$', WorkflowJobTemplateCopy.as_view(), name='workflow_job_template_copy'),
url(r'^(?P<pk>[0-9]+)/schedules/$', WorkflowJobTemplateSchedulesList.as_view(), name='workflow_job_template_schedules_list'),
url(r'^(?P<pk>[0-9]+)/survey_spec/$', WorkflowJobTemplateSurveySpec.as_view(), name='workflow_job_template_survey_spec'),
url(r'^(?P<pk>[0-9]+)/workflow_nodes/$', WorkflowJobTemplateWorkflowNodesList.as_view(), name='workflow_job_template_workflow_nodes_list'),
url(r'^(?P<pk>[0-9]+)/activity_stream/$', WorkflowJobTemplateActivityStreamList.as_view(), name='workflow_job_template_activity_stream_list'),
url(
re_path(r'^$', WorkflowJobTemplateList.as_view(), name='workflow_job_template_list'),
re_path(r'^(?P<pk>[0-9]+)/$', WorkflowJobTemplateDetail.as_view(), name='workflow_job_template_detail'),
re_path(r'^(?P<pk>[0-9]+)/workflow_jobs/$', WorkflowJobTemplateJobsList.as_view(), name='workflow_job_template_jobs_list'),
re_path(r'^(?P<pk>[0-9]+)/launch/$', WorkflowJobTemplateLaunch.as_view(), name='workflow_job_template_launch'),
re_path(r'^(?P<pk>[0-9]+)/copy/$', WorkflowJobTemplateCopy.as_view(), name='workflow_job_template_copy'),
re_path(r'^(?P<pk>[0-9]+)/schedules/$', WorkflowJobTemplateSchedulesList.as_view(), name='workflow_job_template_schedules_list'),
re_path(r'^(?P<pk>[0-9]+)/survey_spec/$', WorkflowJobTemplateSurveySpec.as_view(), name='workflow_job_template_survey_spec'),
re_path(r'^(?P<pk>[0-9]+)/workflow_nodes/$', WorkflowJobTemplateWorkflowNodesList.as_view(), name='workflow_job_template_workflow_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/activity_stream/$', WorkflowJobTemplateActivityStreamList.as_view(), name='workflow_job_template_activity_stream_list'),
re_path(
r'^(?P<pk>[0-9]+)/notification_templates_started/$',
WorkflowJobTemplateNotificationTemplatesStartedList.as_view(),
name='workflow_job_template_notification_templates_started_list',
),
url(
re_path(
r'^(?P<pk>[0-9]+)/notification_templates_error/$',
WorkflowJobTemplateNotificationTemplatesErrorList.as_view(),
name='workflow_job_template_notification_templates_error_list',
),
url(
re_path(
r'^(?P<pk>[0-9]+)/notification_templates_success/$',
WorkflowJobTemplateNotificationTemplatesSuccessList.as_view(),
name='workflow_job_template_notification_templates_success_list',
),
url(
re_path(
r'^(?P<pk>[0-9]+)/notification_templates_approvals/$',
WorkflowJobTemplateNotificationTemplatesApprovalList.as_view(),
name='workflow_job_template_notification_templates_approvals_list',
),
url(r'^(?P<pk>[0-9]+)/access_list/$', WorkflowJobTemplateAccessList.as_view(), name='workflow_job_template_access_list'),
url(r'^(?P<pk>[0-9]+)/object_roles/$', WorkflowJobTemplateObjectRolesList.as_view(), name='workflow_job_template_object_roles_list'),
url(r'^(?P<pk>[0-9]+)/labels/$', WorkflowJobTemplateLabelList.as_view(), name='workflow_job_template_label_list'),
url(r'^(?P<pk>[0-9]+)/', include('awx.api.urls.webhooks'), {'model_kwarg': 'workflow_job_templates'}),
re_path(r'^(?P<pk>[0-9]+)/access_list/$', WorkflowJobTemplateAccessList.as_view(), name='workflow_job_template_access_list'),
re_path(r'^(?P<pk>[0-9]+)/object_roles/$', WorkflowJobTemplateObjectRolesList.as_view(), name='workflow_job_template_object_roles_list'),
re_path(r'^(?P<pk>[0-9]+)/labels/$', WorkflowJobTemplateLabelList.as_view(), name='workflow_job_template_label_list'),
re_path(r'^(?P<pk>[0-9]+)/', include('awx.api.urls.webhooks'), {'model_kwarg': 'workflow_job_templates'}),
]

__all__ = ['urls']
@ -1,7 +1,7 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.

from django.conf.urls import url
from django.urls import re_path

from awx.api.views import (
WorkflowJobTemplateNodeList,
@ -15,13 +15,13 @@ from awx.api.views import (

urls = [
url(r'^$', WorkflowJobTemplateNodeList.as_view(), name='workflow_job_template_node_list'),
url(r'^(?P<pk>[0-9]+)/$', WorkflowJobTemplateNodeDetail.as_view(), name='workflow_job_template_node_detail'),
url(r'^(?P<pk>[0-9]+)/success_nodes/$', WorkflowJobTemplateNodeSuccessNodesList.as_view(), name='workflow_job_template_node_success_nodes_list'),
url(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobTemplateNodeFailureNodesList.as_view(), name='workflow_job_template_node_failure_nodes_list'),
url(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobTemplateNodeAlwaysNodesList.as_view(), name='workflow_job_template_node_always_nodes_list'),
url(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobTemplateNodeCredentialsList.as_view(), name='workflow_job_template_node_credentials_list'),
url(r'^(?P<pk>[0-9]+)/create_approval_template/$', WorkflowJobTemplateNodeCreateApproval.as_view(), name='workflow_job_template_node_create_approval'),
re_path(r'^$', WorkflowJobTemplateNodeList.as_view(), name='workflow_job_template_node_list'),
re_path(r'^(?P<pk>[0-9]+)/$', WorkflowJobTemplateNodeDetail.as_view(), name='workflow_job_template_node_detail'),
re_path(r'^(?P<pk>[0-9]+)/success_nodes/$', WorkflowJobTemplateNodeSuccessNodesList.as_view(), name='workflow_job_template_node_success_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/failure_nodes/$', WorkflowJobTemplateNodeFailureNodesList.as_view(), name='workflow_job_template_node_failure_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/always_nodes/$', WorkflowJobTemplateNodeAlwaysNodesList.as_view(), name='workflow_job_template_node_always_nodes_list'),
re_path(r'^(?P<pk>[0-9]+)/credentials/$', WorkflowJobTemplateNodeCredentialsList.as_view(), name='workflow_job_template_node_credentials_list'),
re_path(r'^(?P<pk>[0-9]+)/create_approval_template/$', WorkflowJobTemplateNodeCreateApproval.as_view(), name='workflow_job_template_node_create_approval'),
]

__all__ = ['urls']
@ -29,7 +29,7 @@ from django.views.decorators.csrf import csrf_exempt
from django.template.loader import render_to_string
from django.http import HttpResponse
from django.contrib.contenttypes.models import ContentType
from django.utils.translation import ugettext_lazy as _
from django.utils.translation import gettext_lazy as _

# Django REST Framework
@ -62,7 +62,7 @@ import pytz
from wsgiref.util import FileWrapper

# AWX
from awx.main.tasks import send_notifications, update_inventory_computed_fields
from awx.main.tasks.system import send_notifications, update_inventory_computed_fields
from awx.main.access import get_user_queryset, HostAccess
from awx.api.generics import (
APIView,
@ -105,7 +105,6 @@ from awx.api.permissions import (
ProjectUpdatePermission,
InventoryInventorySourcesUpdatePermission,
UserPermission,
InstanceGroupTowerPermission,
VariableDataPermission,
WorkflowApprovalPermission,
IsSystemAdminOrAuditor,
@ -113,7 +112,7 @@ from awx.api.permissions import (
from awx.api import renderers
from awx.api import serializers
from awx.api.metadata import RoleMetadata
from awx.main.constants import ACTIVE_STATES
from awx.main.constants import ACTIVE_STATES, SURVEY_TYPE_MAPPING
from awx.main.scheduler.dag_workflow import WorkflowDAG
from awx.api.views.mixin import (
ControlledByScmMixin,
@ -157,8 +156,10 @@ from awx.api.views.inventory import ( # noqa
InventoryAccessList,
InventoryObjectRolesList,
InventoryJobTemplateList,
InventoryLabelList,
InventoryCopy,
)
from awx.api.views.mesh_visualizer import MeshVisualizer # noqa
from awx.api.views.root import ( # noqa
ApiRootView,
ApiOAuthAuthorizationRootView,
@ -171,6 +172,7 @@ from awx.api.views.root import ( # noqa
)
from awx.api.views.webhooks import WebhookKeyView, GithubWebhookReceiver, GitlabWebhookReceiver # noqa
from awx.api.pagination import UnifiedJobEventPagination
from awx.main.utils import set_environ

logger = logging.getLogger('awx.api.views')
@ -363,6 +365,7 @@ class InstanceList(ListAPIView):
model = models.Instance
serializer_class = serializers.InstanceSerializer
search_fields = ('hostname',)
ordering = ('id',)

class InstanceDetail(RetrieveUpdateAPIView):
@ -406,6 +409,16 @@ class InstanceInstanceGroupsList(InstanceGroupMembershipMixin, SubListCreateAtta
def is_valid_relation(self, parent, sub, created=False):
if parent.node_type == 'control':
return {'msg': _(f"Cannot change instance group membership of control-only node: {parent.hostname}.")}
if parent.node_type == 'hop':
return {'msg': _(f"Cannot change instance group membership of hop node : {parent.hostname}.")}
return None

def is_valid_removal(self, parent, sub):
res = self.is_valid_relation(parent, sub)
if res:
return res
if sub.name == settings.DEFAULT_CONTROL_PLANE_QUEUE_NAME and parent.node_type == 'hybrid':
return {'msg': _(f"Cannot disassociate hybrid instance {parent.hostname} from {sub.name}.")}
return None

@ -416,6 +429,10 @@ class InstanceHealthCheck(GenericAPIView):
serializer_class = serializers.InstanceHealthCheckSerializer
permission_classes = (IsSystemAdminOrAuditor,)

def get_queryset(self):
# FIXME: For now, we don't have a good way of checking the health of a hop node.
return super().get_queryset().exclude(node_type='hop')

def get(self, request, *args, **kwargs):
obj = self.get_object()
data = self.get_serializer(data=request.data).to_representation(obj)
@ -425,7 +442,7 @@ class InstanceHealthCheck(GenericAPIView):
obj = self.get_object()

if obj.node_type == 'execution':
from awx.main.tasks import execution_node_health_check
from awx.main.tasks.system import execution_node_health_check

runner_data = execution_node_health_check(obj.hostname)
obj.refresh_from_db()
@ -435,7 +452,7 @@ class InstanceHealthCheck(GenericAPIView):
if extra_field in runner_data:
data[extra_field] = runner_data[extra_field]
else:
from awx.main.tasks import cluster_node_health_check
from awx.main.tasks.system import cluster_node_health_check

if settings.CLUSTER_HOST_ID == obj.hostname:
cluster_node_health_check(obj.hostname)
@ -472,7 +489,6 @@ class InstanceGroupDetail(RelatedJobsPreventDeleteMixin, RetrieveUpdateDestroyAP
name = _("Instance Group Detail")
model = models.InstanceGroup
serializer_class = serializers.InstanceGroupSerializer
permission_classes = (InstanceGroupTowerPermission,)

def update_raw_data(self, data):
if self.get_object().is_container_group:
@ -503,6 +519,16 @@ class InstanceGroupInstanceList(InstanceGroupMembershipMixin, SubListAttachDetac
def is_valid_relation(self, parent, sub, created=False):
if sub.node_type == 'control':
return {'msg': _(f"Cannot change instance group membership of control-only node: {sub.hostname}.")}
if sub.node_type == 'hop':
return {'msg': _(f"Cannot change instance group membership of hop node : {sub.hostname}.")}
return None

def is_valid_removal(self, parent, sub):
res = self.is_valid_relation(parent, sub)
if res:
return res
if sub.node_type == 'hybrid' and parent.name == settings.DEFAULT_CONTROL_PLANE_QUEUE_NAME:
return {'msg': _(f"Cannot disassociate hybrid node {sub.hostname} from {parent.name}.")}
return None

@ -511,6 +537,7 @@ class ScheduleList(ListCreateAPIView):
name = _("Schedules")
model = models.Schedule
serializer_class = serializers.ScheduleSerializer
ordering = ('id',)

class ScheduleDetail(RetrieveUpdateDestroyAPIView):
@ -1547,8 +1574,9 @@ class CredentialExternalTest(SubDetailAPIView):
backend_kwargs[field_name] = value
backend_kwargs.update(request.data.get('metadata', {}))
try:
obj.credential_type.plugin.backend(**backend_kwargs)
return Response({}, status=status.HTTP_202_ACCEPTED)
with set_environ(**settings.AWX_TASK_ENV):
obj.credential_type.plugin.backend(**backend_kwargs)
return Response({}, status=status.HTTP_202_ACCEPTED)
except requests.exceptions.HTTPError as exc:
message = 'HTTP {}'.format(exc.response.status_code)
return Response({'inputs': message}, status=status.HTTP_400_BAD_REQUEST)
@ -2458,8 +2486,6 @@ class JobTemplateSurveySpec(GenericAPIView):
obj_permission_type = 'admin'
serializer_class = serializers.EmptySerializer

ALLOWED_TYPES = {'text': str, 'textarea': str, 'password': str, 'multiplechoice': str, 'multiselect': str, 'integer': int, 'float': float}

def get(self, request, *args, **kwargs):
obj = self.get_object()
return Response(obj.display_survey_spec())
@ -2530,17 +2556,17 @@ class JobTemplateSurveySpec(GenericAPIView):
# Type-specific validation
# validate question type <-> default type
qtype = survey_item["type"]
if qtype not in JobTemplateSurveySpec.ALLOWED_TYPES:
if qtype not in SURVEY_TYPE_MAPPING:
return Response(
dict(
error=_("'{survey_item[type]}' in survey question {idx} is not one of '{allowed_types}' allowed question types.").format(
allowed_types=', '.join(JobTemplateSurveySpec.ALLOWED_TYPES.keys()), **context
allowed_types=', '.join(SURVEY_TYPE_MAPPING.keys()), **context
)
),
status=status.HTTP_400_BAD_REQUEST,
)
if 'default' in survey_item and survey_item['default'] != '':
if not isinstance(survey_item['default'], JobTemplateSurveySpec.ALLOWED_TYPES[qtype]):
if not isinstance(survey_item['default'], SURVEY_TYPE_MAPPING[qtype]):
type_label = 'string'
if qtype in ['integer', 'float']:
type_label = qtype
@ -3818,6 +3844,84 @@ class JobJobEventsList(BaseJobEventsList):
return job.get_event_queryset().select_related('host').order_by('start_line')

class JobJobEventsChildrenSummary(APIView):

renderer_classes = [JSONRenderer]
meta_events = ('debug', 'verbose', 'warning', 'error', 'system_warning', 'deprecated')

def get(self, request, **kwargs):
resp = dict(children_summary={}, meta_event_nested_uuid={}, event_processing_finished=False)
job = get_object_or_404(models.Job, pk=kwargs['pk'])
if not job.event_processing_finished:
return Response(resp)
else:
resp["event_processing_finished"] = True

events = list(job.get_event_queryset().values('counter', 'uuid', 'parent_uuid', 'event').order_by('counter'))
if len(events) == 0:
return Response(resp)

# key is counter, value is number of total children (including children of children, etc.)
map_counter_children_tally = {i['counter']: {"rowNumber": 0, "numChildren": 0} for i in events}
# key is uuid, value is counter
map_uuid_counter = {i['uuid']: i['counter'] for i in events}
# key is uuid, value is parent uuid. Used as a quick lookup
map_uuid_puuid = {i['uuid']: i['parent_uuid'] for i in events}
# key is counter of meta events (i.e. verbose), value is uuid of the assigned parent
map_meta_counter_nested_uuid = {}

prev_non_meta_event = events[0]
for i, e in enumerate(events):
if not e['event'] in JobJobEventsChildrenSummary.meta_events:
prev_non_meta_event = e
if not e['uuid']:
continue
puuid = e['parent_uuid']

# if event is verbose (or debug, etc), we need to "assign" it a
# parent. This code looks at the event level of the previous
# non-verbose event, and the level of the next (by looking ahead)
# non-verbose event. The verbose event is assigned the same parent
# uuid of the higher level event.
# e.g.
# E1
# E2
# verbose
# verbose <- we are on this event currently
# E4
# We'll compare E2 and E4, and the verbose event
# will be assigned the parent uuid of E4 (higher event level)
if e['event'] in JobJobEventsChildrenSummary.meta_events:
event_level_before = models.JobEvent.LEVEL_FOR_EVENT[prev_non_meta_event['event']]
# find next non meta event
z = i
next_non_meta_event = events[-1]
while z < len(events):
if events[z]['event'] not in JobJobEventsChildrenSummary.meta_events:
next_non_meta_event = events[z]
break
z += 1
event_level_after = models.JobEvent.LEVEL_FOR_EVENT[next_non_meta_event['event']]
if event_level_after and event_level_after > event_level_before:
puuid = next_non_meta_event['parent_uuid']
else:
puuid = prev_non_meta_event['parent_uuid']
if puuid:
map_meta_counter_nested_uuid[e['counter']] = puuid
map_counter_children_tally[e['counter']]['rowNumber'] = i
if not puuid:
continue
# now traverse up the parent, grandparent, etc. events and tally those
while puuid:
map_counter_children_tally[map_uuid_counter[puuid]]['numChildren'] += 1
puuid = map_uuid_puuid.get(puuid, None)

# create new dictionary, dropping events with 0 children
resp["children_summary"] = {k: v for k, v in map_counter_children_tally.items() if v['numChildren'] != 0}
resp["meta_event_nested_uuid"] = map_meta_counter_nested_uuid
return Response(resp)

class AdHocCommandList(ListCreateAPIView):

model = models.AdHocCommand

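The tally loop in JobJobEventsChildrenSummary above walks each event's ancestor chain and increments a counter per ancestor, so numChildren ends up counting children, grandchildren, and so on. A small self-contained sketch of that idea on invented data follows (field names mirror the view, but the events are made up; the real endpoint also reassigns verbose/meta events to a parent and drops zero-child entries).

# Toy illustration of the ancestor-tally idea; the event data is invented.
events = [
    {'counter': 1, 'uuid': 'a', 'parent_uuid': ''},
    {'counter': 2, 'uuid': 'b', 'parent_uuid': 'a'},
    {'counter': 3, 'uuid': 'c', 'parent_uuid': 'b'},
    {'counter': 4, 'uuid': 'd', 'parent_uuid': 'a'},
]
uuid_to_counter = {e['uuid']: e['counter'] for e in events}
uuid_to_parent = {e['uuid']: e['parent_uuid'] for e in events}
tally = {e['counter']: 0 for e in events}

for e in events:
    puuid = e['parent_uuid']
    while puuid:  # walk up through parent, grandparent, ... and count this event once for each
        tally[uuid_to_counter[puuid]] += 1
        puuid = uuid_to_parent.get(puuid, '')

print(tally)  # {1: 3, 2: 1, 3: 0, 4: 0}: event 'a' has three descendants in total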
@ -8,7 +8,7 @@ import logging
|
||||
from django.conf import settings
|
||||
from django.db.models import Q
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework.exceptions import PermissionDenied
|
||||
@ -16,17 +16,21 @@ from rest_framework.response import Response
|
||||
from rest_framework import status
|
||||
|
||||
# AWX
|
||||
from awx.main.models import (
|
||||
ActivityStream,
|
||||
Inventory,
|
||||
JobTemplate,
|
||||
Role,
|
||||
User,
|
||||
InstanceGroup,
|
||||
InventoryUpdateEvent,
|
||||
InventoryUpdate,
|
||||
from awx.main.models import ActivityStream, Inventory, JobTemplate, Role, User, InstanceGroup, InventoryUpdateEvent, InventoryUpdate
|
||||
|
||||
from awx.main.models.label import Label
|
||||
|
||||
from awx.api.generics import (
|
||||
ListCreateAPIView,
|
||||
RetrieveUpdateDestroyAPIView,
|
||||
SubListAPIView,
|
||||
SubListAttachDetachAPIView,
|
||||
ResourceAccessList,
|
||||
CopyAPIView,
|
||||
DeleteLastUnattachLabelMixin,
|
||||
SubListCreateAttachDetachAPIView,
|
||||
)
|
||||
from awx.api.generics import ListCreateAPIView, RetrieveUpdateDestroyAPIView, SubListAPIView, SubListAttachDetachAPIView, ResourceAccessList, CopyAPIView
|
||||
|
||||
|
||||
from awx.api.serializers import (
|
||||
InventorySerializer,
|
||||
@ -35,6 +39,7 @@ from awx.api.serializers import (
|
||||
InstanceGroupSerializer,
|
||||
InventoryUpdateEventSerializer,
|
||||
JobTemplateSerializer,
|
||||
LabelSerializer,
|
||||
)
|
||||
from awx.api.views.mixin import RelatedJobsPreventDeleteMixin, ControlledByScmMixin
|
||||
|
||||
@ -152,6 +157,30 @@ class InventoryJobTemplateList(SubListAPIView):
|
||||
return qs.filter(inventory=parent)
|
||||
|
||||
|
||||
class InventoryLabelList(DeleteLastUnattachLabelMixin, SubListCreateAttachDetachAPIView, SubListAPIView):
|
||||
|
||||
model = Label
|
||||
serializer_class = LabelSerializer
|
||||
parent_model = Inventory
|
||||
relationship = 'labels'
|
||||
|
||||
def post(self, request, *args, **kwargs):
|
||||
# If a label already exists in the database, attach it instead of erroring out
|
||||
# that it already exists
|
||||
if 'id' not in request.data and 'name' in request.data and 'organization' in request.data:
|
||||
existing = Label.objects.filter(name=request.data['name'], organization_id=request.data['organization'])
|
||||
if existing.exists():
|
||||
existing = existing[0]
|
||||
request.data['id'] = existing.id
|
||||
del request.data['name']
|
||||
del request.data['organization']
|
||||
if Label.objects.filter(inventory_labels=self.kwargs['pk']).count() > 100:
|
||||
return Response(
|
||||
dict(msg=_('Maximum number of labels for {} reached.'.format(self.parent_model._meta.verbose_name_raw))), status=status.HTTP_400_BAD_REQUEST
|
||||
)
|
||||
return super(InventoryLabelList, self).post(request, *args, **kwargs)
|
||||
|
||||
|
||||
class InventoryCopy(CopyAPIView):
|
||||
|
||||
model = Inventory
|
||||
|
||||
25
awx/api/views/mesh_visualizer.py
Normal file
25
awx/api/views/mesh_visualizer.py
Normal file
@ -0,0 +1,25 @@
|
||||
# Copyright (c) 2018 Red Hat, Inc.
|
||||
# All Rights Reserved.
|
||||
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from awx.api.generics import APIView, Response
|
||||
from awx.api.permissions import IsSystemAdminOrAuditor
|
||||
from awx.api.serializers import InstanceLinkSerializer, InstanceNodeSerializer
|
||||
from awx.main.models import InstanceLink, Instance
|
||||
|
||||
|
||||
class MeshVisualizer(APIView):
|
||||
|
||||
name = _("Mesh Visualizer")
|
||||
permission_classes = (IsSystemAdminOrAuditor,)
|
||||
swagger_topic = "System Configuration"
|
||||
|
||||
def get(self, request, format=None):
|
||||
|
||||
data = {
|
||||
'nodes': InstanceNodeSerializer(Instance.objects.all(), many=True).data,
|
||||
'links': InstanceLinkSerializer(InstanceLink.objects.select_related('target', 'source'), many=True).data,
|
||||
}
|
||||
|
||||
return Response(data)
|
||||
@ -5,7 +5,7 @@
|
||||
import logging
|
||||
|
||||
# Django
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework.response import Response
|
||||
|
||||
@ -8,7 +8,7 @@ from django.db.models import Count
|
||||
from django.db import transaction
|
||||
from django.shortcuts import get_object_or_404
|
||||
from django.utils.timezone import now
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from rest_framework.permissions import SAFE_METHODS
|
||||
from rest_framework.exceptions import PermissionDenied
|
||||
@ -68,49 +68,27 @@ class InstanceGroupMembershipMixin(object):
|
||||
membership.
|
||||
"""
|
||||
|
||||
def attach_validate(self, request):
|
||||
parent = self.get_parent_object()
|
||||
sub_id, res = super().attach_validate(request)
|
||||
if res: # handle an error
|
||||
return sub_id, res
|
||||
sub = get_object_or_400(self.model, pk=sub_id)
|
||||
attach_errors = self.is_valid_relation(parent, sub)
|
||||
if attach_errors:
|
||||
return sub_id, Response(attach_errors, status=status.HTTP_400_BAD_REQUEST)
|
||||
return sub_id, res
|
||||
|
||||
def attach(self, request, *args, **kwargs):
|
||||
response = super(InstanceGroupMembershipMixin, self).attach(request, *args, **kwargs)
|
||||
sub_id, res = self.attach_validate(request)
|
||||
if status.is_success(response.status_code):
|
||||
sub_id = request.data.get('id', None)
|
||||
if self.parent_model is Instance:
|
||||
inst_name = self.get_parent_object().hostname
|
||||
else:
|
||||
inst_name = get_object_or_400(self.model, pk=sub_id).hostname
|
||||
with transaction.atomic():
|
||||
ig_qs = InstanceGroup.objects.select_for_update()
|
||||
instance_groups_queryset = InstanceGroup.objects.select_for_update()
|
||||
if self.parent_model is Instance:
|
||||
ig_obj = get_object_or_400(ig_qs, pk=sub_id)
|
||||
ig_obj = get_object_or_400(instance_groups_queryset, pk=sub_id)
|
||||
else:
|
||||
# similar to get_parent_object, but selected for update
|
||||
parent_filter = {self.lookup_field: self.kwargs.get(self.lookup_field, None)}
|
||||
ig_obj = get_object_or_404(ig_qs, **parent_filter)
|
||||
ig_obj = get_object_or_404(instance_groups_queryset, **parent_filter)
|
||||
if inst_name not in ig_obj.policy_instance_list:
|
||||
ig_obj.policy_instance_list.append(inst_name)
|
||||
ig_obj.save(update_fields=['policy_instance_list'])
|
||||
return response
|
||||
|
||||
def unattach_validate(self, request):
|
||||
parent = self.get_parent_object()
|
||||
(sub_id, res) = super(InstanceGroupMembershipMixin, self).unattach_validate(request)
|
||||
if res:
|
||||
return (sub_id, res)
|
||||
sub = get_object_or_400(self.model, pk=sub_id)
|
||||
attach_errors = self.is_valid_relation(parent, sub)
|
||||
if attach_errors:
|
||||
return (sub_id, Response(attach_errors, status=status.HTTP_400_BAD_REQUEST))
|
||||
return (sub_id, res)
|
||||
|
||||
def unattach(self, request, *args, **kwargs):
|
||||
response = super(InstanceGroupMembershipMixin, self).unattach(request, *args, **kwargs)
|
||||
if status.is_success(response.status_code):
|
||||
@ -120,13 +98,13 @@ class InstanceGroupMembershipMixin(object):
|
||||
else:
|
||||
inst_name = get_object_or_400(self.model, pk=sub_id).hostname
|
||||
with transaction.atomic():
|
||||
ig_qs = InstanceGroup.objects.select_for_update()
|
||||
instance_groups_queryset = InstanceGroup.objects.select_for_update()
|
||||
if self.parent_model is Instance:
|
||||
ig_obj = get_object_or_400(ig_qs, pk=sub_id)
|
||||
ig_obj = get_object_or_400(instance_groups_queryset, pk=sub_id)
|
||||
else:
|
||||
# similar to get_parent_object, but selected for update
|
||||
parent_filter = {self.lookup_field: self.kwargs.get(self.lookup_field, None)}
|
||||
ig_obj = get_object_or_404(ig_qs, **parent_filter)
|
||||
ig_obj = get_object_or_404(instance_groups_queryset, **parent_filter)
|
||||
if inst_name in ig_obj.policy_instance_list:
|
||||
ig_obj.policy_instance_list.pop(ig_obj.policy_instance_list.index(inst_name))
|
||||
ig_obj.save(update_fields=['policy_instance_list'])
|
||||
|
||||
@ -7,7 +7,7 @@ import logging
|
||||
# Django
|
||||
from django.db.models import Count
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# AWX
|
||||
from awx.main.models import (
|
||||
|
||||
@ -8,11 +8,11 @@ import operator
|
||||
from collections import OrderedDict
|
||||
|
||||
from django.conf import settings
|
||||
from django.utils.encoding import smart_text
|
||||
from django.utils.encoding import smart_str
|
||||
from django.utils.decorators import method_decorator
|
||||
from django.views.decorators.csrf import ensure_csrf_cookie
|
||||
from django.template.loader import render_to_string
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
from rest_framework.permissions import AllowAny, IsAuthenticated
|
||||
from rest_framework.response import Response
|
||||
@ -123,6 +123,7 @@ class ApiVersionRootView(APIView):
|
||||
data['workflow_approvals'] = reverse('api:workflow_approval_list', request=request)
|
||||
data['workflow_job_template_nodes'] = reverse('api:workflow_job_template_node_list', request=request)
|
||||
data['workflow_job_nodes'] = reverse('api:workflow_job_node_list', request=request)
|
||||
data['mesh_visualizer'] = reverse('api:mesh_visualizer_view', request=request)
|
||||
return Response(data)
|
||||
|
||||
|
||||
@ -149,13 +150,13 @@ class ApiV2PingView(APIView):
|
||||
response = {'ha': is_ha_environment(), 'version': get_awx_version(), 'active_node': settings.CLUSTER_HOST_ID, 'install_uuid': settings.INSTALL_UUID}
|
||||
|
||||
response['instances'] = []
|
||||
for instance in Instance.objects.all():
|
||||
for instance in Instance.objects.exclude(node_type='hop'):
|
||||
response['instances'].append(
|
||||
dict(
|
||||
node=instance.hostname,
|
||||
node_type=instance.node_type,
|
||||
uuid=instance.uuid,
|
||||
heartbeat=instance.modified,
|
||||
heartbeat=instance.last_seen,
|
||||
capacity=instance.capacity,
|
||||
version=instance.version,
|
||||
)
|
||||
@ -204,7 +205,7 @@ class ApiV2SubscriptionView(APIView):
|
||||
elif isinstance(exc, (ValueError, OSError)) and exc.args:
|
||||
msg = exc.args[0]
|
||||
else:
|
||||
logger.exception(smart_text(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
logger.exception(smart_str(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
return Response({"error": msg}, status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
return Response(validated)
|
||||
@ -245,7 +246,7 @@ class ApiV2AttachView(APIView):
|
||||
elif isinstance(exc, (ValueError, OSError)) and exc.args:
|
||||
msg = exc.args[0]
|
||||
else:
|
||||
logger.exception(smart_text(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
logger.exception(smart_str(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
return Response({"error": msg}, status=status.HTTP_400_BAD_REQUEST)
|
||||
for sub in validated:
|
||||
if sub['pool_id'] == pool_id:
|
||||
@ -321,7 +322,7 @@ class ApiV2ConfigView(APIView):
|
||||
try:
|
||||
data_actual = json.dumps(request.data)
|
||||
except Exception:
|
||||
logger.info(smart_text(u"Invalid JSON submitted for license."), extra=dict(actor=request.user.username))
|
||||
logger.info(smart_str(u"Invalid JSON submitted for license."), extra=dict(actor=request.user.username))
|
||||
return Response({"error": _("Invalid JSON")}, status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
license_data = json.loads(data_actual)
|
||||
@ -345,7 +346,7 @@ class ApiV2ConfigView(APIView):
|
||||
try:
|
||||
license_data_validated = get_licenser().license_from_manifest(license_data)
|
||||
except Exception:
|
||||
logger.warning(smart_text(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
logger.warning(smart_str(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
return Response({"error": _("Invalid License")}, status=status.HTTP_400_BAD_REQUEST)
|
||||
else:
|
||||
license_data_validated = get_licenser().validate()
|
||||
@ -356,7 +357,7 @@ class ApiV2ConfigView(APIView):
|
||||
settings.TOWER_URL_BASE = "{}://{}".format(request.scheme, request.get_host())
|
||||
return Response(license_data_validated)
|
||||
|
||||
logger.warning(smart_text(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
logger.warning(smart_str(u"Invalid subscription submitted."), extra=dict(actor=request.user.username))
|
||||
return Response({"error": _("Invalid subscription")}, status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
def delete(self, request):
|
||||
|
||||
@ -4,7 +4,7 @@ import logging
|
||||
import urllib.parse
|
||||
|
||||
from django.utils.encoding import force_bytes
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
from django.views.decorators.csrf import csrf_exempt
|
||||
|
||||
from rest_framework import status
|
||||
@ -16,7 +16,7 @@ from awx.api import serializers
|
||||
from awx.api.generics import APIView, GenericAPIView
|
||||
from awx.api.permissions import WebhookKeyPermission
|
||||
from awx.main.models import Job, JobTemplate, WorkflowJob, WorkflowJobTemplate
|
||||
|
||||
from awx.main.constants import JOB_VARIABLE_PREFIXES
|
||||
|
||||
logger = logging.getLogger('awx.api.views.webhooks')
|
||||
|
||||
@ -136,15 +136,16 @@ class WebhookReceiverBase(APIView):
|
||||
'webhook_credential': obj.webhook_credential,
|
||||
'webhook_guid': event_guid,
|
||||
},
|
||||
'extra_vars': {
|
||||
'tower_webhook_event_type': event_type,
|
||||
'tower_webhook_event_guid': event_guid,
|
||||
'tower_webhook_event_ref': event_ref,
|
||||
'tower_webhook_status_api': status_api,
|
||||
'tower_webhook_payload': request.data,
|
||||
},
|
||||
'extra_vars': {},
|
||||
}
|
||||
|
||||
for name in JOB_VARIABLE_PREFIXES:
|
||||
kwargs['extra_vars']['{}_webhook_event_type'.format(name)] = event_type
|
||||
kwargs['extra_vars']['{}_webhook_event_guid'.format(name)] = event_guid
|
||||
kwargs['extra_vars']['{}_webhook_event_ref'.format(name)] = event_ref
|
||||
kwargs['extra_vars']['{}_webhook_status_api'.format(name)] = status_api
|
||||
kwargs['extra_vars']['{}_webhook_payload'.format(name)] = request.data
|
||||
|
||||
new_job = obj.create_unified_job(**kwargs)
|
||||
new_job.signal_start()
|
||||
|
||||
|
||||
@ -7,8 +7,6 @@ from django.utils.module_loading import autodiscover_modules
|
||||
# AWX
|
||||
from .registry import settings_registry
|
||||
|
||||
default_app_config = 'awx.conf.apps.ConfConfig'
|
||||
|
||||
|
||||
def register(setting, **kwargs):
|
||||
settings_registry.register(setting, **kwargs)
|
||||
|
||||
@ -1,8 +1,10 @@
|
||||
import sys
|
||||
|
||||
# Django
|
||||
from django.apps import AppConfig
|
||||
|
||||
# from django.core import checks
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
|
||||
class ConfConfig(AppConfig):
|
||||
@ -12,6 +14,9 @@ class ConfConfig(AppConfig):
|
||||
|
||||
def ready(self):
|
||||
self.module.autodiscover()
|
||||
from .settings import SettingsWrapper
|
||||
|
||||
SettingsWrapper.initialize()
|
||||
if not set(sys.argv) & {'migrate', 'check_migrations'}:
|
||||
|
||||
from .settings import SettingsWrapper
|
||||
|
||||
SettingsWrapper.initialize()
|
||||
|
||||
@ -1,6 +1,6 @@
|
||||
# Django
|
||||
from django.conf import settings
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# AWX
|
||||
from awx.conf import fields, register
|
||||
|
||||
@ -7,12 +7,15 @@ from collections import OrderedDict
|
||||
|
||||
# Django
|
||||
from django.core.validators import URLValidator, _lazy_re_compile
|
||||
from django.utils.translation import ugettext_lazy as _
|
||||
from django.utils.translation import gettext_lazy as _
|
||||
|
||||
# Django REST Framework
|
||||
from rest_framework.fields import BooleanField, CharField, ChoiceField, DictField, DateTimeField, EmailField, IntegerField, ListField, NullBooleanField # noqa
|
||||
from rest_framework.fields import BooleanField, CharField, ChoiceField, DictField, DateTimeField, EmailField, IntegerField, ListField # noqa
|
||||
from rest_framework.serializers import PrimaryKeyRelatedField # noqa
|
||||
|
||||
# AWX
|
||||
from awx.main.constants import CONTAINER_VOLUMES_MOUNT_TYPES, MAX_ISOLATED_PATH_COLON_DELIMITER
|
||||
|
||||
logger = logging.getLogger('awx.conf.fields')
|
||||
|
||||
# Use DRF fields to convert/validate settings:
|
||||
@ -62,11 +65,11 @@ class StringListBooleanField(ListField):
|
||||
try:
|
||||
if isinstance(value, (list, tuple)):
|
||||
return super(StringListBooleanField, self).to_representation(value)
|
||||
elif value in NullBooleanField.TRUE_VALUES:
|
||||
elif value in BooleanField.TRUE_VALUES:
|
||||
return True
|
||||
elif value in NullBooleanField.FALSE_VALUES:
|
||||
elif value in BooleanField.FALSE_VALUES:
|
||||
return False
|
||||
elif value in NullBooleanField.NULL_VALUES:
|
||||
elif value in BooleanField.NULL_VALUES:
|
||||
return None
|
||||
elif isinstance(value, str):
|
||||
return self.child.to_representation(value)
|
||||
@ -79,11 +82,11 @@ class StringListBooleanField(ListField):
|
||||
try:
|
||||
if isinstance(data, (list, tuple)):
|
||||
return super(StringListBooleanField, self).to_internal_value(data)
|
||||
elif data in NullBooleanField.TRUE_VALUES:
|
||||
elif data in BooleanField.TRUE_VALUES:
|
||||
return True
|
||||
elif data in NullBooleanField.FALSE_VALUES:
|
||||
elif data in BooleanField.FALSE_VALUES:
|
||||
return False
|
||||
elif data in NullBooleanField.NULL_VALUES:
|
||||
elif data in BooleanField.NULL_VALUES:
|
||||
return None
|
||||
elif isinstance(data, str):
|
||||
return self.child.run_validation(data)
|
||||
@ -109,6 +112,49 @@ class StringListPathField(StringListField):
|
||||
self.fail('type_error', input_type=type(paths))
|
||||
|
||||
|
||||
class StringListIsolatedPathField(StringListField):
|
||||
# Valid formats
|
||||
# '/etc/pki/ca-trust'
|
||||
# '/etc/pki/ca-trust:/etc/pki/ca-trust'
|
||||
# '/etc/pki/ca-trust:/etc/pki/ca-trust:O'
|
||||
|
||||
default_error_messages = {
|
||||
'type_error': _('Expected list of strings but got {input_type} instead.'),
|
||||
'path_error': _('{path} is not a valid path choice. You must provide an absolute path.'),
|
||||
'mount_error': _('{scontext} is not a valid mount option. Allowed types are {mount_types}'),
|
||||
'syntax_error': _('Invalid syntax. A string HOST-DIR[:CONTAINER-DIR[:OPTIONS]] is expected but got {path}.'),
|
||||
}
|
||||
|
||||
def to_internal_value(self, paths):
|
||||
|
||||
if isinstance(paths, (list, tuple)):
|
||||
for p in paths:
|
||||
if not isinstance(p, str):
|
||||
self.fail('type_error', input_type=type(p))
|
||||
if not p.startswith('/'):
|
||||
self.fail('path_error', path=p)
|
||||
|
||||
if p.count(':'):
|
||||
if p.count(':') > MAX_ISOLATED_PATH_COLON_DELIMITER:
|
||||
self.fail('syntax_error', path=p)
|
||||
try:
|
||||
src, dest, scontext = p.split(':')
|
||||
except ValueError:
|
||||
scontext = 'z'
|
||||
src, dest = p.split(':')
|
||||
finally:
|
||||
for sp in [src, dest]:
|
||||
if not len(sp):
|
||||
self.fail('syntax_error', path=sp)
|
||||
if not sp.startswith('/'):
|
||||
self.fail('path_error', path=sp)
|
||||
if scontext not in CONTAINER_VOLUMES_MOUNT_TYPES:
|
||||
self.fail('mount_error', scontext=scontext, mount_types=CONTAINER_VOLUMES_MOUNT_TYPES)
|
||||
return super(StringListIsolatedPathField, self).to_internal_value(sorted(paths))
|
||||
else:
|
||||
self.fail('type_error', input_type=type(paths))
|
||||
|
||||
|
||||
class URLField(CharField):
|
||||
# these lines set up a custom regex that allow numbers in the
|
||||
# top-level domain
|
||||
|
||||
@ -2,9 +2,10 @@
|
||||
from __future__ import unicode_literals
|
||||
|
||||
from django.db import migrations, models
|
||||
import jsonfield.fields
|
||||
from django.conf import settings
|
||||
|
||||
import awx.main.fields
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
@ -18,7 +19,7 @@ class Migration(migrations.Migration):
|
||||
('created', models.DateTimeField(default=None, editable=False)),
|
||||
('modified', models.DateTimeField(default=None, editable=False)),
|
||||
('key', models.CharField(max_length=255)),
|
||||
('value', jsonfield.fields.JSONField(null=True)),
|
||||
('value', awx.main.fields.JSONBlob(null=True)),
|
||||
(
|
||||
'user',
|
||||
models.ForeignKey(related_name='settings', default=None, editable=False, to=settings.AUTH_USER_MODEL, on_delete=models.CASCADE, null=True),
|
||||
|
||||
@@ -2,6 +2,7 @@
 from __future__ import unicode_literals
 
 from django.db import migrations
 
 import awx.main.fields
 
+
@@ -9,4 +10,4 @@ class Migration(migrations.Migration):
 
     dependencies = [('conf', '0002_v310_copy_tower_settings')]
 
-    operations = [migrations.AlterField(model_name='setting', name='value', field=awx.main.fields.JSONField(null=True))]
+    operations = [migrations.AlterField(model_name='setting', name='value', field=awx.main.fields.JSONBlob(null=True))]

@@ -5,7 +5,7 @@ from django.utils.timezone import now
 
 
 def fill_ldap_group_type_params(apps, schema_editor):
-    group_type = settings.AUTH_LDAP_GROUP_TYPE
+    group_type = getattr(settings, 'AUTH_LDAP_GROUP_TYPE', None)
     Setting = apps.get_model('conf', 'Setting')
 
     group_type_params = {'name_attr': 'cn', 'member_attr': 'member'}
@@ -17,7 +17,7 @@ def fill_ldap_group_type_params(apps, schema_editor):
     else:
         entry = Setting(key='AUTH_LDAP_GROUP_TYPE_PARAMS', value=group_type_params, created=now(), modified=now())
 
-    init_attrs = set(inspect.getargspec(group_type.__init__).args[1:])
+    init_attrs = set(inspect.getfullargspec(group_type.__init__).args[1:])
     for k in list(group_type_params.keys()):
         if k not in init_attrs:
             del group_type_params[k]

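This migration keeps only the keyword arguments that the configured group type's __init__ actually accepts, and the diff swaps the Python 2-era inspect.getargspec for inspect.getfullargspec. A minimal sketch of that filtering pattern against a made-up class (ExampleGroupType is illustrative, not an LDAP group type):

```python
import inspect


class ExampleGroupType:
    # stand-in for an LDAP group type class; only 'name_attr' is accepted here
    def __init__(self, name_attr='cn'):
        self.name_attr = name_attr


params = {'name_attr': 'cn', 'member_attr': 'member'}

# getfullargspec replaces the removed getargspec on Python 3;
# args[1:] skips 'self', leaving the keyword parameters __init__ accepts
init_attrs = set(inspect.getfullargspec(ExampleGroupType.__init__).args[1:])
filtered = {k: v for k, v in params.items() if k in init_attrs}
print(filtered)  # {'name_attr': 'cn'}
```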
@@ -8,8 +8,8 @@ import json
 from django.db import models
 
 # AWX
+from awx.main.fields import JSONBlob
 from awx.main.models.base import CreatedModifiedModel, prevent_search
-from awx.main.fields import JSONField
 from awx.main.utils import encrypt_field
 from awx.conf import settings_registry
 
@@ -19,7 +19,7 @@ __all__ = ['Setting']
 class Setting(CreatedModifiedModel):
 
     key = models.CharField(max_length=255)
-    value = JSONField(null=True)
+    value = JSONBlob(null=True)
     user = prevent_search(models.ForeignKey('auth.User', related_name='settings', default=None, null=True, editable=False, on_delete=models.CASCADE))
 
     def __str__(self):

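Setting.value moves from the third-party jsonfield JSONField to AWX's own JSONBlob, and the migrations above are updated to match. As a rough mental model only (an assumption for illustration, not JSONBlob's actual implementation), a field that stores JSON-encoded data in a text column can be built on Django's standard custom-field hooks:

```python
# Rough mental model only: a custom field that keeps JSON-encoded data in a
# text column. This is an illustrative assumption, not AWX's JSONBlob.
import json

from django.db import models


class JSONTextField(models.TextField):
    def from_db_value(self, value, expression, connection):
        # decode what comes back from the database
        return None if value is None else json.loads(value)

    def to_python(self, value):
        # deserialization for forms/fixtures; pass through already-decoded values
        if value is None or not isinstance(value, str):
            return value
        return json.loads(value)

    def get_prep_value(self, value):
        # encode Python objects before they are written to the database
        return None if value is None else json.dumps(value)
```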
@@ -8,7 +8,7 @@ import logging
 # Django
 from django.core.exceptions import ImproperlyConfigured
 from django.utils.text import slugify
-from django.utils.translation import ugettext_lazy as _
+from django.utils.translation import gettext_lazy as _
 
 from awx.conf.license import get_license
 
@@ -1,7 +1,6 @@
 # Python
 import contextlib
 import logging
-import sys
 import threading
 import time
 import os
@@ -31,7 +30,7 @@ from awx.conf.models import Setting
 
 logger = logging.getLogger('awx.conf.settings')
 
-SETTING_MEMORY_TTL = 5 if 'callback_receiver' in ' '.join(sys.argv) else 0
+SETTING_MEMORY_TTL = 5
 
 # Store a special value to indicate when a setting is not set in the database.
 SETTING_CACHE_NOTSET = '___notset___'

@@ -81,17 +80,16 @@ def _ctit_db_wrapper(trans_safe=False):
         yield
     except DBError as exc:
         if trans_safe:
-            if 'migrate' not in sys.argv and 'check_migrations' not in sys.argv:
-                level = logger.exception
-                if isinstance(exc, ProgrammingError):
-                    if 'relation' in str(exc) and 'does not exist' in str(exc):
-                        # this generally means we can't fetch Tower configuration
-                        # because the database hasn't actually finished migrating yet;
-                        # this is usually a sign that a service in a container (such as ws_broadcast)
-                        # has come up *before* the database has finished migrating, and
-                        # especially that the conf.settings table doesn't exist yet
-                        level = logger.debug
-                level('Database settings are not available, using defaults.')
+            level = logger.exception
+            if isinstance(exc, ProgrammingError):
+                if 'relation' in str(exc) and 'does not exist' in str(exc):
+                    # this generally means we can't fetch Tower configuration
+                    # because the database hasn't actually finished migrating yet;
+                    # this is usually a sign that a service in a container (such as ws_broadcast)
+                    # has come up *before* the database has finished migrating, and
+                    # especially that the conf.settings table doesn't exist yet
+                    level = logger.debug
+            level('Database settings are not available, using defaults.')
         else:
             logger.exception('Error modifying something related to database settings.')
     finally:

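The comment block above explains why the wrapper downgrades to logger.debug when the settings relation is missing: a service can come up before migrations finish, and that situation should not produce a full traceback. A generic sketch of the same fallback pattern (illustrative names and a broad exception clause, not AWX's wrapper):

```python
# Generic sketch of the fallback pattern described above (not AWX's code):
# treat "relation ... does not exist" as "migrations have not run yet" and
# log quietly, otherwise log a full traceback; either way fall back to defaults.
import contextlib
import logging

logger = logging.getLogger(__name__)


@contextlib.contextmanager
def db_settings_fallback():
    try:
        yield
    except Exception as exc:  # stand-in for django.db.Error / ProgrammingError
        if 'relation' in str(exc) and 'does not exist' in str(exc):
            level = logger.debug      # table missing: probably mid-migration, keep quiet
        else:
            level = logger.exception  # anything else deserves a traceback
        level('Database settings are not available, using defaults.')


# usage: lookups inside the block silently fall back on database errors
with db_settings_fallback():
    pass  # e.g. value = read_setting_from_db('FOO')
```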
@@ -235,6 +233,8 @@ class SettingsWrapper(UserSettingsHolder):
         self.__dict__['_awx_conf_init_readonly'] = False
         self.__dict__['cache'] = EncryptedCacheProxy(cache, registry)
         self.__dict__['registry'] = registry
+        self.__dict__['_awx_conf_memoizedcache'] = cachetools.TTLCache(maxsize=2048, ttl=SETTING_MEMORY_TTL)
+        self.__dict__['_awx_conf_memoizedcache_lock'] = threading.Lock()
 
         # record the current pid so we compare it post-fork for
         # processes like the dispatcher and callback receiver

@@ -397,12 +397,20 @@ class SettingsWrapper(UserSettingsHolder):
     def SETTINGS_MODULE(self):
         return self._get_default('SETTINGS_MODULE')
 
-    @cachetools.cached(cache=cachetools.TTLCache(maxsize=2048, ttl=SETTING_MEMORY_TTL))
+    @cachetools.cachedmethod(
+        cache=lambda self: self.__dict__['_awx_conf_memoizedcache'],
+        key=lambda *args, **kwargs: SettingsWrapper.hashkey(*args, **kwargs),
+        lock=lambda self: self.__dict__['_awx_conf_memoizedcache_lock'],
+    )
+    def _get_local_with_cache(self, name):
+        """Get value while accepting the in-memory cache if key is available"""
+        with _ctit_db_wrapper(trans_safe=True):
+            return self._get_local(name)
+
     def __getattr__(self, name):
         value = empty
         if name in self.all_supported_settings:
-            with _ctit_db_wrapper(trans_safe=True):
-                value = self._get_local(name)
+            value = self._get_local_with_cache(name)
         if value is not empty:
             return value
         return self._get_default(name)

@@ -476,6 +484,23 @@ class SettingsWrapper(UserSettingsHolder):
         set_on_default = getattr(self.default_settings, 'is_overridden', lambda s: False)(setting)
         return set_locally or set_on_default
 
+    @classmethod
+    def hashkey(cls, *args, **kwargs):
+        """
+        Usage of @cachetools.cached has changed to @cachetools.cachedmethod.
+        The previous cachetools decorator called the hash function and passed in (self, key).
+        The new cachetools decorator calls the hash function with just (key).
+        Ideally, we would continue to pass self, however, the cachetools decorator interface
+        does not allow us to.
+
+        This hashkey function is to maintain that the key generated looks like
+        ('<SettingsWrapper>', key). The thought is that maybe it is important to namespace
+        our cache to the SettingsWrapper scope in case some other usage of this cache exists.
+        I cannot think of how any other system could and would use our private cache, but
+        for safety's sake we are ensuring the key schema does not change.
+        """
+        return cachetools.keys.hashkey(f"<{cls.__name__}>", *args, **kwargs)
+
+
     def __getattr_without_cache__(self, name):
         # Django 1.10 added an optimization to settings lookup:
 
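The docstring describes the move from a class-level @cachetools.cached, which shared one TTLCache across every instance and passed self to the key function, to @cachetools.cachedmethod, which resolves the cache, key, and lock per instance. A minimal sketch of that pattern on a throwaway class (CachedLookup is illustrative, not SettingsWrapper):

```python
# Minimal sketch of the cachedmethod pattern adopted above: the cache and lock
# live on each instance, and a custom key function namespaces entries the same
# way SettingsWrapper.hashkey() does. Names here are illustrative only.
import threading

import cachetools
import cachetools.keys


class CachedLookup:
    def __init__(self):
        self._cache = cachetools.TTLCache(maxsize=2048, ttl=5)
        self._lock = threading.Lock()

    @classmethod
    def hashkey(cls, *args, **kwargs):
        # key looks like ('<CachedLookup>', name), mirroring SettingsWrapper.hashkey
        return cachetools.keys.hashkey(f"<{cls.__name__}>", *args, **kwargs)

    @cachetools.cachedmethod(
        cache=lambda self: self._cache,
        key=lambda *args, **kwargs: CachedLookup.hashkey(*args, **kwargs),
        lock=lambda self: self._lock,
    )
    def get(self, name):
        print(f'computing {name}')  # only printed on a cache miss
        return name.upper()


lookup = CachedLookup()
lookup.get('foo')  # miss: prints and stores the result
lookup.get('foo')  # hit within the 5-second TTL: no print
```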
@@ -28,6 +28,9 @@ def handle_setting_change(key, for_delete=False):
     cache_keys = {Setting.get_cache_key(k) for k in setting_keys}
     cache.delete_many(cache_keys)
 
+    # if we have changed a setting, we want to avoid mucking with the in-memory cache entirely
+    settings._awx_conf_memoizedcache.clear()
+
     # Send setting_changed signal with new value for each setting.
     for setting_key in setting_keys:
         setting_changed.send(sender=Setting, setting=setting_key, value=getattr(settings, setting_key, None), enter=not bool(for_delete))

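The handler clears the per-process memoized TTL cache whenever a setting changes, so readers cannot keep serving a stale value for the rest of the TTL window. A tiny sketch of that invalidation idea with made-up names (not AWX's signal wiring):

```python
# Tiny illustrative sketch: writes clear the whole in-memory TTL cache so the
# next read goes back to the source of truth. Names are placeholders.
import cachetools

memoized = cachetools.TTLCache(maxsize=2048, ttl=5)


def read_setting(name, loader):
    if name not in memoized:
        memoized[name] = loader(name)
    return memoized[name]


def handle_setting_change(name, new_value, store):
    store[name] = new_value
    memoized.clear()  # mirrors settings._awx_conf_memoizedcache.clear()


store = {'GREETING': 'hello'}
print(read_setting('GREETING', store.get))   # cached 'hello'
handle_setting_change('GREETING', 'hi', store)
print(read_setting('GREETING', store.get))   # re-reads and returns 'hi'
```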
@@ -6,7 +6,7 @@ from uuid import uuid4
 from django.conf import LazySettings
 from django.core.cache.backends.locmem import LocMemCache
 from django.core.exceptions import ImproperlyConfigured
-from django.utils.translation import ugettext_lazy as _
+from django.utils.translation import gettext_lazy as _
 from rest_framework.fields import empty
 import pytest
 
Some files were not shown because too many files have changed in this diff.