Compare commits


355 Commits

Author SHA1 Message Date
softwarefactory-project-zuul[bot]
fa1c33da7e Merge pull request #6704 from ryanpetrello/11-0-0-release-version
bump version for 11.0.0

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-16 13:41:56 +00:00
Ryan Petrello
8ed5964871 bump version for 11.0.0 2020-04-15 22:10:12 -04:00
softwarefactory-project-zuul[bot]
a989c624c7 Merge pull request #6724 from chrismeyersfsu/fix-redis_not_registering_disconnect
reconnect when a vanilla server disconnect happens

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 23:38:47 +00:00
chris meyers
7f01de26a1 reconnect when a vanilla server disconnect happens 2020-04-15 19:02:33 -04:00
softwarefactory-project-zuul[bot]
e3b5d64aa7 Merge pull request #6722 from wenottingham/over-the-ramparts-we-no-longer-watch
Remove 'rampart' from a user-facing string.

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 21:52:57 +00:00
softwarefactory-project-zuul[bot]
eba0e4fd77 Merge pull request #6710 from rooftopcellist/rsyslog_rename_dir
Rename awx rsyslog socket and PID dir

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 20:43:40 +00:00
softwarefactory-project-zuul[bot]
d3c80eef4d Merge pull request #6560 from mabashian/5865-schedule-edit
Add support for editing proj/jt/wfjt schedule

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 20:21:56 +00:00
softwarefactory-project-zuul[bot]
3683dfab37 Merge pull request #6720 from chrismeyersfsu/feature-wsbroadcast_better_logging
wsbroadcast better logging and behavior

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 19:11:26 +00:00
Bill Nottingham
8e3931de37 Remove 'rampart' from a user-facing string. 2020-04-15 15:00:11 -04:00
mabashian
be0a7a2aa9 Pass route contents as child instead of using render prop 2020-04-15 14:33:35 -04:00
mabashian
d0d8d1c66c Fix schedule edit prop types error thrown on schedule prop 2020-04-15 14:33:35 -04:00
mabashian
8a8a48a4ff Fix prop types on schedule edit 2020-04-15 14:33:35 -04:00
mabashian
b0aa795b10 Remove rogue console.logs 2020-04-15 14:33:35 -04:00
mabashian
017064aecf Adds support for editing proj/jt/wfjt schedule 2020-04-15 14:33:35 -04:00
softwarefactory-project-zuul[bot]
7311ddf722 Merge pull request #6562 from AlexSCorey/6333-SurveyCleanUp
Fixes several things about Survey List

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 18:33:07 +00:00
Christian Adams
69835e9895 Write logs to /dev/null if logging is not enabled 2020-04-15 14:17:21 -04:00
Christian Adams
85960d9035 Volume mount supervisor dir to both containers 2020-04-15 14:11:15 -04:00
Christian Adams
c8ceb62269 Rename awx rsyslog socket and PID dir 2020-04-15 14:11:15 -04:00
chris meyers
1acca459ef nice error message when redis is down
* awx_manage run_wsbroadcast --status now prints a nice error message if
someone failed to start the awx services (i.e. redis)
2020-04-15 13:28:13 -04:00
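A minimal sketch of the friendlier failure mode this commit describes (function and message text are hypothetical, not AWX's actual code): catch the raw connection failure and translate it into a hint about starting the services.

```python
# Hypothetical sketch: turn a raw Redis connection failure into the kind
# of readable hint the commit message describes for --status output.
def describe_broadcast_status(ping):
    """Call ping(); translate a ConnectionError into a friendly message."""
    try:
        ping()
    except ConnectionError:
        return "Error: could not connect to redis. Did you start the awx services?"
    return "websocket broadcast backend: OK"

def down():
    # Stand-in for a redis client whose server is not running.
    raise ConnectionError("connection refused")

print(describe_broadcast_status(down))
```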
Alex Corey
ee6fda9f8a moves validator function 2020-04-15 13:06:30 -04:00
Alex Corey
a95632c349 Adds error handling and validation.
Also addresses small PR issues
2020-04-15 13:06:30 -04:00
Alex Corey
ed3b6385f1 Fixes several things about Survey List
Aligns Select All with other select buttons
Add required asterisk to those items that are required
Adds label for the Default and Question Type column
Adds chips for multiselect items.
Adds RBAC to add and edit survey.
Also fixes a bug where the survey was not reloading properly after edit
2020-04-15 13:06:30 -04:00
softwarefactory-project-zuul[bot]
3518fb0c17 Merge pull request #6717 from ryanpetrello/custom-cred-plugin-instructions
update custom credential plugin docs to point at an example project

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 15:39:21 +00:00
softwarefactory-project-zuul[bot]
1289f141d6 Merge pull request #6716 from beeankha/remove_check_mode_text
Remove 'supports_check_mode' Text from Converted Collection Modules

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 14:29:15 +00:00
Ryan Petrello
8464ec5c49 update custom credential plugin docs to point at an example project 2020-04-15 09:59:09 -04:00
beeankha
3bc5975b90 Remove 'supports_check_mode' text from converted Collection modules 2020-04-15 09:37:54 -04:00
softwarefactory-project-zuul[bot]
af7e9cb533 Merge pull request #6712 from ryanpetrello/tcp-timeout
properly implement TCP timeouts for external log aggregation

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 03:51:22 +00:00
softwarefactory-project-zuul[bot]
af2a8f9831 Merge pull request #6665 from wenottingham/moar-data-plz
Collect information on inventory sources

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-15 00:51:59 +00:00
Bill Nottingham
f99a43ffa6 Collect information on inventory sources
Also remove one minor query from smart inventory collection that will
never return anything.
2020-04-14 19:15:19 -04:00
Ryan Petrello
262d99fde6 properly implement TCP timeouts for external log aggregation
see: https://github.com/ansible/awx/issues/6683
2020-04-14 17:06:30 -04:00
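The gist of a "proper" TCP timeout, sketched with the standard library (the constant and helper are illustrative, not AWX's real settings): the timeout must apply to the connect itself, not just to later reads and writes.

```python
import socket

# Illustrative sketch: apply a timeout to the TCP connect as well as to
# subsequent sends/recvs, instead of blocking indefinitely.
TIMEOUT = 5.0  # seconds; stands in for a LOG_AGGREGATOR timeout setting

def open_aggregator_socket(host, port, timeout=TIMEOUT):
    # socket.create_connection() honors the timeout during connect(),
    # which is the behavior the linked issue asked for.
    return socket.create_connection((host, port), timeout=timeout)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(TIMEOUT)
print(sock.gettimeout())
sock.close()
```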
chris meyers
63f56d33aa show user unsafe name
* We log stats using a safe hostname because of Prometheus requirements.
However, when we display the hostname to users, we should use the
Instance hostname. This change outputs Instance.hostname instead of the
safe Prometheus name.
2020-04-14 16:59:34 -04:00
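The distinction the commit draws, sketched with a regex (the sanitization rule here is an assumption, not AWX's exact implementation): a restricted "safe" name is derived for metrics, while humans should see the real Instance.hostname.

```python
import re

# Sketch only: derive a Prometheus-friendly label from a hostname by
# replacing disallowed characters with "_". The real rule in AWX may differ.
def prometheus_safe(hostname):
    return re.sub(r"[^a-zA-Z0-9_]", "_", hostname)

instance_hostname = "awx-node-1.example.com"
print(prometheus_safe(instance_hostname))  # used internally for stats
print(instance_hostname)                   # what users should see
```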
chris meyers
9cabf3ef4d do not include iso nodes in wsbroadcast status 2020-04-14 16:55:56 -04:00
softwarefactory-project-zuul[bot]
2855be9d26 Merge pull request #6689 from john-westcott-iv/collections_oauth_respect
Make the module util respect oauth token over username/password

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 20:33:53 +00:00
softwarefactory-project-zuul[bot]
2a4912df3e Merge pull request #6706 from ryanpetrello/rsyslog-restart-warn
make rsyslog service restarts a bit less noisy

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 19:03:28 +00:00
chris meyers
daa312d7ee log file for wsbroadcast 2020-04-14 14:21:23 -04:00
Ryan Petrello
e95938715a make rsyslog service restarts a bit less noisy 2020-04-14 14:18:30 -04:00
softwarefactory-project-zuul[bot]
f5d4f7858a Merge pull request #6684 from nixocio/update_ui_docs_naming
Add note about code style for ui_next

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 18:05:22 +00:00
softwarefactory-project-zuul[bot]
25e0efd0b7 Merge pull request #6698 from wenottingham/the-time-zone-is-for-loading-and-saving-only
Cast the start/end times with timezone.

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 17:53:35 +00:00
nixocio
47a007caee Add note about code style for ui_next
Add note about code style for `ui_next`.
2020-04-14 13:16:37 -04:00
Bill Nottingham
cd6d2ed53a Move the comma so unit test can filter things properly. 2020-04-14 13:12:03 -04:00
softwarefactory-project-zuul[bot]
4de61204c4 Merge pull request #6700 from AlanCoding/more_readme
Update AWX collection docs for release 11.0.0

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:55:14 +00:00
John Westcott IV
6b21f2042b Make the module util respect oauth token over username/password 2020-04-14 12:51:45 -04:00
softwarefactory-project-zuul[bot]
7820517734 Merge pull request #6664 from marshmalien/6530-wf-node-jt
Add JT wf node modal prompt details

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:46:20 +00:00
softwarefactory-project-zuul[bot]
2ba1288284 Merge pull request #6695 from ryanpetrello/memcached-cleanup
don't wait on memcached TCP

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:40:52 +00:00
softwarefactory-project-zuul[bot]
149f8a21a6 Merge pull request #6696 from ryanpetrello/rsyslog-splunk-extras
add a few minor logging changes to accommodate Splunk's API

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:40:19 +00:00
softwarefactory-project-zuul[bot]
602f2951b9 Merge pull request #6702 from ryanpetrello/rsyslogd-no-dev-log
rsyslogd: ignore /dev/log when we load imuxsock

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:33:50 +00:00
softwarefactory-project-zuul[bot]
b003f42e22 Merge pull request #6547 from AlexSCorey/6384-ConvertWFJTToHooks
Converts WFJTForm to Formik hooks

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:33:45 +00:00
softwarefactory-project-zuul[bot]
2ee2cd0bd9 Merge pull request #6688 from nixocio/ui_remove_console
Remove console.log

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-14 16:11:03 +00:00
AlanCoding
a79f2ff07a Update AWX collection docs for release 11.0.0 2020-04-14 12:06:26 -04:00
Ryan Petrello
75bb7cce22 don't wait on memcached TCP 2020-04-14 11:45:27 -04:00
Ryan Petrello
52a253ad18 add a few minor logging changes to accommodate Splunk's API
see: https://docs.splunk.com/Documentation/Splunk/8.0.3/Data/UsetheHTTPEventCollector
2020-04-14 11:45:04 -04:00
Ryan Petrello
0f74a05fea rsyslogd: ignore /dev/log when we load imuxsock 2020-04-14 11:34:58 -04:00
Alex Corey
440691387b Puts webhook key on the template object in WFJTEdit
Also adds aria-label to Label Select Options to improve test matchers
Improves the name of the template payload in WFJTAdd and WFJTEdit
Updates tests, including a DeleteConfirmationModal snapshot test that
was failing in devel
2020-04-14 11:11:50 -04:00
Alex Corey
27e6c2d47d Adds tests 2020-04-14 11:11:50 -04:00
Alex Corey
8b69b08991 Adds formik hook functionality to wfjt form 2020-04-14 11:11:50 -04:00
Marliana Lara
8714bde1b4 Wrap entire date/time string in <Trans> tag 2020-04-14 11:08:12 -04:00
Marliana Lara
28b84d0d71 Use delete operator instead of destructuring 2020-04-14 11:08:12 -04:00
Marliana Lara
c6111fface Partition base resource into defaults and overrides 2020-04-14 11:08:12 -04:00
Marliana Lara
98e8a09ad3 Add JT details to wf node modal 2020-04-14 11:08:11 -04:00
nixocio
3f9af8fe69 Remove console.log
Remove console.log
2020-04-14 11:07:52 -04:00
Ryan Petrello
dbe949a2c2 Merge pull request #6697 from chrismeyersfsu/fix-collection_tests
ensure last comma removed in select
2020-04-14 11:05:29 -04:00
Bill Nottingham
a296f64696 Cast the start/end times with timezone. 2020-04-14 10:53:57 -04:00
chris meyers
ee18400a33 ensure last comma removed in select
* We strip out the JSON select fields in our tests since it is a SQLite
database underneath. Ideally, we would do something fancier, but we
aren't. In doing this stripping, we could strip the last element in the
projection list, which would result in an extra dangling comma. This
commit removes the dangling comma from the projection list after the
removal of JSON projections.
2020-04-14 10:44:02 -04:00
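The dangling-comma problem is easy to show with a toy projection stripper (the helper below is hypothetical, not the test suite's real code): naively removing the last column from `SELECT a, b, c FROM t` leaves a trailing comma unless empty fragments are filtered out.

```python
# Hypothetical sketch of stripping JSON columns from a projection list
# without leaving a dangling comma when the *last* column is removed.
def strip_columns(select_sql, drop):
    head, _, tail = select_sql.partition(" FROM ")
    cols = [c.strip() for c in head[len("SELECT "):].split(",")]
    # Filtering out empty fragments ("if c") is what prevents the
    # dangling comma the commit message describes.
    kept = [c for c in cols if c and c not in drop]
    return "SELECT " + ", ".join(kept) + " FROM " + tail

sql = "SELECT id, name, event_data FROM main_jobevent"
print(strip_columns(sql, {"event_data"}))
```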
softwarefactory-project-zuul[bot]
98a4e85db4 Merge pull request #6108 from rooftopcellist/rsyslog
Replace our external logging feature with Rsyslog

Reviewed-by: Ryan Petrello
             https://github.com/ryanpetrello
2020-04-14 13:40:41 +00:00
Ryan Petrello
f7f1bdf9c9 properly configure supervisorctl to point at the web volume mount 2020-04-13 21:56:52 -04:00
Ryan Petrello
69cf915a20 add rsyslogd block to the k8s supervisord config file 2020-04-13 20:25:53 -04:00
Ryan Petrello
9440785bdd properly set the group on the rsyslog config 2020-04-13 19:46:34 -04:00
Christian Adams
ca7c840d8c Fix permissions on rsyslog.conf for k8s 2020-04-13 19:33:23 -04:00
softwarefactory-project-zuul[bot]
f85bcae89f Merge pull request #6685 from marshmalien/fix-user-loading
Fix route bug in User view

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 20:00:25 +00:00
Christian Adams
a0e31b9c01 Map logging timeout value to healthchecktimeout for http in rsyslog config 2020-04-13 15:22:16 -04:00
softwarefactory-project-zuul[bot]
c414fd68a0 Merge pull request #6176 from Ladas/send_also_workflows_as_part_of_unified_jobs
Send also workflows as part of unified jobs and send all changes to jobs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 18:41:36 +00:00
softwarefactory-project-zuul[bot]
2830cdfdeb Merge pull request #6668 from nixocio/ui_refactor_users_functional
Modify Users component to be function based

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 18:35:57 +00:00
softwarefactory-project-zuul[bot]
07e9b46643 Merge pull request #6656 from jlmitch5/withoutWithRouter
excise withRouter from function components

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 18:35:53 +00:00
softwarefactory-project-zuul[bot]
1f01521213 Merge pull request #6651 from nixocio/ui_issue_5820
Rename SCM to Source Control

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 18:35:46 +00:00
Marliana Lara
8587461ac9 Fix loading bug in User view 2020-04-13 14:19:16 -04:00
nixocio
e54e5280f2 Modify Users component to be function based
Modify Users component to be function based.
2020-04-13 13:43:22 -04:00
softwarefactory-project-zuul[bot]
516a44ce73 Merge pull request #6662 from keithjgrant/5909-jt-launch-prompt-2
JT Launch Prompting (phase 2)

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 17:04:41 +00:00
Ryan Petrello
e52cebc28e rsyslogd: use %rawmsg-after-pri% instead of %msg%
after some prolonged RFC reading and tinkering w/ rsyslogd...

cpython's SysLogHandler doesn't emit RFC3164 formatted messages
in the format you'd expect; it's missing the ISO date, hostname, etc...
along with other header values; the handler implementation relies on you
to specify a syslog-like formatter (we've replaced all of this with our
own *custom* logstash-esque formatter that effectively outputs valid JSON
- without dates and other syslog header values prepended)

because of this unanticipated format, rsyslogd chokes when trying to
parse the message's parts;  AWX is emitting:

<priority>RAWJSON

...so the usage of `%msg%` isn't going to work for us, because rsyslog
tries to parse *all* of the possible headers (and yells, because it
can't find a date to parse):

see: https://www.rsyslog.com/files/temp/doc-indent/configuration/properties.html#message-properties

this is fine, because we don't *need* any of that message parsing
anyways; in the end, we're *just* interested in forwarding the raw
JSON/text content to the third party log handler
2020-04-13 11:44:00 -04:00
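A minimal sketch of the template shape this commit describes (`rawmsg-after-pri`, `imudp`, and `omfwd` are real rsyslog names; the port and forwarding target below are placeholder values, not AWX's actual config):

```
# Forward only the raw payload after the <PRI> part; using %msg% would
# make rsyslog attempt full RFC3164 header parsing, which fails on
# "<priority>RAWJSON" input.
template(name="awx" type="string" string="%rawmsg-after-pri%\n")
module(load="imudp")
input(type="imudp" port="51414")
action(type="omfwd" target="logstash.example.com" port="5140"
       protocol="tcp" template="awx")
```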
Ryan Petrello
bb5136cdae properly escape URL paths and querystrings for paths in logging settings 2020-04-13 11:44:00 -04:00
Ryan Petrello
b0db2b7bec add some exception handling for dealing with logging connection resets
when rsyslogd restarts due to config changes, there's a brief moment
where the socket will refuse connections on teardown; exception handling
is needed here to deal with that
2020-04-13 11:44:00 -04:00
Ryan Petrello
1000dc10fb add an rsyslogd config check to the logging test endpoint 2020-04-13 11:44:00 -04:00
Ryan Petrello
2a4b009f04 rsyslogd: use %rawmsg-after-pri% instead of %msg%
after some prolonged RFC reading and tinkering w/ rsyslogd...

cpython's SysLogHandler doesn't emit RFC3164 formatted messages
in the format you'd expect; it's missing the ISO date, hostname, etc...
along with other header values; the handler implementation relies on you
to specify a syslog-like formatter (we've replaced all of this with our
own *custom* logstash-esque formatter that effectively outputs valid JSON
- without dates and other syslog header values prepended)

because of this unanticipated format, rsyslogd chokes when trying to
parse the message's parts;  AWX is emitting:

<priority>RAWJSON

...so the usage of `%msg%` isn't going to work for us, because rsyslog
tries to parse *all* of the possible headers (and yells, because it
can't find a date to parse):

see: https://www.rsyslog.com/files/temp/doc-indent/configuration/properties.html#message-properties

this is fine, because we don't *need* any of that message parsing
anyways; in the end, we're *just* interested in forwarding the raw
JSON/text content to the third party log handler
2020-04-13 11:44:00 -04:00
Ryan Petrello
8cdd42307c clarify that logging username/password is only valid for HTTP/s 2020-04-13 11:44:00 -04:00
Ryan Petrello
269558876e only use a basic auth password for external logging if username is set 2020-04-13 11:44:00 -04:00
Ryan Petrello
bba680671b when writing the rsyslog config, do it post-commit
there's a race condition if we do this pre-commit where the correct
value isn't actually *persisted* to the database yet, and we end up
saving the *prior* setting values
2020-04-13 11:44:00 -04:00
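In Django terms this is what `transaction.on_commit` exists for; here is a dependency-free sketch of the ordering race (the `FakeTransaction` class is invented for illustration): writing the config during the transaction reads stale settings, deferring the write until after commit does not.

```python
# Dependency-free sketch of the race the commit fixes: if the rsyslog
# config is written *inside* the transaction, readers can still observe
# the prior setting values. Deferring the write until commit lands
# (Django's transaction.on_commit) avoids saving stale settings.
class FakeTransaction:
    def __init__(self):
        self.committed = False
        self._callbacks = []

    def on_commit(self, fn):
        self._callbacks.append(fn)

    def commit(self):
        self.committed = True
        for fn in self._callbacks:
            fn()

written = []
txn = FakeTransaction()
# Register the config write; it only runs once the commit has happened.
txn.on_commit(lambda: written.append(txn.committed))
txn.commit()
print(written)  # [True]: the writer observed the committed state
```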
Ryan Petrello
f70a76109c make rsyslog fall back to no-op if logging is disabled 2020-04-13 11:44:00 -04:00
Christian Adams
5d54877183 Add action to default rsyslog.conf so supervisor starts correctly the first time 2020-04-13 11:44:00 -04:00
Ryan Petrello
f7dac8e68d more external logging unit test fixups 2020-04-13 11:44:00 -04:00
Ryan Petrello
39648b4f0b fix up a few test and lint errors related to external logging 2020-04-13 11:44:00 -04:00
Christian Adams
b942fde59a Ensure log messages have valid json
- Fix messages getting concatenated at 8k
- Fix rsyslog cutting off the opening brace of log messages
- Make valid default conf and emit logs based on presence of .sock and
  settings
2020-04-13 11:44:00 -04:00
Ryan Petrello
ce82b87d9f rsyslog hardening (fixing a few weird things we noticed) 2020-04-13 11:44:00 -04:00
Christian Adams
70391f96ae Revert rsyslog valid config to one that fails intentionally 2020-04-13 11:43:59 -04:00
Christian Adams
2329c1b797 Add rsyslog config to container from file for consistency 2020-04-13 11:43:59 -04:00
Christian Adams
470159b4d7 Enable innocuous but valid config for rsyslog if disabled 2020-04-13 11:43:59 -04:00
Christian Adams
e740340793 ConfigMap rsyslog conf files for k8 2020-04-13 11:43:59 -04:00
Christian Adams
4d5507d344 Add default rsyslog.conf without including /etc/rsyslog.conf 2020-04-13 11:43:59 -04:00
Christian Adams
d350551547 Tweaks to Test Button logic and cleans up flake8 and test failures 2020-04-13 11:43:59 -04:00
Christian Adams
7fd79b8e54 Remove unneeded logging sock variable 2020-04-13 11:43:59 -04:00
John Mitchell
eb12f45e8e add ngToast disable on timeout for log agg notifications, and disable test button until active test completes. 2020-04-13 11:43:59 -04:00
Christian Adams
fb047b1267 Add unit tests for reconfiguring rsyslog & for test endpoint 2020-04-13 11:43:59 -04:00
Christian Adams
d31c528257 Fix Logging settings "Test" button functionality 2020-04-13 11:43:59 -04:00
Christian Adams
996d7ce054 Move supervisor and rsyslog sock files to their own dirs under /var/run 2020-04-13 11:43:59 -04:00
Christian Adams
7040fcfd88 Fix container rsyslog dir permissions 2020-04-13 11:43:59 -04:00
John Mitchell
88ca4b63e6 update configure tower in tower test ui for log aggregator form 2020-04-13 11:43:59 -04:00
Shane McDonald
c0af3c537b Configure rsyslog to listen over a unix domain socket instead of a port
- Add a placeholder rsyslog.conf so it doesn't fail on start
- Create access restricted directory for unix socket to be created in
- Create RSyslogHandler to exit early when logging socket doesn't exist
- Write updated logging settings when dispatcher comes up and restart rsyslog so they take effect
- Move rsyslogd to the web container and create rpc supervisor.sock
- Add env var for supervisor.conf path
2020-04-13 11:43:59 -04:00
Christian Adams
f8afae308a Add rsyslog to supervisor for the task container
- Add proper paths for rsyslog's supervisor logs
- Do not enable debug mode for rsyslogd
- Include system rsyslog.conf, and specify tower logging conf when
  starting rsyslog.
2020-04-13 11:43:59 -04:00
Christian Adams
4cd0d60711 Properly handle logger paths and https/http configuration
- log aggregator url paths were not being passed to rsyslog
- http log services like loggly will now truly use http and port 80
- add rsyslog.pid to .gitignore
2020-04-13 11:43:59 -04:00
Christian Adams
955d57bce6 Upstream rsyslog packaging changes
- add rsyslog repo to Dockerfile for AWX installation
- Update Library Notes for requests-futures removal
2020-04-13 11:43:59 -04:00
Ryan Petrello
589d27c88c POC: replace our external log aggregation feature with rsyslog
- this change adds rsyslog (https://github.com/rsyslog/rsyslog) as
  a new service that runs on every AWX node (managed by supervisord)
  in particular, this feature requires a recent version (v8.38+) of
  rsyslog that supports the omhttp module
  (https://github.com/rsyslog/rsyslog-doc/pull/750)
- the "external_logger" handler in AWX is now a SysLogHandler that ships
  logs to the local UDP port where rsyslog is configured to listen (by
  default, 51414)
- every time a LOG_AGGREGATOR_* setting is changed, every AWX node
  reconfigures and restarts its local instance of rsyslog so that its
forwarding settings match what has been configured in AWX
- unlike the prior implementation, if the external logging aggregator
  (splunk/logstash) goes temporarily offline, rsyslog will retain the
  messages and ship them when the log aggregator is back online
- 4xx or 5xx level errors are recorded at /var/log/tower/external.err
2020-04-13 11:43:59 -04:00
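The AWX side of this design is a stock `SysLogHandler` pointed at the local UDP port the commit says rsyslog listens on (51414 by default); the JSON format string below stands in for AWX's custom formatter and is an assumption.

```python
import logging
import logging.handlers

# Sketch of the "external_logger" handler described above: ship log
# records over local UDP to rsyslog, which handles forwarding/retry.
handler = logging.handlers.SysLogHandler(address=("localhost", 51414))
# Placeholder for AWX's custom logstash-esque JSON formatter.
handler.setFormatter(logging.Formatter('{"message": "%(message)s"}'))

logger = logging.getLogger("external_logger_sketch")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
# UDP is connectionless, so this emit succeeds even with no listener.
logger.info("hello from awx")
```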
softwarefactory-project-zuul[bot]
eafb751ecc Merge pull request #6679 from ryanpetrello/fix-6675
skip non-files when consuming events synced from isolated hosts

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 15:36:42 +00:00
softwarefactory-project-zuul[bot]
30ea66023f Merge pull request #6671 from wenottingham/even-moar-data-plz
Collect task timing, warnings, and deprecations from job events

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 15:06:46 +00:00
Ryan Petrello
9843e21632 skip non-files when consuming events synced from isolated hosts
see: https://github.com/ansible/awx/issues/6675
2020-04-13 10:14:10 -04:00
softwarefactory-project-zuul[bot]
6002beb231 Merge pull request #6677 from chrismeyersfsu/fix-spelling
fix spelling mistake in wsbroadcast status output

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-13 14:13:24 +00:00
chris meyers
9c6e42fd1b fix spelling mistake in wsbroadcast status output 2020-04-13 09:37:32 -04:00
softwarefactory-project-zuul[bot]
eeab4b90a5 Merge pull request #6568 from AlanCoding/whoops_not_changed
Do not set changed=True if the object did not change

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-10 00:16:02 +00:00
Keith Grant
7827a2aedd fix double-fetch of cred types in launch prompts 2020-04-09 16:07:06 -07:00
softwarefactory-project-zuul[bot]
a7f1a36ed8 Merge pull request #6670 from ryanpetrello/redis-fixup
work around redis connection failures in the callback receiver

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-09 21:41:08 +00:00
Bill Nottingham
d651786206 Collect task timing, warnings, and deprecations from job events
Timing information requires ansible-runner >= 1.4.6.
2020-04-09 17:27:19 -04:00
softwarefactory-project-zuul[bot]
19e4758be1 Merge pull request #6637 from john-westcott-iv/tower_workflow_job_lanch_update
Initial commit of tests for tower_workflow_launch

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-09 19:53:35 +00:00
softwarefactory-project-zuul[bot]
fe9de0d4cc Merge pull request #6658 from mabashian/6655-job-redirect
Fixes issue where job type redirects weren't firing correctly

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-09 19:47:20 +00:00
Ryan Petrello
80147acc1c work around redis connection failures in the callback receiver
if redis stops/starts, sometimes the callback receiver doesn't recover
without a restart; this fixes that
2020-04-09 15:38:03 -04:00
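The shape of such a workaround, sketched generically (AWX's real code targets the redis client; the names and the toy connection state below are invented): treat the connection error as transient, re-establish, and retry rather than dying.

```python
import time

# Generic retry-on-disconnect sketch: rebuild the connection and retry
# instead of requiring a full service restart.
def consume_with_retry(get_event, connect, retries=3, delay=0):
    for _ in range(retries):
        try:
            return get_event()
        except ConnectionError:
            time.sleep(delay)  # brief backoff before reconnecting
            connect()          # re-establish instead of crashing
    raise RuntimeError("gave up after %d attempts" % retries)

state = {"up": False}

def connect():
    state["up"] = True

def get_event():
    if not state["up"]:
        raise ConnectionError("redis went away")
    return {"event": "job_status"}

print(consume_with_retry(get_event, connect))
```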
beeankha
4acdf8584b Update workflow_launch module and test playbook 2020-04-09 15:12:49 -04:00
beeankha
cf607691ac Pass data and errors more clearly, change extra_vars to be a dict, update test playbook to have a task utilizing extra_vars 2020-04-09 12:40:13 -04:00
beeankha
d7adcfb119 Revert unnecessary changes made to test playbook during rebase 2020-04-09 12:38:06 -04:00
beeankha
97d26728e4 Fix one more linter issue 2020-04-09 12:38:06 -04:00
John Westcott IV
6403895eae Putting tasks back to natural order 2020-04-09 12:38:06 -04:00
beeankha
8b26ff1fe6 Fix linter errors 2020-04-09 12:38:06 -04:00
beeankha
9ddd020348 Fix sanity tests and edit test playbook 2020-04-09 12:38:06 -04:00
John Westcott IV
a2d1c32da3 Initial commit of tests for tower_workflow_launch 2020-04-09 12:38:06 -04:00
Keith Grant
af18aa8456 restructure 'if's in LaunchPrompt 2020-04-09 08:58:12 -07:00
mabashian
188b23e88f No need to pass undefined explicitly. view will be undefined if it's not passed 2020-04-09 10:28:25 -04:00
mabashian
63bed7a30d Fixes issue where job type redirects weren't firing correctly 2020-04-09 10:28:25 -04:00
AlanCoding
fd93964953 Changed status tweaks for API validation and encryption
API validation topic:
 - do not set changed=True if the object did not actually change
 - deals with cases where API manipulates data before saving

Warn if encrypted data prevent accurate changed status

Handle false changed case of tower_user password
  password field not present in data

Test changed=True warning with JT/WFJT survey spec defaults
  case for list data in JSON
2020-04-09 09:58:12 -04:00
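The changed-status rule being tightened above can be sketched in a few lines (the function is hypothetical, not the collection's real module_utils): compare only the fields the caller actually sent, after normalization, instead of assuming every update is a change.

```python
# Hypothetical sketch: report changed=True only when a requested field
# actually differs from the existing object, mirroring the fix's intent.
def is_changed(existing, requested):
    return any(existing.get(k) != v for k, v in requested.items())

existing = {"name": "demo", "description": "", "survey": [1, 2]}
assert not is_changed(existing, {"name": "demo"})        # no-op update
assert is_changed(existing, {"description": "updated"})  # real change
print("changed-state checks pass")
```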
chris meyers
1f9f86974a test analytics table output
* unified_jobs output should include derived jobs i.e. project update,
inventory update, job
* This PR adds a test to ensure that.
2020-04-09 15:20:27 +02:00
Ladislav Smola
6a86af5b43 Use indexed timestamps
Use created and finished, which are indexed, to try to fetch all
states of jobs. If a job is not finished, we might not get the
right terminal status, but that should be OK for now.
2020-04-09 15:20:27 +02:00
Ladislav Smola
6a503e152a Send also workflows as part of unified jobs
Workflows do not have a record in main_job, therefore the JOIN
was ignoring them. We need a LEFT JOIN to include workflows as
well.

It also seems like we are not able to get a link to organizations
from workflows? When looking at:
<tower_url>#/organizations?organization_search=page_size:20;order_by:name

We don't seem to list a relation to workflows. Is it possible to get it from
somewhere?
2020-04-09 15:20:27 +02:00
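The JOIN vs LEFT JOIN point is easy to demonstrate with an in-memory SQLite database (the schema and rows below are invented for the demo, not AWX's real tables): a row with no match in the joined table is silently dropped by an INNER JOIN.

```python
import sqlite3

# Workflow jobs have no row in main_job, so an INNER JOIN drops them;
# a LEFT JOIN keeps them with NULLs for the missing side.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE unified (id INTEGER, name TEXT);
    CREATE TABLE main_job (unified_id INTEGER, playbook TEXT);
    INSERT INTO unified VALUES (1, 'run job'), (2, 'workflow');
    INSERT INTO main_job VALUES (1, 'site.yml');  -- no row for id=2
""")
inner = con.execute(
    "SELECT u.name FROM unified u JOIN main_job j ON j.unified_id = u.id"
).fetchall()
left = con.execute(
    "SELECT u.name FROM unified u LEFT JOIN main_job j ON j.unified_id = u.id"
).fetchall()
print(inner)  # the workflow row is silently dropped
print(left)   # both rows survive
con.close()
```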
Ladislav Smola
b7227113be Use modified to check if job should be sent to analytics
It can take several hours for a job to go from pending to
successful/failed state and we need to also send the job with
a changed state, otherwise the analytics will be incorrect.
2020-04-09 15:20:27 +02:00
softwarefactory-project-zuul[bot]
907da2ae61 Merge pull request #6660 from mabashian/6606-jt-launch
Pass empty params to launch endpoint rather than null

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 23:46:48 +00:00
Keith Grant
6f76b15d92 fix LaunchButton tests 2020-04-08 15:36:45 -07:00
mabashian
9d6fbd6c78 Updates launch button tests to reflect passing empty object rather than null for launch payload without prompts 2020-04-08 16:10:02 -04:00
mabashian
edb4dac652 Pass empty params to launch endpoint rather than null to alleviate 400 error when launching a JT with default creds. 2020-04-08 16:10:02 -04:00
Keith Grant
42898b94e2 add more prompt tests 2020-04-08 11:48:11 -07:00
softwarefactory-project-zuul[bot]
943543354a Merge pull request #6643 from mabashian/upgrade-pf-2.39.15
Upgrades pf deps to latest

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 18:11:04 +00:00
softwarefactory-project-zuul[bot]
2da22ccd8a Merge pull request #6659 from shanemcd/pre-tty
Enable tty in dev container

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 16:35:37 +00:00
Keith Grant
9cab5a5046 add 'other prompt' fields to launch API call 2020-04-08 08:58:14 -07:00
softwarefactory-project-zuul[bot]
e270a692b7 Merge pull request #6644 from jakemcdermott/6638-fix-initial-playbook-value
Only clear playbook when different project is selected

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 15:45:27 +00:00
Shane McDonald
677a8dae7b Enable tty in dev container
Pretty colors and real-time migration logs
2020-04-08 11:43:30 -04:00
John Mitchell
6eeb32a447 excise withRouter from function components 2020-04-08 10:59:57 -04:00
softwarefactory-project-zuul[bot]
e57991d498 Merge pull request #6652 from matburt/update_zome_docz
Update some contributing docs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 14:58:40 +00:00
softwarefactory-project-zuul[bot]
4242bd55c2 Merge pull request #6639 from mabashian/route-render-prop
Converts most of our route render prop usage to children

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 14:49:42 +00:00
softwarefactory-project-zuul[bot]
e8fb466f0f Merge pull request #6646 from beeankha/credential_module_no_log
Activate no_log for Values in input Parameter of tower_credential Module

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 14:18:08 +00:00
nixocio
869fcbf483 Rename SCM to Source Control
Rename `SCM` references to `Source Control`.
Also update tests to reflect this change.

closes: https://github.com/ansible/awx/issues/5820
2020-04-08 10:10:07 -04:00
Matthew Jones
6abeaf2c55 Update some contributing docs
* Update the tools called in the dev environment
* More RMQ purges from architecture docs
* Remove the old clusterdev target
2020-04-08 10:03:22 -04:00
mabashian
f734918d3e Removes withRouter from breadcrumbs in favor of hooks 2020-04-08 09:48:16 -04:00
softwarefactory-project-zuul[bot]
91f2e0c32b Merge pull request #6605 from ansible/firehose_pkey
update firehose script for bigint migration

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 13:33:19 +00:00
softwarefactory-project-zuul[bot]
88d6dd96fa Merge pull request #6645 from ryanpetrello/some-more-iso-cleanup
more ansible runner isolated cleanup

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 13:19:06 +00:00
mabashian
7feac5ecd6 Updates routes in breadcrumbs so they no longer use the render prop 2020-04-08 09:17:46 -04:00
mabashian
193ec21149 Converts most of our route render prop usage to children 2020-04-08 09:17:46 -04:00
mabashian
14e62057da Fix linting error by not using index in key 2020-04-08 09:12:32 -04:00
softwarefactory-project-zuul[bot]
a26c0dfb8a Merge pull request #6629 from AlanCoding/one_token_to_rule_them_all
Document and align the env var for OAuth token

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-08 06:09:29 +00:00
Ryan Petrello
6b4219badb more ansible runner isolated cleanup
follow-up to https://github.com/ansible/awx/pull/6296
2020-04-08 01:18:05 -04:00
beeankha
1f598e1b12 Activate no_log for values in input parameter 2020-04-07 20:34:54 -04:00
softwarefactory-project-zuul[bot]
7ddd4d74c0 Merge pull request #6625 from marshmalien/jt-form-bugs
Fix JT form playbook select error message and more 

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 22:55:30 +00:00
softwarefactory-project-zuul[bot]
6ad6f48ff0 Merge pull request #6642 from jakemcdermott/update-contrib-doc
Add note to docs about async form behavior

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 22:32:48 +00:00
Jake McDermott
d736adbedc Only clear playbook when different project is selected 2020-04-07 18:12:48 -04:00
mabashian
c881762c97 Upgrades pf deps to latest 2020-04-07 18:07:47 -04:00
Jake McDermott
be5d067148 Add note to docs about async form behavior 2020-04-07 17:30:03 -04:00
Marliana Lara
189a10e35a Fix playbook error message and JT save bug 2020-04-07 17:01:53 -04:00
softwarefactory-project-zuul[bot]
285e9c2f62 Merge pull request #6635 from AlanCoding/no_tower_cli
Remove tower-cli from Zuul CI for AWX collection

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 20:46:45 +00:00
softwarefactory-project-zuul[bot]
054de87f8e Merge pull request #6601 from shanemcd/dont-delete-my-db
Use a docker volume for the dev env db

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 20:00:57 +00:00
softwarefactory-project-zuul[bot]
7de8a8700c Merge pull request #6487 from lj020326/devel
fix for CSRF issue in traefik configuration 

Reviewed-by: Shane McDonald <me@shanemcd.com>
             https://github.com/shanemcd
2020-04-07 20:00:51 +00:00
softwarefactory-project-zuul[bot]
4f7669dec1 Merge pull request #6634 from AlanCoding/silence
Silence deprecation warnings from tower_credential

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 19:30:45 +00:00
softwarefactory-project-zuul[bot]
25a1bc7a33 Merge pull request #6631 from ryanpetrello/refresh-token-expiry
properly respect REFRESH_TOKEN_EXPIRE_SECONDS when generating new tokens

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 18:28:26 +00:00
softwarefactory-project-zuul[bot]
955ef3e9cb Merge pull request #6541 from AlanCoding/jt_org_left_behind
Fix RBAC loose items from reversed decision on JT org permissions

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 17:41:31 +00:00
AlanCoding
0e8f2307fc Remove tower-cli from Zuul CI for AWX collection 2020-04-07 13:31:06 -04:00
AlanCoding
bcfd2d6aa4 Silence deprecation warnings from tower_credential 2020-04-07 13:24:34 -04:00
Shane McDonald
7e52f4682c Use a docker volume for the dev env db 2020-04-07 13:14:19 -04:00
Keith Grant
9c218fa5f5 flush out prompt misc fields 2020-04-07 09:41:45 -07:00
softwarefactory-project-zuul[bot]
508aed67de Merge pull request #6624 from marshmalien/6608-project-lookup-bug
Prevent project lookup from firing requests on every render

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 15:53:07 +00:00
Ryan Petrello
0bf1116ef8 properly respect REFRESH_TOKEN_EXPIRE_SECONDS when generating new tokens
see: https://github.com/ansible/awx/issues/6630
see: https://github.com/jazzband/django-oauth-toolkit/issues/746
2020-04-07 11:34:01 -04:00
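The fix above concerns refresh-token lifetime. The intended behavior can be sketched as follows — a minimal illustration of what "respecting REFRESH_TOKEN_EXPIRE_SECONDS" means, with all names illustrative rather than AWX's actual implementation:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: a refresh token should be rejected once
# REFRESH_TOKEN_EXPIRE_SECONDS have elapsed since it was issued.
REFRESH_TOKEN_EXPIRE_SECONDS = 3600

def refresh_token_expired(issued_at, now, expire_seconds=REFRESH_TOKEN_EXPIRE_SECONDS):
    """Return True if a refresh token issued at `issued_at` is stale at `now`."""
    return now - issued_at > timedelta(seconds=expire_seconds)

issued = datetime(2020, 4, 7, 12, 0, 0)
assert not refresh_token_expired(issued, issued + timedelta(minutes=30))
assert refresh_token_expired(issued, issued + timedelta(hours=2))
```

The bug referenced in the linked issues was that newly generated tokens did not honor this configured window.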
AlanCoding
45df5ba9c4 Manually document tower host default 2020-04-07 10:18:55 -04:00
AlanCoding
b90a296d41 Document and align the env var for OAuth token 2020-04-07 10:00:02 -04:00
softwarefactory-project-zuul[bot]
d40143a63d Merge pull request #6607 from ryanpetrello/graphite-no-tags
don't send tags to the Grafana annotations API if none are specified

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 06:07:58 +00:00
softwarefactory-project-zuul[bot]
db40d550be Merge pull request #6472 from AlanCoding/no_required
Remove field properties which are default values anyway

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 02:57:19 +00:00
AlanCoding
da661e45ae Remove unnecessary module parameters
remove cases of required=False, the default
remove the str type specifier, which is the default
remove supports_check_mode, which is not changeable
2020-04-06 22:08:41 -04:00
softwarefactory-project-zuul[bot]
58160b9eb4 Merge pull request #6623 from nixocio/ui_issue_6133
Update "Enable Webhooks" option in WFJT

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-07 01:05:03 +00:00
softwarefactory-project-zuul[bot]
05b28efd9c Merge pull request #6617 from chrismeyersfsu/fix-memcached
fix memcached in dev env

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 23:49:04 +00:00
softwarefactory-project-zuul[bot]
0b433ebb1c Merge pull request #6609 from beeankha/wfjt_module_inventory_fix
Resolve Name to ID Properly in WFJT Module

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 23:42:34 +00:00
softwarefactory-project-zuul[bot]
5b3f5bf37d Merge pull request #6559 from jlmitch5/newNewAssocDisassocHostGroupsList
Association and disassociation of host groups and inventory host groups lists

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 23:07:19 +00:00
softwarefactory-project-zuul[bot]
397c0092a0 Merge pull request #6569 from ryanpetrello/log-decimal
properly serialize external logs that contain decimal.Decimal objects

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 23:07:15 +00:00
softwarefactory-project-zuul[bot]
362fdaeecc Merge pull request #6604 from jakemcdermott/remove-state-checks-from-user-test
Remove state checks from user list test

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 23:07:11 +00:00
softwarefactory-project-zuul[bot]
606c3c3595 Merge pull request #6338 from rooftopcellist/update_logstash_docs
Update logstash docs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 21:10:44 +00:00
softwarefactory-project-zuul[bot]
42705c9eb0 Merge pull request #6545 from fosterseth/fix-4198-readd-user-to-org
Fix adding orphaned user to org

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 21:10:38 +00:00
Marliana Lara
c2ba495824 Prevent project lookup from firing requests on every render 2020-04-06 16:50:10 -04:00
nixocio
85a1c88653 Update "Enable Webhooks" option in WFJT
closes: https://github.com/ansible/awx/issues/6163
2020-04-06 16:48:18 -04:00
chris meyers
c4d704bee1 fix memcached in dev env
* create memcached dir via git so that the current user owns it.
Otherwise, docker will create the dir as root at runtime
2020-04-06 16:35:52 -04:00
softwarefactory-project-zuul[bot]
60d499e11c Merge pull request #6495 from john-westcott-iv/tower_credential_update_new
Initial conversion of tower_credential

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 20:29:44 +00:00
softwarefactory-project-zuul[bot]
bb48ef40be Merge pull request #6595 from nixocio/ui_docs_minor_update
Minor update UI docs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 20:26:09 +00:00
Ryan Petrello
771ca2400a don't send tags to the Grafana annotations API if none are specified
see: https://github.com/ansible/awx/issues/6580
2020-04-06 15:47:48 -04:00
softwarefactory-project-zuul[bot]
735d44816b Merge pull request #6592 from kdelee/awxkit_wfjtn_identifier
make awxkit pass through identifier for wfjtn

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 19:42:10 +00:00
beeankha
e346493921 Add inventory param to the wfjt module test playbook 2020-04-06 15:21:57 -04:00
beeankha
bd39fab17a Resolve name to ID correctly in workflow jt module 2020-04-06 15:08:01 -04:00
John Mitchell
ce30594b30 update inventory and host routes to being child-based instead of render prop based 2020-04-06 15:05:11 -04:00
John Mitchell
2021c2a596 remove unnecessary eslint ignore comments, replace react router use with hooks where possible in inventories 2020-04-06 14:38:33 -04:00
John Mitchell
ecd1d09c9a add breadcrumb config for inv host facts and groups 2020-04-06 14:38:33 -04:00
John Mitchell
7dbde8d82c fix linting errors and add note to host groups disassociation modal 2020-04-06 14:38:33 -04:00
John Mitchell
4e64b17712 update hosts groups api GET to all_groups 2020-04-06 14:38:33 -04:00
John Mitchell
cc4c514103 add association and disassociation of groups on invhostgroups/hostgroups lists 2020-04-06 14:38:33 -04:00
John Mitchell
ab8726dafa move associate modal and disassociate button up to components for use across screens 2020-04-06 14:38:33 -04:00
Ryan Petrello
2cefba6f96 properly serialize external logs that contain decimal.Decimal objects 2020-04-06 14:24:24 -04:00
softwarefactory-project-zuul[bot]
592043fa70 Merge pull request #6588 from ryanpetrello/400-error-creds
fix a typo in the credentials UI

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 18:00:43 +00:00
Mat Wilson
59477aa221 update firehose script for bigint migration 2020-04-06 10:54:08 -07:00
Jake McDermott
279fe53837 Remove state checks from user list test
Don't check for loading in UserList test
2020-04-06 13:40:31 -04:00
Shane McDonald
bb319136e4 Merge pull request #6585 from shanemcd/cleanup-cleanup
Tidy up the dev environment a bit
2020-04-06 13:09:39 -04:00
beeankha
b0f68d97da Update comment in test playbook 2020-04-06 12:38:46 -04:00
softwarefactory-project-zuul[bot]
a46462eede Merge pull request #6526 from chrismeyersfsu/feature-memcached_socket_devel
Feature memcached socket devel

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 16:24:42 +00:00
softwarefactory-project-zuul[bot]
646e403fbd Merge pull request #6570 from marshmalien/6530-wf-node-inv-src-details
Add Inventory Source workflow node prompt details

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 16:10:33 +00:00
nixocio
64c846cfc1 Minor update UI docs
Fix typos
Highlight code sections
2020-04-06 11:36:41 -04:00
Elijah DeLee
8e07269738 make awxkit pass through identifier for wfjtn
We need this to be able to create workflow job template nodes with an identifier
2020-04-06 11:26:56 -04:00
Shane McDonald
6fc815937b Tidy up the dev environment a bit 2020-04-06 11:13:51 -04:00
Ryan Petrello
014c995a8f fix a typo in the credentials UI
this is causing 400 level errors for some users
2020-04-06 10:45:33 -04:00
John Westcott IV
c1bb62cc36 Removing recursive check, allowing old pattern to commence 2020-04-06 10:11:18 -04:00
beeankha
f5cf7c204f Update unit test, edit credential module to pass sanity tests 2020-04-06 10:11:18 -04:00
John Westcott IV
6d08e21511 Resolving comment and updating tests 2020-04-06 10:11:18 -04:00
John Westcott IV
8b881d195d Change lookup to include organization 2020-04-06 10:11:18 -04:00
John Westcott IV
5c9ff51248 Change compare_fields to static method 2020-04-06 10:11:18 -04:00
AlanCoding
3f64768ba8 loosen some credential test assertions 2020-04-06 10:11:18 -04:00
John Westcott IV
fd24918ba8 Initial conversion of tower_credential 2020-04-06 10:11:18 -04:00
softwarefactory-project-zuul[bot]
f04e7067e8 Merge pull request #6582 from chrismeyersfsu/fix-redis_startup
align with openshift

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-06 13:52:08 +00:00
softwarefactory-project-zuul[bot]
9a91c0bfb2 Merge pull request #6572 from AlanCoding/approval_identifier
Allow setting identifier for approval nodes

Reviewed-by: Bianca Henderson <beeankha@gmail.com>
             https://github.com/beeankha
2020-04-06 13:39:39 +00:00
chris meyers
c06188da56 align with openshift 2020-04-06 09:16:46 -04:00
chris meyers
7433aab258 switch memcached from tcp to unix domain socket 2020-04-06 08:35:12 -04:00
chris meyers
37a715c680 use memcached unix domain socket rather than tcp 2020-04-06 08:35:12 -04:00
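The two commits above switch memcached access from TCP to a unix domain socket. In Django terms, that kind of change is a one-line cache-backend config edit — a hypothetical sketch, with an illustrative socket path rather than AWX's actual location:

```python
# Hypothetical Django settings fragment mirroring a tcp -> unix-socket
# switch for memcached. The "unix:" LOCATION prefix is the standard
# python-memcached convention; the path below is illustrative only.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        # was: 'LOCATION': '127.0.0.1:11211'
        'LOCATION': 'unix:/var/run/memcached/memcached.sock',
    }
}
```

A unix socket avoids the TCP stack entirely and lets filesystem permissions control access, which pairs with the later commit that pre-creates the socket directory so it is not owned by root.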
chris meyers
3d9eb3b600 align with openshift 2020-04-05 20:07:15 -04:00
softwarefactory-project-zuul[bot]
99511de728 Merge pull request #6554 from wenottingham/this-may-be-what-alan-suggested
Allow disassociating orphaned users from credentials

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 21:26:49 +00:00
softwarefactory-project-zuul[bot]
82b1b85fa4 Merge pull request #6421 from AlexSCorey/6183-SurveyPreview
Adds Survey Preview Functionality

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 21:20:41 +00:00
softwarefactory-project-zuul[bot]
2aa29420ee Merge pull request #6565 from chrismeyersfsu/fix-schema_workflow_identifier
static identifier in OPTIONS response for workflow job template node

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 21:20:37 +00:00
softwarefactory-project-zuul[bot]
9e331fe029 Merge pull request #6567 from mabashian/6531-approval-drawer-item-id
Adds workflow job id to header of approval drawer items

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 21:14:53 +00:00
softwarefactory-project-zuul[bot]
591cdb6015 Merge pull request #6566 from mabashian/4227-wf-template-row-rbac
Fix bug where JT is disabled in workflow node form for user with execute permissions on said JT

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 20:56:25 +00:00
softwarefactory-project-zuul[bot]
bc244b3600 Merge pull request #6564 from dsesami/column-type-name-change
Changed column label for plain jobs to "Playbook Run" to align with search

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 20:38:20 +00:00
AlanCoding
dbe3863b04 Allow setting identifier for approval nodes 2020-04-03 15:33:57 -04:00
Marliana Lara
ae021c37e3 Add inventory source prompt details 2020-04-03 14:56:20 -04:00
Keith Grant
8baa9d8458 clean up launch prompt credentials, display errors 2020-04-03 11:47:06 -07:00
Daniel Sami
3c888475a5 Changed displayed type name of plain jobs
updated and added i18n

removed import

prettier
2020-04-03 14:35:09 -04:00
softwarefactory-project-zuul[bot]
29b567d6e1 Merge pull request #6550 from ryanpetrello/fix-minutely-hourly
remove the limitation on (very) old DTSTART values for schedules

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 18:32:15 +00:00
softwarefactory-project-zuul[bot]
00aa1ad295 Merge pull request #6553 from ryanpetrello/remove-manual-inv-source-for-good
remove deprecated manual inventory source support

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 18:09:36 +00:00
Bill Nottingham
4f3213715e Allow disassociating any user from a credential role.
The previous behavior prevented removing roles from users who are no longer in the organization.
2020-04-03 13:39:28 -04:00
mabashian
0389e72197 Adds workflow job id to approval header link to match up with what's displayed on the jobs list 2020-04-03 13:39:06 -04:00
mabashian
0732795ecc Rows in the wfjt node form templates list should only be disabled if the user cannot start the job. 2020-04-03 13:27:28 -04:00
chris meyers
a26df3135b static identifier in docs
* The OPTIONS response description for the workflow job template node
identifier was an ever-changing uuid4(). This told the user the wrong
thing: we cannot know in the docs what the uuid4() will be. Instead,
the OPTIONS response description now tells the user the form that the
uuid4() takes, i.e. xxx-xxxx...
* Note that the API browser still populates a uuid4 for the user when it
generates the sample POST data. This is nice.
2020-04-03 13:12:49 -04:00
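The point of the docs change above is that only the *shape* of the default identifier can be documented, never a concrete value. A small sketch (the regex is a standard uuid4 pattern, not AWX code):

```python
import re
import uuid

# A fresh uuid4() is generated per request, so docs can only promise the
# 8-4-4-4-12 lowercase-hex shape with version nibble '4' and variant 8/9/a/b.
UUID4_SHAPE = re.compile(
    r"^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$"
)

identifier = str(uuid.uuid4())  # different on every call
assert UUID4_SHAPE.match(identifier)
```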
softwarefactory-project-zuul[bot]
a904aea519 Merge pull request #6551 from chrismeyersfsu/fix-nonce_replay_timestamp
simplify nonce creation and extraction

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 16:32:09 +00:00
Ryan Petrello
6bd5053ae8 remove the limitation on (very) old DTSTART values for schedules 2020-04-03 10:59:35 -04:00
Ryan Petrello
8b00b8c9c2 remove deprecated legacy manual inventory source support
see: https://github.com/ansible/awx/issues/6309
2020-04-03 10:54:43 -04:00
softwarefactory-project-zuul[bot]
2b9acd78c8 Merge pull request #6522 from chrismeyersfsu/feature-wsbroadcast_status
add broadcast websocket status command

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 02:25:25 +00:00
chris meyers
d7f0642f48 add ws broadcast status to sos report 2020-04-02 21:46:12 -04:00
chris meyers
8bbae0cc3a color output of ws broadcast connection status 2020-04-02 21:46:12 -04:00
chris meyers
c00f1505d7 add broadcast websocket status command 2020-04-02 21:46:12 -04:00
softwarefactory-project-zuul[bot]
a08e6691fb Merge pull request #6266 from rooftopcellist/configmap_container_files
ConfigMap supervisor configs and launch scripts for k8s

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 01:07:27 +00:00
softwarefactory-project-zuul[bot]
98bc499498 Merge pull request #6468 from jlmitch5/hostGroupsList
add inventory host groups list and host groups lists

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-03 00:46:33 +00:00
chris meyers
6d0c42a91a align with configmap changes 2020-04-02 20:05:26 -04:00
chris meyers
79c5a62279 simplify nonce creation and extraction
* the time() library also supports leap seconds
2020-04-02 19:57:50 -04:00
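A timestamp-prefixed nonce of the kind the commit above simplifies can be sketched like this — the field layout and names are illustrative, not AWX's actual wire format:

```python
import secrets
import time

# Hypothetical sketch of nonce creation/extraction for replay protection:
# a unix timestamp prefix lets the receiver reject stale nonces.
def make_nonce():
    return "%d:%s" % (int(time.time()), secrets.token_hex(8))

def nonce_is_fresh(nonce, max_age_seconds=300, now=None):
    timestamp = int(nonce.split(":", 1)[0])
    now = int(time.time()) if now is None else now
    return 0 <= now - timestamp <= max_age_seconds

nonce = make_nonce()
assert nonce_is_fresh(nonce)
assert not nonce_is_fresh("0:deadbeef")  # epoch-zero timestamp: long expired
```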
softwarefactory-project-zuul[bot]
3bb671f3f2 Merge pull request #6497 from john-westcott-iv/tower_notification_update
Initial conversion of tower_notification

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 23:29:57 +00:00
Keith Grant
0b9c5c410a add credential select list to launch CredentialsStep 2020-04-02 16:29:40 -07:00
softwarefactory-project-zuul[bot]
d77d5a7734 Merge pull request #6548 from marshmalien/5636-translate-login
Mark login button for translation

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 23:20:37 +00:00
Keith Grant
0a00a3104a add CredentialTypesAPI.loadAllTypes helper function 2020-04-02 13:55:30 -07:00
John Mitchell
ab36129395 add inventory host groups list and host groups lists 2020-04-02 15:02:41 -04:00
AlanCoding
e99500cf16 Mark test as xfail, move to unit testing 2020-04-02 14:48:33 -04:00
softwarefactory-project-zuul[bot]
299497ea12 Merge pull request #6490 from marshmalien/5997-wf-view-node
Hook up view node button in workflow visualizer

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 18:27:09 +00:00
Seth Foster
843c22c6b1 Allow orphaned user to be added to org
Fixed a bug where an org admin was not able to add
an orphaned user to the org, in the case where the
orphan had an ancestor role that matched one of the
roles of the org admin.

scenario to fix -- sue is a member of cred1, where cred1 is
part of org1. The org1 admin cannot add sue to org1, because
sue's cred1 role has an ancestor of the org1 role. The org1
admin cannot change or attach sue to org1.

tower issue #4198 and #4197
2020-04-02 14:24:55 -04:00
Marliana Lara
86b49b6fe2 Mark login button for translation 2020-04-02 14:19:13 -04:00
Christian Adams
9489f00ca4 Align k8 and ocp supervisor scripts
- Handle scl enable calls for python processes that use postgresql
- Handle ocp-specific vars better
2020-04-02 13:56:33 -04:00
chris meyers
6d60e7dadc align with openshift 2020-04-02 13:56:33 -04:00
Christian Adams
346b9b9e3e ConfigMap supervisor configs and launch scripts for k8s 2020-04-02 13:56:33 -04:00
softwarefactory-project-zuul[bot]
99384b1db9 Merge pull request #6506 from shanemcd/stateless-set
Switch from StatefulSet to Deployment

Reviewed-by: Matthew Jones <mat@matburt.net>
             https://github.com/matburt
2020-04-02 17:51:25 +00:00
Marliana Lara
d1b5a60bb9 Add project node details 2020-04-02 13:09:24 -04:00
Shane McDonald
d57258878d Update more references to statefulset 2020-04-02 12:44:26 -04:00
softwarefactory-project-zuul[bot]
48414f6dab Merge pull request #6542 from chrismeyersfsu/fix-align_redis
Fix align redis

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 16:22:31 +00:00
Shane McDonald
ff0186f72b Delete k8s StatefulSet if it exists (for upgrades) 2020-04-02 12:21:35 -04:00
softwarefactory-project-zuul[bot]
a682565758 Merge pull request #6385 from AlexSCorey/6317-ConvertJTFormstoFormikHooks
Uses formik hooks for JT Form

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 15:52:35 +00:00
softwarefactory-project-zuul[bot]
0dee2e5973 Merge pull request #6482 from AlexSCorey/5901-SupportForWFJTSurvey
Adds Survey Functionality to WFJT

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 15:19:21 +00:00
chris meyers
929f4bfb81 start redis container with conf file 2020-04-02 11:13:35 -04:00
AlanCoding
ac474e2108 Fix RBAC loose items from reversed decision on JT org permissions 2020-04-02 10:17:04 -04:00
Alex Corey
d6722c2106 Adds tests for Survey Preview functionality 2020-04-02 10:02:35 -04:00
Alex Corey
6eef0b82bd Adds Survey Preview Functionality 2020-04-02 10:02:35 -04:00
Alex Corey
fb4343d75e Removes unnecessary formikContext items in favor of useField.
Removes the OrgId value from formik and gets that value from the project field.
Updates tests and type.js to reflect those changes.
2020-04-02 09:31:35 -04:00
Alex Corey
a867a32b4e Uses formik hooks for JT Form 2020-04-02 09:30:12 -04:00
Shane McDonald
3060505110 Switch from StatefulSet to Deployment
We can do this now that we dropped RabbitMQ.
2020-04-02 09:24:49 -04:00
beeankha
5d68f796aa Rebase + fix typos 2020-04-02 09:21:33 -04:00
AlanCoding
15036ff970 Add unit tests for notification module 2020-04-02 09:14:50 -04:00
John Westcott IV
32783f7aaf Fixing linting errors 2020-04-02 09:14:50 -04:00
John Westcott IV
8699a8fbc2 Resolving comments on PR
Made notification type optional

Fixed examples to use notification_configuration

Fixed defaults for headers to prevent deprecation warning

Removed default on messages
2020-04-02 09:14:49 -04:00
John Westcott IV
b4cde80fa9 Updating example to match test 2020-04-02 09:14:49 -04:00
John Westcott IV
eb4db4ed43 Adding field change to readme and example and test of custom messages 2020-04-02 09:14:49 -04:00
John Westcott IV
649aafb454 Initial conversion of tower_notification 2020-04-02 09:14:49 -04:00
softwarefactory-project-zuul[bot]
b6c272e946 Merge pull request #6525 from ryanpetrello/bye-bye-activity-stream-middleware
get rid of the activity stream middleware

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-02 05:25:24 +00:00
Ryan Petrello
9fe2211f82 get rid of the activity stream middleware
it has bugs and is very confusing

see: https://github.com/ansible/tower/issues/4037
2020-04-01 16:02:42 -04:00
Marliana Lara
4704e24c24 Fetch full resource object and replace the matching node 2020-04-01 15:21:42 -04:00
softwarefactory-project-zuul[bot]
e5f293ce52 Merge pull request #6486 from keithjgrant/5909-jt-launch-prompt
JT Launch Prompting (phase 1)

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-01 18:29:27 +00:00
softwarefactory-project-zuul[bot]
d64b898390 Merge pull request #6491 from john-westcott-iv/second_tower_job_template_update
Second attempt at converting tower_job_template

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-01 14:09:22 +00:00
softwarefactory-project-zuul[bot]
498c525b34 Merge pull request #6513 from SebastianThorn/devel
[DOC] Adds comment about needing to be a pem-file

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-01 13:22:03 +00:00
beeankha
bb184f8ffb Update booleans to pass linter 2020-04-01 08:58:28 -04:00
softwarefactory-project-zuul[bot]
7f537dbedf Merge pull request #6515 from ryanpetrello/cleanup-some-more-redis
remove some unused code from the redis rewrite

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-04-01 12:38:48 +00:00
Ryan Petrello
f9b8a69f7b remove some unused code from the redis rewrite 2020-04-01 08:03:59 -04:00
Sebastian Thörn
bc228b8d77 Adds comment about needing to be a pem-file
This needs to be a .pem-file
2020-04-01 11:54:07 +02:00
Keith Grant
7710ad2e57 move OptionsList to components; add launch prompt tests 2020-03-31 13:59:14 -07:00
beeankha
9f2c9b13d7 Update unit test, extra_vars handling, and edit README 2020-03-31 16:16:11 -04:00
softwarefactory-project-zuul[bot]
6940704deb Merge pull request #6509 from ryanpetrello/twisted-cves
update to the latest twisted to address two open CVEs

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 19:59:11 +00:00
softwarefactory-project-zuul[bot]
6b9cacb85f Merge pull request #6508 from ryanpetrello/django-extensions-bump
bump django-extensions version to address a bug in shell_plus

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 19:16:49 +00:00
softwarefactory-project-zuul[bot]
cfa0fdaa12 Merge pull request #6337 from rebeccahhh/activity-stream-grab-bag
add in summary fields to activity stream logging output

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 19:15:51 +00:00
Ryan Petrello
4423e6edae update to the latest twisted to address two open CVEs 2020-03-31 13:47:56 -04:00
softwarefactory-project-zuul[bot]
13faa0ed2e Merge pull request #6489 from wenottingham/that-ain't-right
Don't return different fields for smart vs non-smart inventories

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 17:43:03 +00:00
Ryan Petrello
42336355bb bump django-extensions version to address a bug in shell_plus
see: https://github.com/ansible/awx/pull/6441
see: e8d5daa06e
2020-03-31 13:39:13 -04:00
Marliana Lara
c18aa90534 Add timeout detail to node view modal 2020-03-31 13:39:05 -04:00
softwarefactory-project-zuul[bot]
39460fb3d3 Merge pull request #6505 from squidboylan/tower_group_integration_tests
Collection: add tower_group child group tests

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 17:31:00 +00:00
Keith Grant
4f51c1d2c9 fix LaunchButton tests 2020-03-31 10:09:33 -07:00
Caleb Boylan
04ccff0e3f Collection: add tower_group child group tests 2020-03-31 09:43:53 -07:00
softwarefactory-project-zuul[bot]
2242119182 Merge pull request #6419 from mabashian/5864-schedule-add-2
Implement schedule add form on JT/WFJT/Proj

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 16:06:46 +00:00
Marliana Lara
5cba34c34d Use styles to add prompt header spacing 2020-03-31 12:05:13 -04:00
mabashian
33a699b8ae Display form errors on new lines if there are multiple 2020-03-31 10:57:30 -04:00
softwarefactory-project-zuul[bot]
344a4bb238 Merge pull request #6494 from ryanpetrello/quiter-pg-migration
detect event migration tables in a less noisy way

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 14:30:15 +00:00
softwarefactory-project-zuul[bot]
0beda08cf9 Merge pull request #6471 from megabreit/jinja2-installer-fix
support for older jinja2 in installer #5501

Reviewed-by: Ryan Petrello
             https://github.com/ryanpetrello
2020-03-31 14:10:21 +00:00
softwarefactory-project-zuul[bot]
2264a98c04 Merge pull request #6455 from AlanCoding/auth_errors
Improve error handling related to authentication

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-31 11:28:37 +00:00
Ryan Petrello
d19a9db523 detect event migration tables in a less noisy way
see: https://github.com/ansible/awx/issues/6493
2020-03-31 00:05:30 -04:00
John Westcott IV
4b76332daf Added notification of removal of extra_vars_path 2020-03-30 23:35:11 -04:00
John Westcott IV
db38339179 Second attempt at converting tower_job_template 2020-03-30 23:35:11 -04:00
softwarefactory-project-zuul[bot]
5eddcdd5f5 Merge pull request #6484 from ryanpetrello/inv-source-required
prevent manual updates at POST /api/v2/inventory_sources/N/update/

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-30 23:46:13 +00:00
softwarefactory-project-zuul[bot]
3480d2da59 Merge pull request #6488 from ryanpetrello/galaxy-role-host-key-checking
disable host key checking when installing galaxy roles/collections

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-30 22:06:52 +00:00
Keith Grant
e60e6c7d08 pass prompted params through to launch API request 2020-03-30 14:39:16 -07:00
Keith Grant
55356ebb51 set default values on prompts 2020-03-30 14:37:07 -07:00
Keith Grant
7f4bbbe5c5 add launch prompt inventory step 2020-03-30 14:37:07 -07:00
Keith Grant
49b1ce6e8c add skeleton of launch prompt wizard 2020-03-30 14:37:07 -07:00
Marliana Lara
caaefef900 Add modal to show a preview of node prompt values 2020-03-30 17:31:50 -04:00
Bill Nottingham
96576b0e3d Don't return different fields for smart vs non-smart inventories 2020-03-30 17:15:55 -04:00
mabashian
288ce123ca Adds resources_needed_to_start to the list of keys for error message handling 2020-03-30 17:04:17 -04:00
Ryan Petrello
140dbbaa7d disable host key checking when installing galaxy roles/collections
see: https://github.com/ansible/awx/issues/5947
2020-03-30 17:03:14 -04:00
softwarefactory-project-zuul[bot]
e9d11be680 Merge pull request #6104 from mabashian/6086-proj-form-org
Fixes issues with organization when saving the project form.

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-30 20:47:12 +00:00
softwarefactory-project-zuul[bot]
d7f117e83f Merge pull request #6448 from AlexSCorey/6446-HostAdd
Fixes HostAdd Form layout issue

Reviewed-by: https://github.com/apps/softwarefactory-project-zuul
2020-03-30 20:47:08 +00:00
lj020326
eef1246e0b Merge pull request #1 from lj020326/lj020326-patch-1
Update settings.py to resolve CSRF issue in traefik configuration
2020-03-30 16:29:06 -04:00
lj020326
65e38aa37d Update settings.py
This is needed when a load balancer (e.g., traefik) proxies into nginx;
otherwise you get a CSRF error
ref: https://stackoverflow.com/questions/27533011/django-csrf-error-casused-by-nginx-x-forwarded-host

resolved by adding USE_X_FORWARDED_HOST using the following similar issue as a reference:
https://github.com/catmaid/CATMAID/issues/1781
2020-03-30 16:27:40 -04:00
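The settings.py change described above amounts to enabling a standard Django setting — a hypothetical excerpt, not the verbatim AWX diff:

```python
# With this Django setting enabled, Django builds absolute URLs from the
# X-Forwarded-Host header set by the load balancer (e.g. traefik), so
# CSRF origin checks match the externally visible host.
USE_X_FORWARDED_HOST = True

# If TLS terminates at the proxy, this companion Django setting (an
# assumption here, not part of the commit) marks forwarded requests secure:
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
```

Both settings are documented in Django itself; they only take effect when the proxy is trusted to strip and set these headers.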
mabashian
c7b23aac9b Removes static run on string options and opts for the more dynamic ux pattern already adopted in the old UI 2020-03-30 15:37:56 -04:00
mabashian
b4ea60eb79 Fixes issue where repeat frequency was not displaying correctly for schedules that only run once 2020-03-30 15:37:56 -04:00
mabashian
24c738c6d8 Moves generation of today and tomorrow strings out of the return of the ScheduleForm 2020-03-30 15:37:56 -04:00
mabashian
0c26734d7d Move the construction of the rule object out to it's own function 2020-03-30 15:37:56 -04:00
mabashian
d9b613ccb3 Implement schedule add form on JT/WFJT/Proj 2020-03-30 15:37:56 -04:00
Ryan Petrello
831bf9124f prevent manual updates at POST /api/v2/inventory_sources/N/update/
see: https://github.com/ansible/awx/issues/6309
2020-03-30 15:35:04 -04:00
Alex Corey
0b31cad2db Adds Survey Functionality to WFJT 2020-03-30 14:20:44 -04:00
AlanCoding
059e744774 Address errors with login and logout in python2
Addresses scenarios where username and password
  were used and the collection obtained a token

Fix the error "sendall() arg 1 must be string or buffer"

Improve error handling related to authentication:
  clear the query after request and before logout
  put response data in the error in both cases
2020-03-30 13:48:14 -04:00
Armin Kunaschik
2b3c57755c support for older jinja2 in installer 2020-03-28 02:59:40 +01:00
Alex Corey
0eb526919f Fixes HostAdd Form layout issue 2020-03-26 15:23:31 -04:00
mabashian
a8f56f78e9 Remove unused touched and error props 2020-03-26 14:11:22 -04:00
mabashian
f7ad3d78eb Fixes issues with organization when saving the project form.
Changes helperTextInvalid prop type to node which matches more closely with what the upstream PF component supports.
2020-03-26 13:57:11 -04:00
Rebeccah
5bfe89be6e removed the to_representation and replaced with get_summary_fields per suggestion in PR comments 2020-03-23 14:57:07 -04:00
Rebeccah
47661fad51 added in summary fields into logging which will solve several issues related to needing more data in logging outputs 2020-03-23 14:57:07 -04:00
Christian Adams
9a38971d47 Update ELK Stack container files 2020-03-19 09:35:08 -04:00
Christian Adams
c4e697879d Improve docs for using the logstash container 2020-03-18 18:32:45 -04:00
368 changed files with 14059 additions and 5236 deletions

.gitignore vendored

@@ -31,6 +31,7 @@ awx/ui/templates/ui/installing.html
awx/ui_next/node_modules/
awx/ui_next/coverage/
awx/ui_next/build/locales/_build
rsyslog.pid
/tower-license
/tower-license/**
tools/prometheus/data


@@ -2,6 +2,20 @@
This is a list of high-level changes for each release of AWX. A full list of commits can be found at `https://github.com/ansible/awx/releases/tag/<version>`.
## 11.0.0 (Apr 16, 2020)
- As of AWX 11.0.0, Kubernetes-based deployments use a Deployment rather than a StatefulSet.
- Reimplemented external logging support using rsyslogd to improve reliability and address a number of issues (https://github.com/ansible/awx/issues/5155)
- Changed activity stream logs to include summary fields for related objects (https://github.com/ansible/awx/issues/1761)
- Added code to more gracefully attempt to reconnect to redis if it restarts/becomes unavailable (https://github.com/ansible/awx/pull/6670)
- Fixed a bug that caused REFRESH_TOKEN_EXPIRE_SECONDS to not properly be respected for OAuth2.0 refresh tokens generated by AWX (https://github.com/ansible/awx/issues/6630)
- Fixed a bug that broke schedules containing RRULES with very old DTSTART dates (https://github.com/ansible/awx/pull/6550)
- Fixed a bug that broke installs on older versions of Ansible packaged with certain Linux distributions (https://github.com/ansible/awx/issues/5501)
- Fixed a bug that caused the activity stream to sometimes report the incorrect actor when associating user membership on SAML login (https://github.com/ansible/awx/pull/6525)
- Fixed a bug in AWX's Grafana notification support when annotation tags are omitted (https://github.com/ansible/awx/issues/6580)
- Fixed a bug that prevented some users from searching for Source Control credentials in the AWX user interface (https://github.com/ansible/awx/issues/6600)
- Fixed a bug that prevented disassociating orphaned users from credentials (https://github.com/ansible/awx/pull/6554)
- Updated Twisted to address CVE-2020-10108 and CVE-2020-10109.
## 10.0.0 (Mar 30, 2020)
- As of AWX 10.0.0, the official AWX CLI no longer supports Python 2 (it requires at least Python 3.6) (https://github.com/ansible/awx/pull/6327)
- AWX no longer relies on RabbitMQ; Redis is added as a new dependency (https://github.com/ansible/awx/issues/5443)
@@ -95,7 +109,7 @@ This is a list of high-level changes for each release of AWX. A full list of com
- Fixed a bug in the CLI which incorrectly parsed launch time arguments for `awx job_templates launch` and `awx workflow_job_templates launch` (https://github.com/ansible/awx/issues/5093).
- Fixed a bug that caused inventory updates using "sourced from a project" to stop working (https://github.com/ansible/awx/issues/4750).
- Fixed a bug that caused Slack notifications to sometimes show the wrong bot avatar (https://github.com/ansible/awx/pull/5125).
- Fixed a bug that prevented the use of digits in Tower's URL settings (https://github.com/ansible/awx/issues/5081).
- Fixed a bug that prevented the use of digits in AWX's URL settings (https://github.com/ansible/awx/issues/5081).
## 8.0.0 (Oct 21, 2019)

View File

@@ -215,18 +215,23 @@ Using `docker exec`, this will create a session in the running *awx* container,
If you want to start and use the development environment, you'll first need to bootstrap it by running the following command:
```bash
(container)# /bootstrap_development.sh
(container)# /usr/bin/bootstrap_development.sh
```
The above will do all the setup tasks, including running database migrations, so it may take a couple minutes.
The above will do all the setup tasks, including running database migrations, so it may take a couple minutes. Once it's done it
will drop you back to the shell.
Now you can start each service individually, or start all services in a pre-configured tmux session like so:
In order to launch all developer services:
```bash
(container)# cd /awx_devel
(container)# make server
(container)# /usr/bin/launch_awx.sh
```
`launch_awx.sh` also calls `bootstrap_development.sh` so if all you are doing is launching the supervisor to start all services, you don't
need to call `bootstrap_development.sh` first.
### Post Build Steps
Before you can log in and use the system, you will need to create an admin user. Optionally, you may also want to load some demo data.

View File

@@ -477,7 +477,7 @@ Before starting the install process, review the [inventory](./installer/inventor
*ssl_certificate*
> Optionally, provide the path to a file that contains a certificate and its private key.
> Optionally, provide the path to a file that contains a certificate and its private key. This needs to be a .pem file
*docker_compose_dir*

View File

@@ -18,7 +18,6 @@ COMPOSE_TAG ?= $(GIT_BRANCH)
COMPOSE_HOST ?= $(shell hostname)
VENV_BASE ?= /venv
COLLECTION_VENV ?= /awx_devel/awx_collection_test_venv
SCL_PREFIX ?=
CELERY_SCHEDULE_FILE ?= /var/lib/awx/beat.db
@@ -365,11 +364,6 @@ test:
cd awxkit && $(VENV_BASE)/awx/bin/tox -re py2,py3
awx-manage check_migrations --dry-run --check -n 'vNNN_missing_migration_file'
prepare_collection_venv:
rm -rf $(COLLECTION_VENV)
mkdir $(COLLECTION_VENV)
$(VENV_BASE)/awx/bin/pip install --target=$(COLLECTION_VENV) git+https://github.com/ansible/tower-cli.git
COLLECTION_TEST_DIRS ?= awx_collection/test/awx
COLLECTION_TEST_TARGET ?=
COLLECTION_PACKAGE ?= awx
@@ -380,12 +374,12 @@ test_collection:
@if [ "$(VENV_BASE)" ]; then \
. $(VENV_BASE)/awx/bin/activate; \
fi; \
PYTHONPATH=$(COLLECTION_VENV):$PYTHONPATH:/usr/lib/python3.6/site-packages py.test $(COLLECTION_TEST_DIRS)
PYTHONPATH=$PYTHONPATH:/usr/lib/python3.6/site-packages py.test $(COLLECTION_TEST_DIRS)
flake8_collection:
flake8 awx_collection/ # Different settings, in main exclude list
test_collection_all: prepare_collection_venv test_collection flake8_collection
test_collection_all: test_collection flake8_collection
# WARNING: symlinking a collection is fundamentally unstable
# this is for rapid development iteration with playbooks, do not use with other test targets
@@ -668,11 +662,12 @@ docker-compose-isolated-build: awx-devel-build
docker tag ansible/awx_isolated $(DEV_DOCKER_TAG_BASE)/awx_isolated:$(COMPOSE_TAG)
#docker push $(DEV_DOCKER_TAG_BASE)/awx_isolated:$(COMPOSE_TAG)
MACHINE?=default
docker-clean:
eval $$(docker-machine env $(MACHINE))
$(foreach container_id,$(shell docker ps -f name=tools_awx -aq),docker stop $(container_id); docker rm -f $(container_id);)
-docker images | grep "awx_devel" | awk '{print $$1 ":" $$2}' | xargs docker rmi
docker images | grep "awx_devel" | awk '{print $$1 ":" $$2}' | xargs docker rmi
docker-clean-volumes:
docker volume rm tools_awx_db
docker-refresh: docker-clean docker-compose
@@ -686,9 +681,6 @@ docker-compose-cluster-elk: docker-auth awx/projects
prometheus:
docker run -u0 --net=tools_default --link=`docker ps | egrep -o "tools_awx(_run)?_([^ ]+)?"`:awxweb --volume `pwd`/tools/prometheus:/prometheus --name prometheus -d -p 0.0.0.0:9090:9090 prom/prometheus --web.enable-lifecycle --config.file=/prometheus/prometheus.yml
minishift-dev:
ansible-playbook -i localhost, -e devtree_directory=$(CURDIR) tools/clusterdevel/start_minishift_dev.yml
clean-elk:
docker stop tools_kibana_1
docker stop tools_logstash_1

View File

@@ -1 +1 @@
10.0.0
11.0.0

View File

@@ -2,6 +2,7 @@
# All Rights Reserved.
from collections import OrderedDict
from uuid import UUID
# Django
from django.core.exceptions import PermissionDenied
@@ -86,6 +87,8 @@ class Metadata(metadata.SimpleMetadata):
# FIXME: Still isn't showing all default values?
try:
default = field.get_default()
if type(default) is UUID:
default = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
if field.field_name == 'TOWER_URL_BASE' and default == 'https://towerhost':
default = '{}://{}'.format(self.request.scheme, self.request.get_host())
field_info['default'] = default

View File

@@ -2034,11 +2034,6 @@ class InventorySourceSerializer(UnifiedJobTemplateSerializer, InventorySourceOpt
res['credentials'] = self.reverse('api:inventory_source_credentials_list', kwargs={'pk': obj.pk})
return res
def get_group(self, obj): # TODO: remove in 3.3
if obj.deprecated_group:
return obj.deprecated_group.id
return None
def build_relational_field(self, field_name, relation_info):
field_class, field_kwargs = super(InventorySourceSerializer, self).build_relational_field(field_name, relation_info)
# SCM Project and inventory are read-only unless creating a new inventory.
@@ -3616,9 +3611,11 @@ class LaunchConfigurationBaseSerializer(BaseSerializer):
elif self.instance:
ujt = self.instance.unified_job_template
if ujt is None:
if 'workflow_job_template' in attrs:
return {'workflow_job_template': attrs['workflow_job_template']}
return {}
ret = {}
for fd in ('workflow_job_template', 'identifier'):
if fd in attrs:
ret[fd] = attrs[fd]
return ret
# build additional field survey_passwords to track redacted variables
password_dict = {}
@@ -4539,6 +4536,8 @@ class SchedulePreviewSerializer(BaseSerializer):
try:
Schedule.rrulestr(rrule_value)
except Exception as e:
import traceback
logger.error(traceback.format_exc())
raise serializers.ValidationError(_("rrule parsing failed validation: {}").format(e))
return value

View File

@@ -1 +1,2 @@
# Test Logging Configuration

View File

@@ -1,11 +1,15 @@
# Copyright (c) 2017 Ansible, Inc.
# All Rights Reserved.
from datetime import timedelta
from django.utils.timezone import now
from django.conf import settings
from django.conf.urls import url
from oauthlib import oauth2
from oauth2_provider import views
from awx.main.models import RefreshToken
from awx.api.views import (
ApiOAuthAuthorizationRootView,
)
@@ -14,6 +18,21 @@ from awx.api.views import (
class TokenView(views.TokenView):
def create_token_response(self, request):
# Django OAuth2 Toolkit has a bug whereby refresh tokens are *never*
# properly expired (ugh):
#
# https://github.com/jazzband/django-oauth-toolkit/issues/746
#
# This code detects and auto-expires them on refresh grant
# requests.
if request.POST.get('grant_type') == 'refresh_token' and 'refresh_token' in request.POST:
refresh_token = RefreshToken.objects.filter(
token=request.POST['refresh_token']
).first()
if refresh_token:
expire_seconds = settings.OAUTH2_PROVIDER.get('REFRESH_TOKEN_EXPIRE_SECONDS', 0)
if refresh_token.created + timedelta(seconds=expire_seconds) < now():
return request.build_absolute_uri(), {}, 'The refresh token has expired.', '403'
try:
return super(TokenView, self).create_token_response(request)
except oauth2.AccessDeniedError as e:
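The expiry check added here reduces to simple timestamp arithmetic. A minimal standalone sketch (the function name and signature are illustrative, not part of AWX's API):

```python
from datetime import datetime, timedelta, timezone

def refresh_token_expired(created, expire_seconds, now=None):
    """Mirror of the check above: a refresh token is treated as expired
    once `created + expire_seconds` lies in the past."""
    if now is None:
        now = datetime.now(timezone.utc)
    return created + timedelta(seconds=expire_seconds) < now
```

Note that with the fallback default of `0` used above, any already-issued refresh token is considered expired immediately, which is why `REFRESH_TOKEN_EXPIRE_SECONDS` must be read from settings rather than hardcoded.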

View File

@@ -1092,7 +1092,7 @@ class UserRolesList(SubListAttachDetachAPIView):
credential_content_type = ContentType.objects.get_for_model(models.Credential)
if role.content_type == credential_content_type:
if role.content_object.organization and user not in role.content_object.organization.member_role:
if 'disassociate' not in request.data and role.content_object.organization and user not in role.content_object.organization.member_role:
data = dict(msg=_("You cannot grant credential access to a user not in the credentials' organization"))
return Response(data, status=status.HTTP_400_BAD_REQUEST)
@@ -4415,7 +4415,7 @@ class RoleUsersList(SubListAttachDetachAPIView):
credential_content_type = ContentType.objects.get_for_model(models.Credential)
if role.content_type == credential_content_type:
if role.content_object.organization and user not in role.content_object.organization.member_role:
if 'disassociate' not in request.data and role.content_object.organization and user not in role.content_object.organization.member_role:
data = dict(msg=_("You cannot grant credential access to a user not in the credentials' organization"))
return Response(data, status=status.HTTP_400_BAD_REQUEST)

View File

@@ -325,17 +325,3 @@ def test_setting_singleton_delete_no_read_only_fields(api_request, dummy_setting
)
assert response.data['FOO_BAR'] == 23
@pytest.mark.django_db
def test_setting_logging_test(api_request):
with mock.patch('awx.conf.views.AWXProxyHandler.perform_test') as mock_func:
api_request(
'post',
reverse('api:setting_logging_test'),
data={'LOG_AGGREGATOR_HOST': 'http://foobar', 'LOG_AGGREGATOR_TYPE': 'logstash'}
)
call = mock_func.call_args_list[0]
args, kwargs = call
given_settings = kwargs['custom_settings']
assert given_settings.LOG_AGGREGATOR_HOST == 'http://foobar'
assert given_settings.LOG_AGGREGATOR_TYPE == 'logstash'

View File

@@ -3,7 +3,11 @@
# Python
import collections
import logging
import subprocess
import sys
import socket
from socket import SHUT_RDWR
# Django
from django.conf import settings
@@ -11,7 +15,7 @@ from django.http import Http404
from django.utils.translation import ugettext_lazy as _
# Django REST Framework
from rest_framework.exceptions import PermissionDenied, ValidationError
from rest_framework.exceptions import PermissionDenied
from rest_framework.response import Response
from rest_framework import serializers
from rest_framework import status
@@ -26,7 +30,6 @@ from awx.api.generics import (
from awx.api.permissions import IsSuperUser
from awx.api.versioning import reverse
from awx.main.utils import camelcase_to_underscore
from awx.main.utils.handlers import AWXProxyHandler, LoggingConnectivityException
from awx.main.tasks import handle_setting_changes
from awx.conf.models import Setting
from awx.conf.serializers import SettingCategorySerializer, SettingSingletonSerializer
@@ -161,40 +164,47 @@ class SettingLoggingTest(GenericAPIView):
filter_backends = []
def post(self, request, *args, **kwargs):
defaults = dict()
for key in settings_registry.get_registered_settings(category_slug='logging'):
try:
defaults[key] = settings_registry.get_setting_field(key).get_default()
except serializers.SkipField:
defaults[key] = None
obj = type('Settings', (object,), defaults)()
serializer = self.get_serializer(obj, data=request.data)
serializer.is_valid(raise_exception=True)
# Special validation specific to logging test.
errors = {}
for key in ['LOG_AGGREGATOR_TYPE', 'LOG_AGGREGATOR_HOST']:
if not request.data.get(key, ''):
errors[key] = 'This field is required.'
if errors:
raise ValidationError(errors)
if request.data.get('LOG_AGGREGATOR_PASSWORD', '').startswith('$encrypted$'):
serializer.validated_data['LOG_AGGREGATOR_PASSWORD'] = getattr(
settings, 'LOG_AGGREGATOR_PASSWORD', ''
)
# Error if logging is not enabled
enabled = getattr(settings, 'LOG_AGGREGATOR_ENABLED', False)
if not enabled:
return Response({'error': 'Logging not enabled'}, status=status.HTTP_409_CONFLICT)
# Send test message to configured logger based on db settings
logging.getLogger('awx').error('AWX Connection Test Message')
hostname = getattr(settings, 'LOG_AGGREGATOR_HOST', None)
protocol = getattr(settings, 'LOG_AGGREGATOR_PROTOCOL', None)
try:
class MockSettings:
pass
mock_settings = MockSettings()
for k, v in serializer.validated_data.items():
setattr(mock_settings, k, v)
AWXProxyHandler().perform_test(custom_settings=mock_settings)
if mock_settings.LOG_AGGREGATOR_PROTOCOL.upper() == 'UDP':
return Response(status=status.HTTP_201_CREATED)
except LoggingConnectivityException as e:
return Response({'error': str(e)}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
return Response(status=status.HTTP_200_OK)
subprocess.check_output(
['rsyslogd', '-N1', '-f', '/var/lib/awx/rsyslog/rsyslog.conf'],
stderr=subprocess.STDOUT
)
except subprocess.CalledProcessError as exc:
return Response({'error': exc.output}, status=status.HTTP_400_BAD_REQUEST)
# Check to ensure port is open at host
if protocol in ['udp', 'tcp']:
port = getattr(settings, 'LOG_AGGREGATOR_PORT', None)
# Error if port is not set when using UDP/TCP
if not port:
return Response({'error': 'Port required for ' + protocol}, status=status.HTTP_400_BAD_REQUEST)
else:
# if http/https by this point, domain is reachable
return Response(status=status.HTTP_202_ACCEPTED)
if protocol == 'udp':
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
else:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
s.settimeout(.5)
s.connect((hostname, int(port)))
s.shutdown(SHUT_RDWR)
s.close()
return Response(status=status.HTTP_202_ACCEPTED)
except Exception as e:
return Response({'error': str(e)}, status=status.HTTP_400_BAD_REQUEST)
# Create view functions for all of the class-based views to simplify inclusion

View File

@@ -11,7 +11,6 @@ from functools import reduce
from django.conf import settings
from django.db.models import Q, Prefetch
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.utils.translation import ugettext_lazy as _
from django.core.exceptions import ObjectDoesNotExist
@@ -405,14 +404,6 @@ class BaseAccess(object):
# Cannot copy manual project without errors
user_capabilities[display_method] = False
continue
elif display_method in ['start', 'schedule'] and isinstance(obj, Group): # TODO: remove in 3.3
try:
if obj.deprecated_inventory_source and not obj.deprecated_inventory_source._can_update():
user_capabilities[display_method] = False
continue
except Group.deprecated_inventory_source.RelatedObjectDoesNotExist:
user_capabilities[display_method] = False
continue
elif display_method in ['start', 'schedule'] and isinstance(obj, (Project)):
if obj.scm_type == '':
user_capabilities[display_method] = False
@@ -650,8 +641,8 @@ class UserAccess(BaseAccess):
# in these cases only superusers can modify orphan users
return False
return not obj.roles.all().exclude(
content_type=ContentType.objects.get_for_model(User)
).filter(ancestors__in=self.user.roles.all()).exists()
ancestors__in=self.user.roles.all()
).exists()
else:
return self.is_all_org_admin(obj)
@@ -1434,7 +1425,7 @@ class JobTemplateAccess(NotificationAttachMixin, BaseAccess):
Users who are able to create deploy jobs can also run normal and check (dry run) jobs.
'''
if not data: # So the browseable API will work
return Organization.accessible_objects(self.user, 'job_template_admin_role').exists()
return Project.accessible_objects(self.user, 'use_role').exists()
# if reference_obj is provided, determine if it can be copied
reference_obj = data.get('reference_obj', None)
@@ -1503,11 +1494,6 @@ class JobTemplateAccess(NotificationAttachMixin, BaseAccess):
if data is None:
return True
# standard type of check for organization - cannot change the value
# unless possessing the respective job_template_admin_role, otherwise non-blocking
if not self.check_related('organization', Organization, data, obj=obj, role_field='job_template_admin_role'):
return False
data = dict(data)
if self.changes_are_non_sensitive(obj, data):

View File

@@ -11,6 +11,7 @@ from prometheus_client import (
Counter,
Enum,
CollectorRegistry,
parser,
)
from django.conf import settings
@@ -30,6 +31,11 @@ def now_seconds():
return dt_to_seconds(datetime.datetime.now())
def safe_name(s):
# Replace all non alpha-numeric characters with _
return re.sub('[^0-9a-zA-Z]+', '_', s)
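The module-level `safe_name` helper collapses every run of non-alphanumeric characters into a single underscore, which is what makes arbitrary hostnames usable inside Prometheus metric names. Reproduced in isolation with an example input:

```python
import re

def safe_name(s):
    # Replace all non-alphanumeric characters with _
    return re.sub('[^0-9a-zA-Z]+', '_', s)

print(safe_name('host-1.example.com'))  # -> host_1_example_com
```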
# Second granularity; Per-minute
class FixedSlidingWindow():
def __init__(self, start_time=None):
@@ -99,7 +105,8 @@ class BroadcastWebsocketStatsManager():
Stringified version of all the stats
'''
redis_conn = redis.Redis.from_url(settings.BROKER_URL)
return redis_conn.get(BROADCAST_WEBSOCKET_REDIS_KEY_NAME)
stats_str = redis_conn.get(BROADCAST_WEBSOCKET_REDIS_KEY_NAME) or b''
return parser.text_string_to_metric_families(stats_str.decode('UTF-8'))
class BroadcastWebsocketStats():
@@ -109,8 +116,8 @@ class BroadcastWebsocketStats():
self._registry = CollectorRegistry()
# TODO: More robust replacement
self.name = self.safe_name(self._local_hostname)
self.remote_name = self.safe_name(self._remote_hostname)
self.name = safe_name(self._local_hostname)
self.remote_name = safe_name(self._remote_hostname)
self._messages_received_total = Counter(f'awx_{self.remote_name}_messages_received_total',
'Number of messages received, to be forwarded, by the broadcast websocket system',
@@ -122,6 +129,7 @@ class BroadcastWebsocketStats():
'Websocket broadcast connection',
states=['disconnected', 'connected'],
registry=self._registry)
self._connection.state('disconnected')
self._connection_start = Gauge(f'awx_{self.remote_name}_connection_start',
'Time the connection was established',
registry=self._registry)
@@ -131,10 +139,6 @@ class BroadcastWebsocketStats():
registry=self._registry)
self._internal_messages_received_per_minute = FixedSlidingWindow()
def safe_name(self, s):
# Replace all non alpha-numeric characters with _
return re.sub('[^0-9a-zA-Z]+', '_', s)
def unregister(self):
self._registry.unregister(f'awx_{self.remote_name}_messages_received')
self._registry.unregister(f'awx_{self.remote_name}_connection')

View File

@@ -122,22 +122,27 @@ def cred_type_counts(since):
return counts
@register('inventory_counts', '1.0')
@register('inventory_counts', '1.2')
def inventory_counts(since):
counts = {}
for inv in models.Inventory.objects.filter(kind='').annotate(num_sources=Count('inventory_sources', distinct=True),
num_hosts=Count('hosts', distinct=True)).only('id', 'name', 'kind'):
source_list = []
for source in inv.inventory_sources.filter().annotate(num_hosts=Count('hosts', distinct=True)).values('name','source', 'num_hosts'):
source_list.append(source)
counts[inv.id] = {'name': inv.name,
'kind': inv.kind,
'hosts': inv.num_hosts,
'sources': inv.num_sources
'sources': inv.num_sources,
'source_list': source_list
}
for smart_inv in models.Inventory.objects.filter(kind='smart'):
counts[smart_inv.id] = {'name': smart_inv.name,
'kind': smart_inv.kind,
'num_hosts': smart_inv.hosts.count(),
'num_sources': smart_inv.inventory_sources.count()
'hosts': smart_inv.hosts.count(),
'sources': 0,
'source_list': []
}
return counts
@@ -222,7 +227,7 @@ def query_info(since, collection_type):
# Copies Job Events from db to a .csv to be shipped
@table_version('events_table.csv', '1.0')
@table_version('events_table.csv', '1.1')
@table_version('unified_jobs_table.csv', '1.0')
@table_version('unified_job_template_table.csv', '1.0')
def copy_tables(since, full_path):
@@ -249,6 +254,11 @@ def copy_tables(since, full_path):
main_jobevent.job_id,
main_jobevent.host_id,
main_jobevent.host_name
, CAST(main_jobevent.event_data::json->>'start' AS TIMESTAMP WITH TIME ZONE) AS start,
CAST(main_jobevent.event_data::json->>'end' AS TIMESTAMP WITH TIME ZONE) AS end,
main_jobevent.event_data::json->'duration' AS duration,
main_jobevent.event_data::json->'res'->'warnings' AS warnings,
main_jobevent.event_data::json->'res'->'deprecations' AS deprecations
FROM main_jobevent
WHERE main_jobevent.created > {}
ORDER BY main_jobevent.id ASC) TO STDOUT WITH CSV HEADER'''.format(since.strftime("'%Y-%m-%d %H:%M:%S'"))
@@ -276,9 +286,9 @@ def copy_tables(since, full_path):
main_unifiedjob.instance_group_id
FROM main_unifiedjob
JOIN django_content_type ON main_unifiedjob.polymorphic_ctype_id = django_content_type.id
JOIN main_organization ON main_organization.id = main_unifiedjob.organization_id
WHERE main_unifiedjob.created > {}
AND main_unifiedjob.launch_type != 'sync'
LEFT JOIN main_organization ON main_organization.id = main_unifiedjob.organization_id
WHERE (main_unifiedjob.created > {0} OR main_unifiedjob.finished > {0})
AND main_unifiedjob.launch_type != 'sync'
ORDER BY main_unifiedjob.id ASC) TO STDOUT WITH CSV HEADER'''.format(since.strftime("'%Y-%m-%d %H:%M:%S'"))
_copy_table(table='unified_jobs', query=unified_job_query, path=full_path)

View File

@@ -667,7 +667,7 @@ register(
allow_blank=True,
default='',
label=_('Logging Aggregator Username'),
help_text=_('Username for external log aggregator (if required).'),
help_text=_('Username for external log aggregator (if required; HTTP/s only).'),
category=_('Logging'),
category_slug='logging',
required=False,
@@ -679,7 +679,7 @@ register(
default='',
encrypted=True,
label=_('Logging Aggregator Password/Token'),
help_text=_('Password or authentication token for external log aggregator (if required).'),
help_text=_('Password or authentication token for external log aggregator (if required; HTTP/s only).'),
category=_('Logging'),
category_slug='logging',
required=False,

View File

@@ -38,7 +38,7 @@ ENV_BLACKLIST = frozenset((
'AD_HOC_COMMAND_ID', 'REST_API_URL', 'REST_API_TOKEN', 'MAX_EVENT_RES',
'CALLBACK_QUEUE', 'CALLBACK_CONNECTION', 'CACHE',
'JOB_CALLBACK_DEBUG', 'INVENTORY_HOSTVARS',
'AWX_HOST', 'PROJECT_REVISION'
'AWX_HOST', 'PROJECT_REVISION', 'SUPERVISOR_WEB_CONFIG_PATH'
))
# loggers that may be called in process of emitting a log

View File

@@ -1,6 +1,6 @@
import json
import logging
import datetime
import time
import hmac
import asyncio
@@ -29,7 +29,7 @@ class WebsocketSecretAuthHelper:
@classmethod
def construct_secret(cls):
nonce_serialized = "{}".format(int((datetime.datetime.utcnow() - datetime.datetime.fromtimestamp(0)).total_seconds()))
nonce_serialized = f"{int(time.time())}"
payload_dict = {
'secret': settings.BROADCAST_WEBSOCKET_SECRET,
'nonce': nonce_serialized
@@ -70,10 +70,12 @@ class WebsocketSecretAuthHelper:
raise ValueError("Invalid secret")
# Avoid timing attack and check the nonce after all the heavy lifting
now = datetime.datetime.utcnow()
nonce_parsed = datetime.datetime.fromtimestamp(int(nonce_parsed))
if (now - nonce_parsed).total_seconds() > nonce_tolerance:
raise ValueError("Potential replay attack or machine(s) time out of sync.")
now = int(time.time())
nonce_parsed = int(nonce_parsed)
nonce_diff = now - nonce_parsed
if abs(nonce_diff) > nonce_tolerance:
logger.warn(f"Potential replay attack or machine(s) time out of sync by {nonce_diff} seconds.")
raise ValueError(f"Potential replay attack or machine(s) time out of sync by {nonce_diff} seconds.")
return True
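The verification flow above (HMAC comparison first, freshness check last) can be sketched in isolation. The secret, payload shape, and tolerance below are illustrative stand-ins, not AWX's actual wire format:

```python
import hmac
import time
from hashlib import sha256

SECRET = b"shared-secret"   # stand-in for settings.BROADCAST_WEBSOCKET_SECRET
NONCE_TOLERANCE = 300       # seconds of allowed clock skew

def sign(nonce):
    return hmac.new(SECRET, nonce.encode(), sha256).hexdigest()

def verify(nonce, signature, now=None):
    # Constant-time comparison first; check the nonce only after the
    # heavy lifting, as the comment in the code above explains.
    if not hmac.compare_digest(sign(nonce), signature):
        raise ValueError("Invalid secret")
    now = int(time.time()) if now is None else now
    if abs(now - int(nonce)) > NONCE_TOLERANCE:
        raise ValueError("Potential replay attack or clocks out of sync")
    return True
```

Using `abs()` on the difference, as the change above does, rejects nonces from clients whose clocks run ahead as well as stale ones.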
@@ -94,7 +96,7 @@ class BroadcastConsumer(AsyncJsonWebsocketConsumer):
WebsocketSecretAuthHelper.is_authorized(self.scope)
except Exception:
# TODO: log ip of connected client
logger.warn("Broadcast client failed to authorize.")
logger.warn("Broadcast client failed to authorize for reason.")
await self.close()
return
@@ -213,31 +215,6 @@ def _dump_payload(payload):
return None
async def emit_channel_notification_async(group, payload):
from awx.main.wsbroadcast import wrap_broadcast_msg # noqa
payload_dumped = _dump_payload(payload)
if payload_dumped is None:
return
channel_layer = get_channel_layer()
await channel_layer.group_send(
group,
{
"type": "internal.message",
"text": payload_dumped
},
)
await channel_layer.group_send(
settings.BROADCAST_WEBSOCKET_GROUP_NAME,
{
"type": "internal.message",
"text": wrap_broadcast_msg(group, payload_dumped),
},
)
def emit_channel_notification(group, payload):
from awx.main.wsbroadcast import wrap_broadcast_msg # noqa

View File

@@ -118,9 +118,14 @@ class AWXConsumerRedis(AWXConsumerBase):
queue = redis.Redis.from_url(settings.BROKER_URL)
while True:
res = queue.blpop(self.queues)
res = json.loads(res[1])
self.process_task(res)
try:
res = queue.blpop(self.queues)
res = json.loads(res[1])
self.process_task(res)
except redis.exceptions.RedisError:
logger.exception("encountered an error communicating with redis")
except (json.JSONDecodeError, KeyError):
logger.exception("failed to decode JSON message from redis")
if self.should_stop:
return
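The reconnect behavior amounts to catching broker errors inside the dispatch loop instead of letting them kill the consumer process. A self-contained sketch, using the builtin `ConnectionError` as a stand-in for `redis.exceptions.RedisError` and an injected queue object so the loop can run without a live Redis:

```python
import json
import logging

logger = logging.getLogger("awx.main.dispatch")

def consume_forever(queue, queues, process_task, should_stop=lambda: False):
    # Block on the named lists; survive transient broker errors and
    # malformed payloads rather than crashing the consumer.
    while True:
        try:
            res = queue.blpop(queues)           # (queue_name, payload) pair
            process_task(json.loads(res[1]))
        except ConnectionError:
            logger.exception("encountered an error communicating with redis")
        except (json.JSONDecodeError, KeyError, TypeError):
            logger.exception("failed to decode JSON message from redis")
        if should_stop():
            return
```

Because `blpop` blocks until a message arrives, a caught error simply re-enters the loop and blocks again, which is what makes the consumer self-healing after a Redis restart.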

View File

@@ -268,13 +268,6 @@ class IsolatedManager(object):
# in the final sync
self.consume_events()
# emit an EOF event
event_data = {
'event': 'EOF',
'final_counter': len(self.handled_events)
}
self.event_handler(event_data)
return status, rc
def consume_events(self):
@@ -287,7 +280,7 @@ class IsolatedManager(object):
if os.path.exists(events_path):
for event in set(os.listdir(events_path)) - self.handled_events:
path = os.path.join(events_path, event)
if os.path.exists(path):
if os.path.exists(path) and os.path.isfile(path):
try:
event_data = json.load(
open(os.path.join(events_path, event), 'r')
@@ -420,8 +413,4 @@ class IsolatedManager(object):
status, rc = self.dispatch(playbook, module, module_args)
if status == 'successful':
status, rc = self.check()
else:
# emit an EOF event
event_data = {'event': 'EOF', 'final_counter': 0}
self.event_handler(event_data)
return status, rc

View File

@@ -496,12 +496,6 @@ class Command(BaseCommand):
group_names = all_group_names[offset:(offset + self._batch_size)]
for group_pk in groups_qs.filter(name__in=group_names).values_list('pk', flat=True):
del_group_pks.discard(group_pk)
if self.inventory_source.deprecated_group_id in del_group_pks: # TODO: remove in 3.3
logger.warning(
'Group "%s" from v1 API is not deleted by overwrite',
self.inventory_source.deprecated_group.name
)
del_group_pks.discard(self.inventory_source.deprecated_group_id)
# Now delete all remaining groups in batches.
all_del_pks = sorted(list(del_group_pks))
for offset in range(0, len(all_del_pks), self._batch_size):
@@ -534,12 +528,6 @@ class Command(BaseCommand):
# Set of all host pks managed by this inventory source
all_source_host_pks = self._existing_host_pks()
for db_group in db_groups.all():
if self.inventory_source.deprecated_group_id == db_group.id: # TODO: remove in 3.3
logger.debug(
'Group "%s" from v1 API child group/host connections preserved',
db_group.name
)
continue
# Delete child group relationships not present in imported data.
db_children = db_group.children
db_children_name_pk_map = dict(db_children.values_list('name', 'pk'))

View File

@@ -7,7 +7,6 @@ from django.core.cache import cache as django_cache
from django.core.management.base import BaseCommand
from django.db import connection as django_connection
from awx.main.utils.handlers import AWXProxyHandler
from awx.main.dispatch import get_local_queuename, reaper
from awx.main.dispatch.control import Control
from awx.main.dispatch.pool import AutoscalePool
@@ -56,11 +55,6 @@ class Command(BaseCommand):
reaper.reap()
consumer = None
# don't ship external logs inside the dispatcher's parent process
# this exists to work around a race condition + deadlock bug on fork
# in cpython itself:
# https://bugs.python.org/issue37429
AWXProxyHandler.disable()
try:
queues = ['tower_broadcast_all', get_local_queuename()]
consumer = AWXConsumerPG(

View File

@@ -2,10 +2,20 @@
# All Rights Reserved.
import logging
import asyncio
import datetime
import re
import redis
from datetime import datetime as dt
from django.core.management.base import BaseCommand
from django.db.models import Q
from awx.main.analytics.broadcast_websocket import (
BroadcastWebsocketStatsManager,
safe_name,
)
from awx.main.wsbroadcast import BroadcastWebsocketManager
from awx.main.models.ha import Instance
logger = logging.getLogger('awx.main.wsbroadcast')
@@ -14,7 +24,106 @@ logger = logging.getLogger('awx.main.wsbroadcast')
class Command(BaseCommand):
help = 'Launch the websocket broadcaster'
def add_arguments(self, parser):
parser.add_argument('--status', dest='status', action='store_true',
help='print the internal state of any running broadcast websocket')
@classmethod
def display_len(cls, s):
return len(re.sub('\x1b.*?m', '', s))
@classmethod
def _format_lines(cls, host_stats, padding=5):
widths = [0 for i in host_stats[0]]
for entry in host_stats:
for i, e in enumerate(entry):
if Command.display_len(e) > widths[i]:
widths[i] = Command.display_len(e)
paddings = [padding for i in widths]
lines = []
for entry in host_stats:
line = ""
for pad, width, value in zip(paddings, widths, entry):
if len(value) > Command.display_len(value):
width += len(value) - Command.display_len(value)
total_width = width + pad
line += f'{value:{total_width}}'
lines.append(line)
return lines
@classmethod
def get_connection_status(cls, me, hostnames, data):
host_stats = [('hostname', 'state', 'start time', 'duration (sec)')]
for h in hostnames:
connection_color = '91' # red
h_safe = safe_name(h)
prefix = f'awx_{h_safe}'
connection_state = data.get(f'{prefix}_connection', 'N/A')
connection_started = 'N/A'
connection_duration = 'N/A'
if connection_state is None:
connection_state = 'unknown'
if connection_state == 'connected':
connection_color = '92' # green
connection_started = data.get(f'{prefix}_connection_start', 'Error')
if connection_started != 'Error':
connection_started = datetime.datetime.fromtimestamp(connection_started)
connection_duration = int((dt.now() - connection_started).total_seconds())
connection_state = f'\033[{connection_color}m{connection_state}\033[0m'
host_stats.append((h, connection_state, str(connection_started), str(connection_duration)))
return host_stats
@classmethod
def get_connection_stats(cls, me, hostnames, data):
host_stats = [('hostname', 'total', 'per minute')]
for h in hostnames:
h_safe = safe_name(h)
prefix = f'awx_{h_safe}'
messages_total = data.get(f'{prefix}_messages_received', '0')
messages_per_minute = data.get(f'{prefix}_messages_received_per_minute', '0')
host_stats.append((h, str(int(messages_total)), str(int(messages_per_minute))))
return host_stats
def handle(self, *arg, **options):
if options.get('status'):
try:
stats_all = BroadcastWebsocketStatsManager.get_stats_sync()
except redis.exceptions.ConnectionError as e:
print(f"Unable to get Broadcast Websocket Status. Failed to connect to redis: {e}")
return
data = {}
for family in stats_all:
if family.type == 'gauge' and len(family.samples) > 1:
for sample in family.samples:
if sample.value >= 1:
data[family.name] = sample.labels[family.name]
break
else:
data[family.name] = family.samples[0].value
me = Instance.objects.me()
hostnames = [i.hostname for i in Instance.objects.exclude(Q(hostname=me.hostname) | Q(rampart_groups__controller__isnull=False))]
host_stats = Command.get_connection_status(me, hostnames, data)
lines = Command._format_lines(host_stats)
print(f'Broadcast websocket connection status from "{me.hostname}" to:')
print('\n'.join(lines))
host_stats = Command.get_connection_stats(me, hostnames, data)
lines = Command._format_lines(host_stats)
print(f'\nBroadcast websocket connection stats from "{me.hostname}" to:')
print('\n'.join(lines))
return
try:
broadcast_websocket_mgr = BroadcastWebsocketManager()
task = broadcast_websocket_mgr.start()
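The `--status` table above pads columns while ignoring the invisible bytes of ANSI color escapes, so colored and plain cells line up. A standalone sketch of the same idea (names here are illustrative, not AWX's own):

```python
import re

# Strip ANSI SGR sequences such as "\x1b[92m...\x1b[0m" before measuring width.
ANSI_RE = re.compile(r'\x1b\[[0-9;]*m')

def display_len(s):
    return len(ANSI_RE.sub('', s))

def format_lines(rows, padding=5):
    # Column width = widest *visible* cell in that column.
    widths = [0] * len(rows[0])
    for row in rows:
        for i, cell in enumerate(row):
            widths[i] = max(widths[i], display_len(cell))
    lines = []
    for row in rows:
        line = ''
        for width, value in zip(widths, row):
            # Widen the field by the number of invisible escape bytes so
            # colored cells still align with uncolored ones.
            total = width + padding + (len(value) - display_len(value))
            line += f'{value:{total}}'
        lines.append(line)
    return lines

rows = [
    ('hostname', 'state'),
    ('node1', '\x1b[92mconnected\x1b[0m'),
    ('node2', 'N/A'),
]
for line in format_lines(rows):
    print(line)
```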

View File

@@ -12,23 +12,19 @@ import urllib.parse
from django.conf import settings
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.db.migrations.executor import MigrationExecutor
from django.db import IntegrityError, connection
from django.utils.functional import curry
from django.db import connection
from django.shortcuts import get_object_or_404, redirect
from django.apps import apps
from django.utils.deprecation import MiddlewareMixin
from django.utils.translation import ugettext_lazy as _
from django.urls import reverse, resolve
from awx.main.models import ActivityStream
from awx.main.utils.named_url_graph import generate_graph, GraphNode
from awx.conf import fields, register
logger = logging.getLogger('awx.main.middleware')
analytics_logger = logging.getLogger('awx.analytics.activity_stream')
perf_logger = logging.getLogger('awx.analytics.performance')
@@ -76,61 +72,6 @@ class TimingMiddleware(threading.local, MiddlewareMixin):
return filepath
class ActivityStreamMiddleware(threading.local, MiddlewareMixin):
def __init__(self, get_response=None):
self.disp_uid = None
self.instance_ids = []
super().__init__(get_response)
def process_request(self, request):
if hasattr(request, 'user') and request.user.is_authenticated:
user = request.user
else:
user = None
set_actor = curry(self.set_actor, user)
self.disp_uid = str(uuid.uuid1())
self.instance_ids = []
post_save.connect(set_actor, sender=ActivityStream, dispatch_uid=self.disp_uid, weak=False)
def process_response(self, request, response):
drf_request = getattr(request, 'drf_request', None)
drf_user = getattr(drf_request, 'user', None)
if self.disp_uid is not None:
post_save.disconnect(dispatch_uid=self.disp_uid)
for instance in ActivityStream.objects.filter(id__in=self.instance_ids):
if drf_user and drf_user.id:
instance.actor = drf_user
try:
instance.save(update_fields=['actor'])
analytics_logger.info('Activity Stream update entry for %s' % str(instance.object1),
extra=dict(changes=instance.changes, relationship=instance.object_relationship_type,
actor=drf_user.username, operation=instance.operation,
object1=instance.object1, object2=instance.object2))
except IntegrityError:
logger.debug("Integrity Error saving Activity Stream instance for id: " + str(instance.id))
# else:
# obj1_type_actual = instance.object1_type.split(".")[-1]
# if obj1_type_actual in ("InventoryUpdate", "ProjectUpdate", "Job") and instance.id is not None:
# instance.delete()
self.instance_ids = []
return response
def set_actor(self, user, sender, instance, **kwargs):
if sender == ActivityStream:
if isinstance(user, User) and instance.actor is None:
user = User.objects.filter(id=user.id)
if user.exists():
user = user[0]
instance.actor = user
else:
if instance.id not in self.instance_ids:
self.instance_ids.append(instance.id)
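The middleware above pre-binds the request user onto the `post_save` receiver with `django.utils.functional.curry`, a helper that was removed in Django 3.0. The stdlib equivalent is `functools.partial`; a minimal sketch (the receiver body here is a stand-in, not the real signal machinery):

```python
from functools import partial

def set_actor(user, sender, instance, **kwargs):
    # Signature matches a Django post_save receiver plus one extra
    # leading argument that we bind ahead of time.
    return {'user': user, 'sender': sender, 'instance': instance}

# What curry(self.set_actor, user) did, in stdlib terms:
bound = partial(set_actor, 'admin-poster')
result = bound(sender='ActivityStream', instance='entry-1')
print(result['user'])
```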
class SessionTimeoutMiddleware(MiddlewareMixin):
"""
Resets the session timeout for both the UI and the actual session for the API

View File

@@ -464,7 +464,7 @@ class Migration(migrations.Migration):
migrations.AddField(
model_name='unifiedjob',
name='instance_group',
field=models.ForeignKey(on_delete=models.SET_NULL, default=None, blank=True, to='main.InstanceGroup', help_text='The Rampart/Instance group the job was run under', null=True),
field=models.ForeignKey(on_delete=models.SET_NULL, default=None, blank=True, to='main.InstanceGroup', help_text='The Instance group the job was run under', null=True),
),
migrations.AddField(
model_name='unifiedjobtemplate',

View File

@@ -16,6 +16,6 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='unifiedjob',
name='instance_group',
field=models.ForeignKey(blank=True, default=None, help_text='The Rampart/Instance group the job was run under', null=True, on_delete=awx.main.utils.polymorphic.SET_NULL, to='main.InstanceGroup'),
field=models.ForeignKey(blank=True, default=None, help_text='The Instance group the job was run under', null=True, on_delete=awx.main.utils.polymorphic.SET_NULL, to='main.InstanceGroup'),
),
]

View File

@@ -0,0 +1,39 @@
# Generated by Django 2.2.11 on 2020-04-03 00:11
from django.db import migrations, models
def remove_manual_inventory_sources(apps, schema_editor):
'''Previously we would automatically create inventory sources after
Group creation, using the parent Group as the user-facing interface.
That process created InventorySource rows with a "manual" source,
which is stored as an empty string.
'''
InventoryUpdate = apps.get_model('main', 'InventoryUpdate')
InventoryUpdate.objects.filter(source='').delete()
InventorySource = apps.get_model('main', 'InventorySource')
InventorySource.objects.filter(source='').delete()
class Migration(migrations.Migration):
dependencies = [
('main', '0113_v370_event_bigint'),
]
operations = [
migrations.RemoveField(
model_name='inventorysource',
name='deprecated_group',
),
migrations.RunPython(remove_manual_inventory_sources),
migrations.AlterField(
model_name='inventorysource',
name='source',
field=models.CharField(choices=[('file', 'File, Directory or Script'), ('scm', 'Sourced from a Project'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('rhv', 'Red Hat Virtualization'), ('tower', 'Ansible Tower'), ('custom', 'Custom Script')], default=None, max_length=32),
),
migrations.AlterField(
model_name='inventoryupdate',
name='source',
field=models.CharField(choices=[('file', 'File, Directory or Script'), ('scm', 'Sourced from a Project'), ('ec2', 'Amazon EC2'), ('gce', 'Google Compute Engine'), ('azure_rm', 'Microsoft Azure Resource Manager'), ('vmware', 'VMware vCenter'), ('satellite6', 'Red Hat Satellite 6'), ('cloudforms', 'Red Hat CloudForms'), ('openstack', 'OpenStack'), ('rhv', 'Red Hat Virtualization'), ('tower', 'Ansible Tower'), ('custom', 'Custom Script')], default=None, max_length=32),
),
]

View File

@@ -3,7 +3,7 @@
# Django
from django.conf import settings # noqa
from django.db import connection, ProgrammingError
from django.db import connection
from django.db.models.signals import pre_delete # noqa
# AWX
@@ -91,14 +91,13 @@ def enforce_bigint_pk_migration():
'main_systemjobevent'
):
with connection.cursor() as cursor:
try:
cursor.execute(f'SELECT MAX(id) FROM _old_{tblname}')
if cursor.fetchone():
from awx.main.tasks import migrate_legacy_event_data
migrate_legacy_event_data.apply_async([tblname])
except ProgrammingError:
# the table is gone (migration is unnecessary)
pass
cursor.execute(
'SELECT 1 FROM information_schema.tables WHERE table_name=%s',
(f'_old_{tblname}',)
)
if bool(cursor.rowcount):
from awx.main.tasks import migrate_legacy_event_data
migrate_legacy_event_data.apply_async([tblname])
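The rewrite above replaces a try/except around `SELECT MAX(id)` with a catalog lookup: on PostgreSQL, a `ProgrammingError` poisons the surrounding transaction, while probing `information_schema.tables` never raises. A runnable SQLite analogue of the same pattern, using `sqlite_master` as the catalog:

```python
import sqlite3

def table_exists(cursor, name):
    # Check catalog metadata instead of catching the "no such table"
    # error -- on PostgreSQL that error would abort the open transaction.
    cursor.execute(
        "SELECT 1 FROM sqlite_master WHERE type='table' AND name=?", (name,)
    )
    return cursor.fetchone() is not None

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE _old_main_jobevent (id INTEGER)')
print(table_exists(cur, '_old_main_jobevent'))         # table is present
print(table_exists(cur, '_old_main_adhoccommandevent'))  # table is gone
```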
def cleanup_created_modified_by(sender, **kwargs):

View File

@@ -821,7 +821,6 @@ class InventorySourceOptions(BaseModel):
injectors = dict()
SOURCE_CHOICES = [
('', _('Manual')),
('file', _('File, Directory or Script')),
('scm', _('Sourced from a Project')),
('ec2', _('Amazon EC2')),
@@ -932,8 +931,8 @@ class InventorySourceOptions(BaseModel):
source = models.CharField(
max_length=32,
choices=SOURCE_CHOICES,
blank=True,
default='',
blank=False,
default=None,
)
source_path = models.CharField(
max_length=1024,
@@ -1237,14 +1236,6 @@ class InventorySource(UnifiedJobTemplate, InventorySourceOptions, CustomVirtualE
on_delete=models.CASCADE,
)
deprecated_group = models.OneToOneField(
'Group',
related_name='deprecated_inventory_source',
null=True,
default=None,
on_delete=models.CASCADE,
)
source_project = models.ForeignKey(
'Project',
related_name='scm_inventory_sources',
@@ -1414,16 +1405,6 @@ class InventorySource(UnifiedJobTemplate, InventorySourceOptions, CustomVirtualE
started=list(started_notification_templates),
success=list(success_notification_templates))
def clean_source(self): # TODO: remove in 3.3
source = self.source
if source and self.deprecated_group:
qs = self.deprecated_group.inventory_sources.filter(source__in=CLOUD_INVENTORY_SOURCES)
existing_sources = qs.exclude(pk=self.pk)
if existing_sources.count():
s = u', '.join([x.deprecated_group.name for x in existing_sources])
raise ValidationError(_('Unable to configure this item for cloud sync. It is already managed by %s.') % s)
return source
def clean_update_on_project_update(self):
if self.update_on_project_update is True and \
self.source == 'scm' and \
@@ -1512,8 +1493,6 @@ class InventoryUpdate(UnifiedJob, InventorySourceOptions, JobNotificationMixin,
if self.inventory_source.inventory is not None:
websocket_data.update(dict(inventory_id=self.inventory_source.inventory.pk))
if self.inventory_source.deprecated_group is not None: # TODO: remove in 3.3
websocket_data.update(dict(group_id=self.inventory_source.deprecated_group.id))
return websocket_data
def get_absolute_url(self, request=None):

View File

@@ -191,7 +191,7 @@ class Schedule(PrimordialModel, LaunchTimeConfig):
return rrule
@classmethod
def rrulestr(cls, rrule, **kwargs):
def rrulestr(cls, rrule, fast_forward=True, **kwargs):
"""
Apply our own custom rrule parsing requirements
"""
@@ -205,11 +205,17 @@ class Schedule(PrimordialModel, LaunchTimeConfig):
'A valid TZID must be provided (e.g., America/New_York)'
)
if 'MINUTELY' in rrule or 'HOURLY' in rrule:
if fast_forward and ('MINUTELY' in rrule or 'HOURLY' in rrule):
try:
first_event = x[0]
if first_event < now() - datetime.timedelta(days=365 * 5):
raise ValueError('RRULE values with more than 1000 events are not allowed.')
if first_event < now():
# hourly/minutely rrules with far-past DTSTART values
# are *really* slow to precompute
# start *from* one week ago to speed things up drastically
dtstart = x._rrule[0]._dtstart.strftime(':%Y%m%dT')
new_start = (now() - datetime.timedelta(days=7)).strftime(':%Y%m%dT')
new_rrule = rrule.replace(dtstart, new_start)
return Schedule.rrulestr(new_rrule, fast_forward=False)
except IndexError:
pass
return x
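The fast-forward branch above rewrites the rule string, swapping a far-past `:YYYYMMDDT` DTSTART date for one a week ago so MINUTELY/HOURLY expansion stays cheap. A stdlib-only sketch of that string rewrite (the real code reads the parsed `_dtstart` from dateutil rather than regexing the string):

```python
import re
from datetime import datetime, timedelta, timezone

def fast_forward_dtstart(rrule_str, now=None):
    # Swap the ':YYYYMMDDT' date portion of DTSTART for one week ago,
    # keeping the time-of-day so events still fire at the same
    # second past the hour/minute.
    now = now or datetime.now(timezone.utc)
    match = re.search(r':(\d{8})T', rrule_str)
    if not match:
        return rrule_str
    new_date = (now - timedelta(days=7)).strftime(':%Y%m%dT')
    return rrule_str.replace(':' + match.group(1) + 'T', new_date, 1)

rule = 'DTSTART;TZID=America/New_York:20051231T000030 RRULE:FREQ=HOURLY;INTERVAL=4'
print(fast_forward_dtstart(rule))
```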

View File

@@ -707,7 +707,7 @@ class UnifiedJob(PolymorphicModel, PasswordFieldsModel, CommonModelNameNotUnique
null=True,
default=None,
on_delete=polymorphic.SET_NULL,
help_text=_('The Rampart/Instance group the job was run under'),
help_text=_('The Instance group the job was run under'),
)
organization = models.ForeignKey(
'Organization',

View File

@@ -89,7 +89,8 @@ class GrafanaBackend(AWXBaseEmailBackend, CustomNotificationBase):
grafana_data['isRegion'] = self.isRegion
grafana_data['dashboardId'] = self.dashboardId
grafana_data['panelId'] = self.panelId
grafana_data['tags'] = self.annotation_tags
if self.annotation_tags:
grafana_data['tags'] = self.annotation_tags
grafana_data['text'] = m.subject
grafana_headers['Authorization'] = "Bearer {}".format(self.grafana_key)
grafana_headers['Content-Type'] = "application/json"

View File

@@ -52,6 +52,7 @@ from awx.conf.utils import conf_to_dict
__all__ = []
logger = logging.getLogger('awx.main.signals')
analytics_logger = logging.getLogger('awx.analytics.activity_stream')
# Update has_active_failures for inventory/groups when a Host/Group is deleted,
# when a Host-Group or Group-Group relationship is updated, or when a Job is deleted
@@ -363,12 +364,24 @@ def model_serializer_mapping():
}
def emit_activity_stream_change(instance):
if 'migrate' in sys.argv:
# don't emit activity stream external logs during migrations;
# they can be really noisy
return
from awx.api.serializers import ActivityStreamSerializer
actor = None
if instance.actor:
actor = instance.actor.username
summary_fields = ActivityStreamSerializer(instance).get_summary_fields(instance)
analytics_logger.info('Activity Stream update entry for %s' % str(instance.object1),
extra=dict(changes=instance.changes, relationship=instance.object_relationship_type,
actor=actor, operation=instance.operation,
object1=instance.object1, object2=instance.object2, summary_fields=summary_fields))
def activity_stream_create(sender, instance, created, **kwargs):
if created and activity_stream_enabled:
# TODO: remove deprecated_group conditional in 3.3
# Skip recording any inventory source directly associated with a group.
if isinstance(instance, InventorySource) and instance.deprecated_group:
return
_type = type(instance)
if getattr(_type, '_deferred', False):
return
@@ -399,6 +412,9 @@ def activity_stream_create(sender, instance, created, **kwargs):
else:
activity_entry.setting = conf_to_dict(instance)
activity_entry.save()
connection.on_commit(
lambda: emit_activity_stream_change(activity_entry)
)
def activity_stream_update(sender, instance, **kwargs):
@@ -430,15 +446,14 @@ def activity_stream_update(sender, instance, **kwargs):
else:
activity_entry.setting = conf_to_dict(instance)
activity_entry.save()
connection.on_commit(
lambda: emit_activity_stream_change(activity_entry)
)
def activity_stream_delete(sender, instance, **kwargs):
if not activity_stream_enabled:
return
# TODO: remove deprecated_group conditional in 3.3
# Skip recording any inventory source directly associated with a group.
if isinstance(instance, InventorySource) and instance.deprecated_group:
return
# Inventory delete happens in the task system rather than request-response-cycle.
# If we trigger this handler there we may fall into db-integrity-related race conditions.
# So we add flag verification to prevent normal signal handling. This function will be
@@ -467,6 +482,9 @@ def activity_stream_delete(sender, instance, **kwargs):
object1=object1,
actor=get_current_user_or_none())
activity_entry.save()
connection.on_commit(
lambda: emit_activity_stream_change(activity_entry)
)
def activity_stream_associate(sender, instance, **kwargs):
@@ -540,6 +558,9 @@ def activity_stream_associate(sender, instance, **kwargs):
activity_entry.role.add(role)
activity_entry.object_relationship_type = obj_rel
activity_entry.save()
connection.on_commit(
lambda: emit_activity_stream_change(activity_entry)
)
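Each `activity_entry.save()` above now schedules the external log emission via `connection.on_commit`, so nothing is logged for a row that a rollback would erase. A toy model of that deferred-callback behavior (`FakeConnection` is illustrative, not Django's implementation):

```python
# Callbacks queue up and run only when the transaction commits, so the
# activity-stream log line is never emitted for a rolled-back row.
class FakeConnection:
    def __init__(self):
        self._callbacks = []

    def on_commit(self, func):
        self._callbacks.append(func)

    def commit(self):
        callbacks, self._callbacks = self._callbacks, []
        for func in callbacks:
            func()

    def rollback(self):
        self._callbacks.clear()  # discarded: callbacks never fire

emitted = []
conn = FakeConnection()
activity_entry = {'operation': 'create', 'object1': 'job_template'}
conn.on_commit(lambda: emitted.append(activity_entry['object1']))
print(emitted)   # nothing fires before commit
conn.commit()
print(emitted)
```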
@receiver(current_user_getter)

View File

@@ -73,6 +73,7 @@ from awx.main.utils import (get_ssh_version, update_scm_url,
get_awx_version)
from awx.main.utils.ansible import read_ansible_config
from awx.main.utils.common import _get_ansible_version, get_custom_venv_choices
from awx.main.utils.external_logging import reconfigure_rsyslog
from awx.main.utils.safe_yaml import safe_dump, sanitize_jinja
from awx.main.utils.reload import stop_local_services
from awx.main.utils.pglock import advisory_lock
@@ -140,6 +141,9 @@ def dispatch_startup():
# and Tower fall out of use/support, we can probably just _assume_ that
# everybody has moved to bigint, and remove this code entirely
enforce_bigint_pk_migration()
# Update Tower's rsyslog.conf file based on logging settings in the db
reconfigure_rsyslog()
def inform_cluster_of_shutdown():
@@ -280,6 +284,12 @@ def handle_setting_changes(setting_keys):
logger.debug('cache delete_many(%r)', cache_keys)
cache.delete_many(cache_keys)
if any([
setting.startswith('LOG_AGGREGATOR')
for setting in setting_keys
]):
connection.on_commit(reconfigure_rsyslog)
@task(queue='tower_broadcast_all')
def delete_project_files(project_path):
@@ -1466,7 +1476,7 @@ class BaseTask(object):
params.get('module'),
module_args,
ident=str(self.instance.pk))
self.event_ct = len(isolated_manager_instance.handled_events)
self.finished_callback(None)
else:
res = ansible_runner.interface.run(**params)
status = res.status

View File

@@ -0,0 +1,78 @@
import pytest
import tempfile
import os
import shutil
import csv
from django.utils.timezone import now
from django.db.backends.sqlite3.base import SQLiteCursorWrapper
from awx.main.analytics import collectors
from awx.main.models import (
ProjectUpdate,
InventorySource,
)
@pytest.fixture
def sqlite_copy_expert(request):
# copy_expert is postgres-specific, and SQLite doesn't support it; mock its
# behavior to test that it writes a file that contains stdout from events
path = tempfile.mkdtemp(prefix='copied_tables')
def write_stdout(self, sql, fd):
# It would be better if we properly dissected the SQL query and verified
# it that way. Instead, we take the naive approach here.
assert sql.startswith('COPY (')
assert sql.endswith(') TO STDOUT WITH CSV HEADER')
sql = sql.replace('COPY (', '')
sql = sql.replace(') TO STDOUT WITH CSV HEADER', '')
# Remove JSON style queries
# TODO: could replace JSON style queries with sqlite kind of equivalents
sql_new = []
for line in sql.split('\n'):
if line.find('main_jobevent.event_data::') == -1:
sql_new.append(line)
elif not line.endswith(','):
sql_new[-1] = sql_new[-1].rstrip(',')
sql = '\n'.join(sql_new)
self.execute(sql)
results = self.fetchall()
headers = [i[0] for i in self.description]
csv_handle = csv.writer(fd, delimiter=',', quoting=csv.QUOTE_ALL, escapechar='\\', lineterminator='\n')
csv_handle.writerow(headers)
csv_handle.writerows(results)
setattr(SQLiteCursorWrapper, 'copy_expert', write_stdout)
request.addfinalizer(lambda: shutil.rmtree(path))
request.addfinalizer(lambda: delattr(SQLiteCursorWrapper, 'copy_expert'))
return path
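The fixture above monkey-patches a `copy_expert` method onto SQLite's cursor. The core trick, shown standalone: peel the `COPY ( … ) TO STDOUT WITH CSV HEADER` wrapper off, run the inner SELECT, and serialize the rows with `csv.writer`:

```python
import csv
import io
import sqlite3

def copy_expert(cursor, sql, fd):
    # Emulate psycopg2's cursor.copy_expert for a COPY ... TO STDOUT query
    # by executing the inner SELECT and writing CSV by hand.
    prefix, suffix = 'COPY (', ') TO STDOUT WITH CSV HEADER'
    assert sql.startswith(prefix) and sql.endswith(suffix)
    cursor.execute(sql[len(prefix):-len(suffix)])
    writer = csv.writer(fd, quoting=csv.QUOTE_ALL,
                        escapechar='\\', lineterminator='\n')
    writer.writerow([col[0] for col in cursor.description])
    writer.writerows(cursor.fetchall())

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE job (id INTEGER, name TEXT)')
cur.execute("INSERT INTO job VALUES (1, 'demo')")
buf = io.StringIO()
copy_expert(cur, 'COPY (SELECT id, name FROM job) TO STDOUT WITH CSV HEADER', buf)
print(buf.getvalue())
```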
@pytest.mark.django_db
def test_copy_tables_unified_job_query(sqlite_copy_expert, project, inventory, job_template):
'''
Ensure that various unified job types are in the output of the query.
'''
time_start = now()
inv_src = InventorySource.objects.create(name="inventory_update1", inventory=inventory, source='gce')
project_update_name = ProjectUpdate.objects.create(project=project, name="project_update1").name
inventory_update_name = inv_src.create_unified_job().name
job_name = job_template.create_unified_job().name
with tempfile.TemporaryDirectory() as tmpdir:
collectors.copy_tables(time_start, tmpdir)
with open(os.path.join(tmpdir, 'unified_jobs_table.csv')) as f:
lines = ''.join([l for l in f])
assert project_update_name in lines
assert inventory_update_name in lines
assert job_name in lines

View File

@@ -13,10 +13,10 @@ def test_empty():
"active_host_count": 0,
"credential": 0,
"custom_inventory_script": 0,
"custom_virtualenvs": 0, # dev env ansible3
"custom_virtualenvs": 0, # dev env ansible3
"host": 0,
"inventory": 0,
"inventories": {'normal': 0, 'smart': 0},
"inventories": {"normal": 0, "smart": 0},
"job_template": 0,
"notification_template": 0,
"organization": 0,
@@ -27,28 +27,97 @@ def test_empty():
"user": 0,
"workflow_job_template": 0,
"unified_job": 0,
"pending_jobs": 0
"pending_jobs": 0,
}
@pytest.mark.django_db
def test_database_counts(organization_factory, job_template_factory,
workflow_job_template_factory):
objs = organization_factory('org', superusers=['admin'])
jt = job_template_factory('test', organization=objs.organization,
inventory='test_inv', project='test_project',
credential='test_cred')
workflow_job_template_factory('test')
def test_database_counts(
organization_factory, job_template_factory, workflow_job_template_factory
):
objs = organization_factory("org", superusers=["admin"])
jt = job_template_factory(
"test",
organization=objs.organization,
inventory="test_inv",
project="test_project",
credential="test_cred",
)
workflow_job_template_factory("test")
models.Team(organization=objs.organization).save()
models.Host(inventory=jt.inventory).save()
models.Schedule(
rrule='DTSTART;TZID=America/New_York:20300504T150000',
unified_job_template=jt.job_template
rrule="DTSTART;TZID=America/New_York:20300504T150000",
unified_job_template=jt.job_template,
).save()
models.CustomInventoryScript(organization=objs.organization).save()
counts = collectors.counts(None)
for key in ('organization', 'team', 'user', 'inventory', 'credential',
'project', 'job_template', 'workflow_job_template', 'host',
'schedule', 'custom_inventory_script'):
for key in (
"organization",
"team",
"user",
"inventory",
"credential",
"project",
"job_template",
"workflow_job_template",
"host",
"schedule",
"custom_inventory_script",
):
assert counts[key] == 1
@pytest.mark.django_db
def test_inventory_counts(organization_factory, inventory_factory):
(inv1, inv2, inv3) = [inventory_factory(f"inv-{i}") for i in range(3)]
s1 = inv1.inventory_sources.create(name="src1", source="ec2")
s2 = inv1.inventory_sources.create(name="src2", source="file")
s3 = inv1.inventory_sources.create(name="src3", source="gce")
s1.hosts.create(name="host1", inventory=inv1)
s1.hosts.create(name="host2", inventory=inv1)
s1.hosts.create(name="host3", inventory=inv1)
s2.hosts.create(name="host4", inventory=inv1)
s2.hosts.create(name="host5", inventory=inv1)
s3.hosts.create(name="host6", inventory=inv1)
s1 = inv2.inventory_sources.create(name="src1", source="ec2")
s1.hosts.create(name="host1", inventory=inv2)
s1.hosts.create(name="host2", inventory=inv2)
s1.hosts.create(name="host3", inventory=inv2)
inv_counts = collectors.inventory_counts(None)
assert {
inv1.id: {
"name": "inv-0",
"kind": "",
"hosts": 6,
"sources": 3,
"source_list": [
{"name": "src1", "source": "ec2", "num_hosts": 3},
{"name": "src2", "source": "file", "num_hosts": 2},
{"name": "src3", "source": "gce", "num_hosts": 1},
],
},
inv2.id: {
"name": "inv-1",
"kind": "",
"hosts": 3,
"sources": 1,
"source_list": [{"name": "src1", "source": "ec2", "num_hosts": 3}],
},
inv3.id: {
"name": "inv-2",
"kind": "",
"hosts": 0,
"sources": 0,
"source_list": [],
},
} == inv_counts

View File

@@ -1,7 +1,6 @@
import pytest
from awx.api.versioning import reverse
from awx.main.middleware import ActivityStreamMiddleware
from awx.main.models.activity_stream import ActivityStream
from awx.main.access import ActivityStreamAccess
from awx.conf.models import Setting
@@ -61,28 +60,6 @@ def test_ctint_activity_stream(monkeypatch, get, user, settings):
assert response.data['summary_fields']['setting'][0]['name'] == 'FOO'
@pytest.mark.django_db
def test_middleware_actor_added(monkeypatch, post, get, user, settings):
settings.ACTIVITY_STREAM_ENABLED = True
u = user('admin-poster', True)
url = reverse('api:organization_list')
response = post(url,
dict(name='test-org', description='test-desc'),
u,
middleware=ActivityStreamMiddleware())
assert response.status_code == 201
org_id = response.data['id']
activity_stream = ActivityStream.objects.filter(organization__pk=org_id).first()
url = reverse('api:activity_stream_detail', kwargs={'pk': activity_stream.pk})
response = get(url, u)
assert response.status_code == 200
assert response.data['summary_fields']['actor']['username'] == 'admin-poster'
@pytest.mark.django_db
def test_rbac_stream_resource_roles(activity_stream_entry, organization, org_admin, settings):
settings.ACTIVITY_STREAM_ENABLED = True

View File

@@ -972,7 +972,7 @@ def test_field_removal(put, organization, admin, credentialtype_ssh):
['insights_inventories', Inventory()],
['unifiedjobs', Job()],
['unifiedjobtemplates', JobTemplate()],
['unifiedjobtemplates', InventorySource()],
['unifiedjobtemplates', InventorySource(source='ec2')],
['projects', Project()],
['workflowjobnodes', WorkflowJobNode()],
])

View File

@@ -1,6 +1,8 @@
import pytest
import base64
import json
import time
import pytest
from django.db import connection
from django.test.utils import override_settings
@@ -326,6 +328,38 @@ def test_refresh_accesstoken(oauth_application, post, get, delete, admin):
assert original_refresh_token.revoked # is not None
@pytest.mark.django_db
def test_refresh_token_expiration_is_respected(oauth_application, post, get, delete, admin):
response = post(
reverse('api:o_auth2_application_token_list', kwargs={'pk': oauth_application.pk}),
{'scope': 'read'}, admin, expect=201
)
assert AccessToken.objects.count() == 1
assert RefreshToken.objects.count() == 1
refresh_token = RefreshToken.objects.get(token=response.data['refresh_token'])
refresh_url = drf_reverse('api:oauth_authorization_root_view') + 'token/'
short_lived = {
'ACCESS_TOKEN_EXPIRE_SECONDS': 1,
'AUTHORIZATION_CODE_EXPIRE_SECONDS': 1,
'REFRESH_TOKEN_EXPIRE_SECONDS': 1
}
time.sleep(1)
with override_settings(OAUTH2_PROVIDER=short_lived):
response = post(
refresh_url,
data='grant_type=refresh_token&refresh_token=' + refresh_token.token,
content_type='application/x-www-form-urlencoded',
HTTP_AUTHORIZATION='Basic ' + smart_str(base64.b64encode(smart_bytes(':'.join([
oauth_application.client_id, oauth_application.client_secret
]))))
)
assert response.status_code == 403
assert b'The refresh token has expired.' in response.content
assert RefreshToken.objects.filter(token=refresh_token).exists()
assert AccessToken.objects.count() == 1
assert RefreshToken.objects.count() == 1
@pytest.mark.django_db
def test_revoke_access_then_refreshtoken(oauth_application, post, get, delete, admin):

View File

@@ -1,6 +1,8 @@
import datetime
import pytest
from django.utils.encoding import smart_str
from django.utils.timezone import now
from awx.api.versioning import reverse
from awx.main.models import JobTemplate, Schedule
@@ -140,7 +142,6 @@ def test_encrypted_survey_answer(post, patch, admin_user, project, inventory, su
("DTSTART:20030925T104941Z RRULE:FREQ=DAILY;INTERVAL=10;COUNT=500;UNTIL=20040925T104941Z", "RRULE may not contain both COUNT and UNTIL"), # noqa
("DTSTART;TZID=America/New_York:20300308T050000Z RRULE:FREQ=DAILY;INTERVAL=1", "rrule parsing failed validation"),
("DTSTART:20300308T050000 RRULE:FREQ=DAILY;INTERVAL=1", "DTSTART cannot be a naive datetime"),
("DTSTART:19700101T000000Z RRULE:FREQ=MINUTELY;INTERVAL=1", "more than 1000 events are not allowed"), # noqa
])
def test_invalid_rrules(post, admin_user, project, inventory, rrule, error):
job_template = JobTemplate.objects.create(
@@ -342,6 +343,40 @@ def test_months_with_31_days(post, admin_user):
]
@pytest.mark.django_db
@pytest.mark.timeout(3)
@pytest.mark.parametrize('freq, delta, total_seconds', (
('MINUTELY', 1, 60),
('MINUTELY', 15, 15 * 60),
('HOURLY', 1, 3600),
('HOURLY', 4, 3600 * 4),
))
def test_really_old_dtstart(post, admin_user, freq, delta, total_seconds):
url = reverse('api:schedule_rrule')
# every <interval>, at the :30 second mark
rrule = f'DTSTART;TZID=America/New_York:20051231T000030 RRULE:FREQ={freq};INTERVAL={delta}'
start = now()
next_ten = post(url, {'rrule': rrule}, admin_user, expect=200).data['utc']
assert len(next_ten) == 10
# the first date is *in the future*
assert next_ten[0] >= start
# ...but *no more than* <interval> into the future
assert next_ten[0] <= now() + datetime.timedelta(**{
'minutes' if freq == 'MINUTELY' else 'hours': delta
})
# every date in the list is <interval> greater than the last
for i, x in enumerate(next_ten):
if i == 0:
continue
assert x.second == 30
delta = (x - next_ten[i - 1])
assert delta.total_seconds() == total_seconds
def test_dst_rollback_duplicates(post, admin_user):
# From Nov 2 -> Nov 3, 2030, daylight savings ends and we "roll back" an hour.
# Make sure we don't "double count" duplicate times in the "rolled back"

View File

@@ -5,17 +5,13 @@
# Python
import pytest
import os
import time
from django.conf import settings
# Mock
from unittest import mock
# AWX
from awx.api.versioning import reverse
from awx.conf.models import Setting
from awx.main.utils.handlers import AWXProxyHandler, LoggingConnectivityException
from awx.conf.registry import settings_registry
TEST_GIF_LOGO = 'data:image/gif;base64,R0lGODlhIQAjAPIAAP//////AP8AAMzMAJmZADNmAAAAAAAAACH/C05FVFNDQVBFMi4wAwEAAAAh+QQJCgAHACwAAAAAIQAjAAADo3i63P4wykmrvTjrzZsxXfR94WMQBFh6RECuixHMLyzPQ13ewZCvow9OpzEAjIBj79cJJmU+FceIVEZ3QRozxBttmyOBwPBtisdX4Bha3oxmS+llFIPHQXQKkiSEXz9PeklHBzx3hYNyEHt4fmmAhHp8Nz45KgV5FgWFOFEGmwWbGqEfniChohmoQZ+oqRiZDZhEgk81I4mwg4EKVbxzrDHBEAkAIfkECQoABwAsAAAAACEAIwAAA6V4utz+MMpJq724GpP15p1kEAQYQmOwnWjgrmxjuMEAx8rsDjZ+fJvdLWQAFAHGWo8FRM54JqIRmYTigDrDMqZTbbbMj0CgjTLHZKvPQH6CTx+a2vKR0XbbOsoZ7SphG057gjl+c0dGgzeGNiaBiSgbBQUHBV08NpOVlkMSk0FKjZuURHiiOJxQnSGfQJuoEKREejK0dFRGjoiQt7iOuLx0rgxYEQkAIfkECQoABwAsAAAAACEAIwAAA7h4utxnxslJDSGR6nrz/owxYB64QUEwlGaVqlB7vrAJscsd3Lhy+wBArGEICo3DUFH4QDqK0GMy51xOgcGlEAfJ+iAFie62chR+jYKaSAuQGOqwJp7jGQRDuol+F/jxZWsyCmoQfwYwgoM5Oyg1i2w0A2WQIW2TPYOIkleQmy+UlYygoaIPnJmapKmqKiusMmSdpjxypnALtrcHioq3ury7hGm3dnVosVpMWFmwREZbddDOSsjVswcJACH5BAkKAAcALAAAAAAhACMAAAOxeLrc/jDKSZUxNS9DCNYV54HURQwfGRlDEFwqdLVuGjOsW9/Odb0wnsUAKBKNwsMFQGwyNUHckVl8bqI4o43lA26PNkv1S9DtNuOeVirw+aTI3qWAQwnud1vhLSnQLS0GeFF+GoVKNF0fh4Z+LDQ6Bn5/MTNmL0mAl2E3j2aclTmRmYCQoKEDiaRDKFhJez6UmbKyQowHtzy1uEl8DLCnEktrQ2PBD1NxSlXKIW5hz6cJACH5BAkKAAcALAAAAAAhACMAAAOkeLrc/jDKSau9OOvNlTFd9H3hYxAEWDJfkK5LGwTq+g0zDR/GgM+10A04Cm56OANgqTRmkDTmSOiLMgFOTM9AnFJHuexzYBAIijZf2SweJ8ttbbXLmd5+wBiJosSCoGF/fXEeS1g8gHl9hxODKkh4gkwVIwUekESIhA4FlgV3PyCWG52WI2oGnR2lnUWpqhqVEF4Xi7QjhpsshpOFvLosrnpoEAkAIfkECQoABwAsAAAAACEAIwAAA6l4utz+MMpJq71YGpPr3t1kEAQXQltQnk8aBCa7bMMLy4wx1G8s072PL6SrGQDI4zBThCU/v50zCVhidIYgNPqxWZkDg0AgxB2K4vEXbBSvr1JtZ3uOext0x7FqovF6OXtfe1UzdjAxhINPM013ChtJER8FBQeVRX8GlpggFZWWfjwblTiigGZnfqRmpUKbljKxDrNMeY2eF4R8jUiSur6/Z8GFV2WBtwwJACH5BAkKAAcALAAAAAAhACMAAAO6eLrcZi3KyQwhkGpq8f6ONWQgaAxB8JTfg6YkO50pzD5xhaurhCsGAKCnEw6NucNDCAkyI8ugdAhFKpnJJdMaeiofBejowUseCr9GYa0j1GyMdVgjBxoEuPSZXWKf7gKBeHtzMms0gHgGfDIVLztmjScvNZEyk28qjT40b5aXlHCbDgOhnzedoqOOlKeopaqrCy56sgtotbYKhYW6e7e9tsHBssO6eSTIm1peV0iuFUZDyU7NJnmcuQsJACH5BAkKAAcALAAAAAAhACMAAAOteLrc/jDKSZsxNS9DCNYV54Hh4H0kdAXBgKaOwbYX/Miza1vrVe8KA
2AoJL5gwiQgeZz4GMXlcHl8xozQ3kW3KTajL9zsBJ1+sV2fQfALem+XAlRApxu4ioI1UpC76zJ4fRqDBzI+LFyFhH1iiS59fkgziW07jjRAG5QDeECOLk2Tj6KjnZafW6hAej6Smgevr6yysza2tiCuMasUF2Yov2gZUUQbU8YaaqjLpQkAOw==' # NOQA
@@ -237,73 +233,95 @@ def test_ui_settings(get, put, patch, delete, admin):
@pytest.mark.django_db
def test_logging_aggregrator_connection_test_requires_superuser(get, post, alice):
def test_logging_aggregator_connection_test_requires_superuser(post, alice):
url = reverse('api:setting_logging_test')
post(url, {}, user=alice, expect=403)
@pytest.mark.parametrize('key', [
'LOG_AGGREGATOR_TYPE',
'LOG_AGGREGATOR_HOST',
@pytest.mark.django_db
def test_logging_aggregator_connection_test_not_enabled(post, admin):
url = reverse('api:setting_logging_test')
resp = post(url, {}, user=admin, expect=409)
assert 'Logging not enabled' in resp.data.get('error')
def _mock_logging_defaults():
# Pre-populate settings obj with defaults
class MockSettings:
pass
mock_settings_obj = MockSettings()
mock_settings_json = dict()
for key in settings_registry.get_registered_settings(category_slug='logging'):
value = settings_registry.get_setting_field(key).get_default()
setattr(mock_settings_obj, key, value)
mock_settings_json[key] = value
setattr(mock_settings_obj, 'MAX_EVENT_RES_DATA', 700000)
return mock_settings_obj, mock_settings_json
@pytest.mark.parametrize('key, value, error', [
['LOG_AGGREGATOR_TYPE', 'logstash', 'Cannot enable log aggregator without providing host.'],
['LOG_AGGREGATOR_HOST', 'https://logstash', 'Cannot enable log aggregator without providing type.']
])
@pytest.mark.django_db
def test_logging_aggregator_missing_settings(put, post, admin, key, value, error):
_, mock_settings = _mock_logging_defaults()
mock_settings['LOG_AGGREGATOR_ENABLED'] = True
mock_settings[key] = value
url = reverse('api:setting_singleton_detail', kwargs={'category_slug': 'logging'})
response = put(url, data=mock_settings, user=admin, expect=400)
assert error in str(response.data)
@pytest.mark.parametrize('type, host, port, username, password', [
['logstash', 'localhost', 8080, 'logger', 'mcstash'],
['loggly', 'http://logs-01.loggly.com/inputs/1fd38090-hash-h4a$h-8d80-t0k3n71/tag/http/', None, None, None],
['splunk', 'https://yoursplunk:8088/services/collector/event', None, None, None],
['other', '97.221.40.41', 9000, 'logger', 'mcstash'],
['sumologic', 'https://endpoint5.collection.us2.sumologic.com/receiver/v1/http/Zagnw_f9XGr_zZgd-_EPM0hb8_rUU7_RU8Q==',
None, None, None]
])
@pytest.mark.django_db
def test_logging_aggregator_valid_settings(put, post, admin, type, host, port, username, password):
_, mock_settings = _mock_logging_defaults()
# type = 'splunk'
# host = 'https://yoursplunk:8088/services/collector/event'
mock_settings['LOG_AGGREGATOR_ENABLED'] = True
mock_settings['LOG_AGGREGATOR_TYPE'] = type
mock_settings['LOG_AGGREGATOR_HOST'] = host
if port:
mock_settings['LOG_AGGREGATOR_PORT'] = port
if username:
mock_settings['LOG_AGGREGATOR_USERNAME'] = username
if password:
mock_settings['LOG_AGGREGATOR_PASSWORD'] = password
url = reverse('api:setting_singleton_detail', kwargs={'category_slug': 'logging'})
response = put(url, data=mock_settings, user=admin, expect=200)
assert type in response.data.get('LOG_AGGREGATOR_TYPE')
assert host in response.data.get('LOG_AGGREGATOR_HOST')
if port:
assert port == response.data.get('LOG_AGGREGATOR_PORT')
if username:
assert username in response.data.get('LOG_AGGREGATOR_USERNAME')
if password: # Note: password should be encrypted
assert '$encrypted$' in response.data.get('LOG_AGGREGATOR_PASSWORD')
@pytest.mark.django_db
def test_logging_aggregator_connection_test_valid(put, post, admin):
_, mock_settings = _mock_logging_defaults()
type = 'other'
host = 'https://localhost'
mock_settings['LOG_AGGREGATOR_ENABLED'] = True
mock_settings['LOG_AGGREGATOR_TYPE'] = type
mock_settings['LOG_AGGREGATOR_HOST'] = host
# POST to save these mock settings
url = reverse('api:setting_singleton_detail', kwargs={'category_slug': 'logging'})
put(url, data=mock_settings, user=admin, expect=200)
# "Test" the logger
url = reverse('api:setting_logging_test')
post(url, {}, user=admin, expect=202)
@pytest.mark.django_db


@@ -23,9 +23,9 @@ def _mk_project_update():
def _mk_inventory_update():
source = InventorySource(source='ec2')
source.save()
iu = InventoryUpdate(inventory_source=source, source='ec2')
return iu


@@ -123,7 +123,11 @@ def test_delete_project_update_in_active_state(project, delete, admin, status):
@pytest.mark.parametrize("status", list(TEST_STATES))
@pytest.mark.django_db
def test_delete_inventory_update_in_active_state(inventory_source, delete, admin, status):
i = InventoryUpdate.objects.create(
inventory_source=inventory_source,
status=status,
source=inventory_source.source
)
url = reverse('api:inventory_update_detail', kwargs={'pk': i.pk})
delete(url, None, admin, expect=403)


@@ -228,7 +228,7 @@ class TestINIImports:
assert inventory.hosts.count() == 1 # baseline worked
inv_src2 = inventory.inventory_sources.create(
name='bar', overwrite=True, source='ec2'
)
os.environ['INVENTORY_SOURCE_ID'] = str(inv_src2.pk)
os.environ['INVENTORY_UPDATE_ID'] = str(inv_src2.create_unified_job().pk)


@@ -568,7 +568,10 @@ def inventory_source_factory(inventory_factory):
@pytest.fixture
def inventory_update(inventory_source):
return InventoryUpdate.objects.create(
inventory_source=inventory_source,
source=inventory_source.source
)
@pytest.fixture


@@ -197,9 +197,10 @@ class TestRelatedJobs:
assert job.id in [jerb.id for jerb in group._get_related_jobs()]
def test_related_group_update(self, group):
src = group.inventory_sources.create(name='foo', source='ec2')
job = InventoryUpdate.objects.create(
inventory_source=src,
source=src.source
)
assert job.id in [jerb.id for jerb in group._get_related_jobs()]


@@ -109,6 +109,7 @@ class TestJobNotificationMixin(object):
kwargs = {}
if JobClass is InventoryUpdate:
kwargs['inventory_source'] = inventory_source
kwargs['source'] = inventory_source.source
elif JobClass is ProjectUpdate:
kwargs['project'] = project


@@ -325,16 +325,19 @@ def test_dst_phantom_hour(job_template):
@pytest.mark.django_db
@pytest.mark.timeout(3)
def test_beginning_of_time(job_template):
# ensure that really large generators don't have performance issues
start = now()
rrule = 'DTSTART:19700101T000000Z RRULE:FREQ=MINUTELY;INTERVAL=1'
s = Schedule(
name='Some Schedule',
rrule=rrule,
unified_job_template=job_template
)
s.save()
assert s.next_run > start
assert (s.next_run - start).total_seconds() < 60
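The performance concern behind this test (a `FREQ=MINUTELY` rule anchored at the 1970 epoch must not be expanded occurrence-by-occurrence) can be sketched with plain datetime arithmetic; `next_run_minutely` below is a hypothetical stand-in for whatever `Schedule.next_run` computes internally, not the actual model code:

```python
from datetime import datetime, timedelta, timezone

def next_run_minutely(start: datetime) -> datetime:
    # A MINUTELY rule with DTSTART at the 1970 epoch has tens of millions of
    # occurrences before `start`; computing the next one arithmetically avoids
    # expanding the whole series (hypothetical helper, not Schedule.next_run).
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    interval = timedelta(minutes=1)
    elapsed = (start - epoch) // interval       # whole intervals already past
    return epoch + (elapsed + 1) * interval     # first occurrence after start

now_ = datetime.now(timezone.utc)
nxt = next_run_minutely(now_)
assert nxt > now_
assert (nxt - now_) <= timedelta(minutes=1)
```

The same fast-forward idea (moving `dtstart` close to "now" before asking for the next occurrence) is what makes very old `DTSTART` values cheap regardless of the rule's age.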
@pytest.mark.django_db


@@ -297,7 +297,10 @@ class TestInstanceGroupOrdering:
assert ad_hoc.preferred_instance_groups == [ig_inv, ig_org]
def test_inventory_update_instance_groups(self, instance_group_factory, inventory_source, default_instance_group):
iu = InventoryUpdate.objects.create(
inventory_source=inventory_source,
source=inventory_source.source
)
assert iu.preferred_instance_groups == [default_instance_group]
ig_org = instance_group_factory("OrgIstGrp", [default_instance_group.instances.first()])
ig_inv = instance_group_factory("InvIstGrp", [default_instance_group.instances.first()])


@@ -186,7 +186,11 @@ def test_group(get, admin_user):
def test_inventory_source(get, admin_user):
test_org = Organization.objects.create(name='test_org')
test_inv = Inventory.objects.create(name='test_inv', organization=test_org)
test_source = InventorySource.objects.create(
name='test_source',
inventory=test_inv,
source='ec2'
)
url = reverse('api:inventory_source_detail', kwargs={'pk': test_source.pk})
response = get(url, user=admin_user, expect=200)
assert response.data['related']['named_url'].endswith('/test_source++test_inv++test_org/')


@@ -90,7 +90,7 @@ def test_inherited_notification_templates(get, post, user, organization, project
notification_templates.append(response.data['id'])
i = Inventory.objects.create(name='test', organization=organization)
i.save()
isrc = InventorySource.objects.create(name='test', inventory=i, source='ec2')
isrc.save()
jt = JobTemplate.objects.create(name='test', inventory=i, project=project, playbook='debug.yml')
jt.save()


@@ -65,14 +65,29 @@ def test_job_template_access_read_level(jt_linked, rando):
assert not access.can_unattach(jt_linked, cred, 'credentials', {})
@pytest.mark.django_db
def test_project_use_access(project, rando):
project.use_role.members.add(rando)
access = JobTemplateAccess(rando)
assert access.can_add({'project': project.id, 'ask_inventory_on_launch': True})
project2 = Project.objects.create(
name='second-project', scm_type=project.scm_type, playbook_files=project.playbook_files,
organization=project.organization,
)
project2.use_role.members.add(rando)
jt = JobTemplate.objects.create(project=project, ask_inventory_on_launch=True)
jt.admin_role.members.add(rando)
assert access.can_change(jt, {'project': project2.pk})
@pytest.mark.django_db
def test_job_template_access_use_level(jt_linked, rando):
access = JobTemplateAccess(rando)
jt_linked.project.use_role.members.add(rando)
jt_linked.inventory.use_role.members.add(rando)
jt_linked.organization.job_template_admin_role.members.add(rando)
jt_linked.admin_role.members.add(rando)
proj_pk = jt_linked.project.pk
proj_pk = jt_linked.project.pk
assert access.can_change(jt_linked, {'job_type': 'check', 'project': proj_pk})
assert access.can_change(jt_linked, {'job_type': 'check', 'inventory': None})
@@ -80,8 +95,8 @@ def test_job_template_access_use_level(jt_linked, rando):
for cred in jt_linked.credentials.all():
assert access.can_unattach(jt_linked, cred, 'credentials', {})
assert access.can_add(dict(inventory=jt_linked.inventory.pk, project=proj_pk))
assert access.can_add(dict(project=proj_pk))
@pytest.mark.django_db
@@ -94,17 +109,16 @@ def test_job_template_access_admin(role_names, jt_linked, rando):
assert not access.can_read(jt_linked)
assert not access.can_delete(jt_linked)
# Appoint this user to the org role
organization = jt_linked.organization
for role_name in role_names:
getattr(organization, role_name).members.add(rando)
# Assign organization permission in the same way the create view does
organization = jt_linked.inventory.organization
ssh_cred.admin_role.parents.add(organization.admin_role)
proj_pk = jt_linked.project.pk
assert access.can_add(dict(inventory=jt_linked.inventory.pk, project=proj_pk))
for cred in jt_linked.credentials.all():
assert access.can_unattach(jt_linked, cred, 'credentials', {})
@@ -170,12 +184,10 @@ class TestOrphanJobTemplate:
@pytest.mark.job_permissions
def test_job_template_creator_access(project, organization, rando, post):
project.use_role.members.add(rando)
response = post(url=reverse('api:job_template_list'), data=dict(
name='newly-created-jt',
ask_inventory_on_launch=True,
project=project.pk,
playbook='helloworld.yml'
), user=rando, expect=201)


@@ -24,7 +24,11 @@ def test_implied_organization_subquery_inventory():
inventory = Inventory.objects.create(name='foo{}'.format(i))
else:
inventory = Inventory.objects.create(name='foo{}'.format(i), organization=org)
inv_src = InventorySource.objects.create(
name='foo{}'.format(i),
inventory=inventory,
source='ec2'
)
sources = UnifiedJobTemplate.objects.annotate(
test_field=rbac.implicit_org_subquery(UnifiedJobTemplate, InventorySource)
)


@@ -60,6 +60,8 @@ def test_org_user_role_attach(user, organization, inventory):
'''
admin = user('admin')
nonmember = user('nonmember')
other_org = Organization.objects.create(name="other_org")
other_org.member_role.members.add(nonmember)
inventory.admin_role.members.add(nonmember)
organization.admin_role.members.add(admin)
@@ -186,13 +188,17 @@ def test_need_all_orgs_to_admin_user(user):
# Orphaned user can be added to member role, only in special cases
@pytest.mark.django_db
def test_orphaned_user_allowed(org_admin, rando, organization, org_credential):
'''
We still allow adoption of orphaned* users by assigning them to
organization member role, but only in the situation where the
org admin already posesses indirect access to all of the user's roles
*orphaned means user is not a member of any organization
'''
# give a descendent role to rando, to trigger the conditional
# where all ancestor roles of rando should be in the set of
# org_admin roles.
org_credential.admin_role.members.add(rando)
role_access = RoleAccess(org_admin)
org_access = OrganizationAccess(org_admin)
assert role_access.can_attach(organization.member_role, rando, 'members', None)


@@ -0,0 +1,174 @@
import pytest
from django.conf import settings
from awx.main.utils.external_logging import construct_rsyslog_conf_template
from awx.main.tests.functional.api.test_settings import _mock_logging_defaults
'''
# Example User Data
data_logstash = {
"LOG_AGGREGATOR_TYPE": "logstash",
"LOG_AGGREGATOR_HOST": "localhost",
"LOG_AGGREGATOR_PORT": 8080,
"LOG_AGGREGATOR_PROTOCOL": "tcp",
"LOG_AGGREGATOR_USERNAME": "logger",
"LOG_AGGREGATOR_PASSWORD": "mcstash"
}
data_netcat = {
"LOG_AGGREGATOR_TYPE": "other",
"LOG_AGGREGATOR_HOST": "localhost",
"LOG_AGGREGATOR_PORT": 9000,
"LOG_AGGREGATOR_PROTOCOL": "udp",
}
data_loggly = {
"LOG_AGGREGATOR_TYPE": "loggly",
"LOG_AGGREGATOR_HOST": "http://logs-01.loggly.com/inputs/1fd38090-2af1-4e1e-8d80-492899da0f71/tag/http/",
"LOG_AGGREGATOR_PORT": 8080,
"LOG_AGGREGATOR_PROTOCOL": "https"
}
'''
# Test the rsyslog configuration template constructed for various logging settings
@pytest.mark.parametrize(
'enabled, type, host, port, protocol, expected_config', [
(
True,
'loggly',
'http://logs-01.loggly.com/inputs/1fd38090-2af1-4e1e-8d80-492899da0f71/tag/http/',
None,
'https',
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="logs-01.loggly.com" serverport="80" usehttps="off" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="inputs/1fd38090-2af1-4e1e-8d80-492899da0f71/tag/http/")', # noqa
])
),
(
True, # localhost w/ custom UDP port
'other',
'localhost',
9000,
'udp',
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")',
'action(type="omfwd" target="localhost" port="9000" protocol="udp" action.resumeRetryCount="-1" action.resumeInterval="5" template="awx")', # noqa
])
),
(
True, # localhost w/ custom TCP port
'other',
'localhost',
9000,
'tcp',
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")',
'action(type="omfwd" target="localhost" port="9000" protocol="tcp" action.resumeRetryCount="-1" action.resumeInterval="5" template="awx")', # noqa
])
),
(
True, # https, default port 443
'splunk',
'https://yoursplunk/services/collector/event',
None,
None,
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="yoursplunk" serverport="443" usehttps="on" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="services/collector/event")', # noqa
])
),
(
True, # http, default port 80
'splunk',
'http://yoursplunk/services/collector/event',
None,
None,
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="yoursplunk" serverport="80" usehttps="off" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="services/collector/event")', # noqa
])
),
(
True, # https, custom port in URL string
'splunk',
'https://yoursplunk:8088/services/collector/event',
None,
None,
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="yoursplunk" serverport="8088" usehttps="on" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="services/collector/event")', # noqa
])
),
(
True, # https, custom port explicitly specified
'splunk',
'https://yoursplunk/services/collector/event',
8088,
None,
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="yoursplunk" serverport="8088" usehttps="on" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="services/collector/event")', # noqa
])
),
(
True, # no scheme specified in URL, default to https, respect custom port
'splunk',
'yoursplunk.org/services/collector/event',
8088,
'https',
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="yoursplunk.org" serverport="8088" usehttps="on" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="services/collector/event")', # noqa
])
),
(
True, # respect custom http-only port
'splunk',
'http://yoursplunk.org/services/collector/event',
8088,
None,
'\n'.join([
'template(name="awx" type="string" string="%rawmsg-after-pri%")\nmodule(load="omhttp")',
'action(type="omhttp" server="yoursplunk.org" serverport="8088" usehttps="off" skipverifyhost="off" action.resumeRetryCount="-1" template="awx" errorfile="/var/log/tower/rsyslog.err" action.resumeInterval="5" restpath="services/collector/event")', # noqa
])
),
]
)
def test_rsyslog_conf_template(enabled, type, host, port, protocol, expected_config):
mock_settings, _ = _mock_logging_defaults()
# Set test settings
logging_defaults = getattr(settings, 'LOGGING')
setattr(mock_settings, 'LOGGING', logging_defaults)
setattr(mock_settings, 'LOGGING["handlers"]["external_logger"]["address"]', '/var/run/awx-rsyslog/rsyslog.sock')
setattr(mock_settings, 'LOG_AGGREGATOR_ENABLED', enabled)
setattr(mock_settings, 'LOG_AGGREGATOR_TYPE', type)
setattr(mock_settings, 'LOG_AGGREGATOR_HOST', host)
if port:
setattr(mock_settings, 'LOG_AGGREGATOR_PORT', port)
if protocol:
setattr(mock_settings, 'LOG_AGGREGATOR_PROTOCOL', protocol)
# create rsyslog conf template
tmpl = construct_rsyslog_conf_template(mock_settings)
# check validity of created template
assert expected_config in tmpl
def test_splunk_auth():
mock_settings, _ = _mock_logging_defaults()
# Set test settings
logging_defaults = getattr(settings, 'LOGGING')
setattr(mock_settings, 'LOGGING', logging_defaults)
setattr(mock_settings, 'LOG_AGGREGATOR_ENABLED', True)
setattr(mock_settings, 'LOG_AGGREGATOR_TYPE', 'splunk')
setattr(mock_settings, 'LOG_AGGREGATOR_HOST', 'example.org')
setattr(mock_settings, 'LOG_AGGREGATOR_PASSWORD', 'SECRET-TOKEN')
tmpl = construct_rsyslog_conf_template(mock_settings)
assert 'httpheaderkey="Authorization" httpheadervalue="Splunk SECRET-TOKEN"' in tmpl
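The expected omhttp configs in the parametrized cases above all follow one URL-normalization rule: an explicit port in the URL wins, then the `LOG_AGGREGATOR_PORT` setting, then the scheme default (443 for https, 80 for http), with https assumed when no scheme is given. A minimal standalone sketch of that rule (not AWX's `construct_rsyslog_conf_template`, whose real logic lives in `awx.main.utils.external_logging`):

```python
from urllib.parse import urlsplit

def omhttp_params(host, port=None, protocol=None):
    # Hypothetical helper: split an aggregator URL into the server/serverport/
    # usehttps/restpath fields the expected configs above assert on.
    if '://' not in host:
        host = '{}://{}'.format(protocol or 'https', host)
    parts = urlsplit(host)
    # explicit port in the URL wins, then the port setting, then scheme default
    serverport = parts.port or port or (443 if parts.scheme == 'https' else 80)
    return {
        'server': parts.hostname,
        'serverport': str(serverport),
        'usehttps': 'on' if parts.scheme == 'https' else 'off',
        'restpath': parts.path.lstrip('/'),
    }
```

For example, `omhttp_params('https://yoursplunk:8088/services/collector/event')` yields `serverport` `'8088'` with `usehttps` `'on'`, matching the third splunk case above.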


@@ -18,7 +18,7 @@ def test_send_messages():
requests_mock.post.assert_called_once_with(
'https://example.com/api/annotations',
headers={'Content-Type': 'application/json', 'Authorization': 'Bearer testapikey'},
json={'text': 'test subject', 'isRegion': True, 'timeEnd': 120000, 'panelId': None, 'time': 60000, 'dashboardId': None},
verify=True)
assert sent_messages == 1
@@ -36,7 +36,7 @@ def test_send_messages_with_no_verify_ssl():
requests_mock.post.assert_called_once_with(
'https://example.com/api/annotations',
headers={'Content-Type': 'application/json', 'Authorization': 'Bearer testapikey'},
json={'text': 'test subject', 'isRegion': True, 'timeEnd': 120000, 'panelId': None, 'time': 60000, 'dashboardId': None},
verify=False)
assert sent_messages == 1
@@ -54,7 +54,7 @@ def test_send_messages_with_dashboardid():
requests_mock.post.assert_called_once_with(
'https://example.com/api/annotations',
headers={'Content-Type': 'application/json', 'Authorization': 'Bearer testapikey'},
json={'text': 'test subject', 'isRegion': True, 'timeEnd': 120000, 'panelId': None, 'time': 60000, 'dashboardId': 42},
verify=True)
assert sent_messages == 1
@@ -72,7 +72,7 @@ def test_send_messages_with_panelid():
requests_mock.post.assert_called_once_with(
'https://example.com/api/annotations',
headers={'Content-Type': 'application/json', 'Authorization': 'Bearer testapikey'},
json={'text': 'test subject', 'isRegion': True, 'timeEnd': 120000, 'panelId': 42, 'time': 60000, 'dashboardId': None},
verify=True)
assert sent_messages == 1
@@ -90,7 +90,7 @@ def test_send_messages_with_bothids():
requests_mock.post.assert_called_once_with(
'https://example.com/api/annotations',
headers={'Content-Type': 'application/json', 'Authorization': 'Bearer testapikey'},
json={'text': 'test subject', 'isRegion': True, 'timeEnd': 120000, 'panelId': 42, 'time': 60000, 'dashboardId': 42},
verify=True)
assert sent_messages == 1


@@ -1,393 +0,0 @@
# -*- coding: utf-8 -*-
import base64
import logging
import socket
import datetime
from dateutil.tz import tzutc
from io import StringIO
from uuid import uuid4
from unittest import mock
from django.conf import LazySettings
from django.utils.encoding import smart_str
import pytest
import requests
from requests_futures.sessions import FuturesSession
from awx.main.utils.handlers import (BaseHandler, BaseHTTPSHandler as HTTPSHandler,
TCPHandler, UDPHandler, _encode_payload_for_socket,
PARAM_NAMES, LoggingConnectivityException,
AWXProxyHandler)
from awx.main.utils.formatters import LogstashFormatter
@pytest.fixture()
def https_adapter():
class FakeHTTPSAdapter(requests.adapters.HTTPAdapter):
requests = []
status = 200
reason = None
def send(self, request, **kwargs):
self.requests.append(request)
resp = requests.models.Response()
resp.status_code = self.status
resp.reason = self.reason
resp.request = request
return resp
return FakeHTTPSAdapter()
@pytest.fixture()
def connection_error_adapter():
class ConnectionErrorAdapter(requests.adapters.HTTPAdapter):
def send(self, request, **kwargs):
err = requests.packages.urllib3.exceptions.SSLError()
raise requests.exceptions.ConnectionError(err, request=request)
return ConnectionErrorAdapter()
@pytest.fixture
def fake_socket(tmpdir_factory, request):
sok = socket.socket
sok.send = mock.MagicMock()
sok.connect = mock.MagicMock()
sok.setblocking = mock.MagicMock()
sok.close = mock.MagicMock()
return sok
def test_https_logging_handler_requests_async_implementation():
handler = HTTPSHandler()
assert isinstance(handler.session, FuturesSession)
def test_https_logging_handler_has_default_http_timeout():
handler = TCPHandler()
assert handler.tcp_timeout == 5
@pytest.mark.parametrize('param', ['host', 'port', 'indv_facts'])
def test_base_logging_handler_defaults(param):
handler = BaseHandler()
assert hasattr(handler, param) and getattr(handler, param) is None
@pytest.mark.parametrize('param', ['host', 'port', 'indv_facts'])
def test_base_logging_handler_kwargs(param):
handler = BaseHandler(**{param: 'EXAMPLE'})
assert hasattr(handler, param) and getattr(handler, param) == 'EXAMPLE'
@pytest.mark.parametrize('params', [
{
'LOG_AGGREGATOR_HOST': 'https://server.invalid',
'LOG_AGGREGATOR_PORT': 22222,
'LOG_AGGREGATOR_TYPE': 'loggly',
'LOG_AGGREGATOR_USERNAME': 'foo',
'LOG_AGGREGATOR_PASSWORD': 'bar',
'LOG_AGGREGATOR_INDIVIDUAL_FACTS': True,
'LOG_AGGREGATOR_TCP_TIMEOUT': 96,
'LOG_AGGREGATOR_VERIFY_CERT': False,
'LOG_AGGREGATOR_PROTOCOL': 'https'
},
{
'LOG_AGGREGATOR_HOST': 'https://server.invalid',
'LOG_AGGREGATOR_PORT': 22222,
'LOG_AGGREGATOR_PROTOCOL': 'udp'
}
])
def test_real_handler_from_django_settings(params):
settings = LazySettings()
settings.configure(**params)
handler = AWXProxyHandler().get_handler(custom_settings=settings)
# need the _reverse_ dictionary from PARAM_NAMES
attr_lookup = {}
for attr_name, setting_name in PARAM_NAMES.items():
attr_lookup[setting_name] = attr_name
for setting_name, val in params.items():
attr_name = attr_lookup[setting_name]
if attr_name == 'protocol':
continue
assert hasattr(handler, attr_name)
def test_invalid_kwarg_to_real_handler():
settings = LazySettings()
settings.configure(**{
'LOG_AGGREGATOR_HOST': 'https://server.invalid',
'LOG_AGGREGATOR_PORT': 22222,
'LOG_AGGREGATOR_PROTOCOL': 'udp',
'LOG_AGGREGATOR_VERIFY_CERT': False # setting not valid for UDP handler
})
handler = AWXProxyHandler().get_handler(custom_settings=settings)
assert not hasattr(handler, 'verify_cert')
def test_protocol_not_specified():
settings = LazySettings()
settings.configure(**{
'LOG_AGGREGATOR_HOST': 'https://server.invalid',
'LOG_AGGREGATOR_PORT': 22222,
'LOG_AGGREGATOR_PROTOCOL': None # awx/settings/defaults.py
})
handler = AWXProxyHandler().get_handler(custom_settings=settings)
assert isinstance(handler, logging.NullHandler)
def test_base_logging_handler_emit_system_tracking(dummy_log_record):
handler = BaseHandler(host='127.0.0.1', indv_facts=True)
handler.setFormatter(LogstashFormatter())
dummy_log_record.name = 'awx.analytics.system_tracking'
dummy_log_record.msg = None
dummy_log_record.inventory_id = 11
dummy_log_record.host_name = 'my_lucky_host'
dummy_log_record.job_id = 777
dummy_log_record.ansible_facts = {
"ansible_kernel": "4.4.66-boot2docker",
"ansible_machine": "x86_64",
"ansible_swapfree_mb": 4663,
}
dummy_log_record.ansible_facts_modified = datetime.datetime.now(tzutc()).isoformat()
sent_payloads = handler.emit(dummy_log_record)
assert len(sent_payloads) == 1
assert sent_payloads[0]['ansible_facts'] == dummy_log_record.ansible_facts
assert sent_payloads[0]['ansible_facts_modified'] == dummy_log_record.ansible_facts_modified
assert sent_payloads[0]['level'] == 'INFO'
assert sent_payloads[0]['logger_name'] == 'awx.analytics.system_tracking'
assert sent_payloads[0]['job_id'] == dummy_log_record.job_id
assert sent_payloads[0]['inventory_id'] == dummy_log_record.inventory_id
assert sent_payloads[0]['host_name'] == dummy_log_record.host_name
@pytest.mark.parametrize('host, port, normalized, hostname_only', [
('http://localhost', None, 'http://localhost', False),
('http://localhost', 8080, 'http://localhost:8080', False),
('https://localhost', 443, 'https://localhost:443', False),
('ftp://localhost', 443, 'ftp://localhost:443', False),
('https://localhost:550', 443, 'https://localhost:550', False),
('https://localhost:yoho/foobar', 443, 'https://localhost:443/foobar', False),
('https://localhost:yoho/foobar', None, 'https://localhost:yoho/foobar', False),
('http://splunk.server:8088/services/collector/event', 80,
'http://splunk.server:8088/services/collector/event', False),
('http://splunk.server/services/collector/event', 8088,
'http://splunk.server:8088/services/collector/event', False),
('splunk.server:8088/services/collector/event', 80,
'http://splunk.server:8088/services/collector/event', False),
('splunk.server/services/collector/event', 8088,
'http://splunk.server:8088/services/collector/event', False),
('localhost', None, 'http://localhost', False),
('localhost', 8080, 'http://localhost:8080', False),
('localhost', 4399, 'localhost', True),
('tcp://localhost:4399/foo/bar', 4399, 'localhost', True),
])
def test_base_logging_handler_host_format(host, port, normalized, hostname_only):
handler = BaseHandler(host=host, port=port)
assert handler._get_host(scheme='http', hostname_only=hostname_only) == normalized
@pytest.mark.parametrize(
'status, reason, exc',
[(200, '200 OK', None), (404, 'Not Found', LoggingConnectivityException)]
)
@pytest.mark.parametrize('protocol', ['http', 'https', None])
def test_https_logging_handler_connectivity_test(https_adapter, status, reason, exc, protocol):
host = 'example.org'
if protocol:
host = '://'.join([protocol, host])
https_adapter.status = status
https_adapter.reason = reason
settings = LazySettings()
settings.configure(**{
'LOG_AGGREGATOR_HOST': host,
'LOG_AGGREGATOR_PORT': 8080,
'LOG_AGGREGATOR_TYPE': 'logstash',
'LOG_AGGREGATOR_USERNAME': 'user',
'LOG_AGGREGATOR_PASSWORD': 'password',
'LOG_AGGREGATOR_LOGGERS': ['awx', 'activity_stream', 'job_events', 'system_tracking'],
'LOG_AGGREGATOR_PROTOCOL': 'https',
'CLUSTER_HOST_ID': '',
'LOG_AGGREGATOR_TOWER_UUID': str(uuid4()),
'LOG_AGGREGATOR_LEVEL': 'DEBUG',
})
class FakeHTTPSHandler(HTTPSHandler):
def __init__(self, *args, **kwargs):
super(FakeHTTPSHandler, self).__init__(*args, **kwargs)
self.session.mount('{}://'.format(protocol or 'https'), https_adapter)
def emit(self, record):
return super(FakeHTTPSHandler, self).emit(record)
with mock.patch.object(AWXProxyHandler, 'get_handler_class') as mock_get_class:
mock_get_class.return_value = FakeHTTPSHandler
if exc:
with pytest.raises(exc) as e:
AWXProxyHandler().perform_test(settings)
assert str(e).endswith('%s: %s' % (status, reason))
else:
assert AWXProxyHandler().perform_test(settings) is None
def test_https_logging_handler_logstash_auth_info():
handler = HTTPSHandler(message_type='logstash', username='bob', password='ansible')
handler._add_auth_information()
assert isinstance(handler.session.auth, requests.auth.HTTPBasicAuth)
assert handler.session.auth.username == 'bob'
assert handler.session.auth.password == 'ansible'
def test_https_logging_handler_splunk_auth_info():
handler = HTTPSHandler(message_type='splunk', password='ansible')
handler._add_auth_information()
assert handler.session.headers['Authorization'] == 'Splunk ansible'
assert handler.session.headers['Content-Type'] == 'application/json'
def test_https_logging_handler_connection_error(connection_error_adapter,
dummy_log_record):
handler = HTTPSHandler(host='127.0.0.1', message_type='logstash')
handler.setFormatter(LogstashFormatter())
handler.session.mount('http://', connection_error_adapter)
buff = StringIO()
logging.getLogger('awx.main.utils.handlers').addHandler(
logging.StreamHandler(buff)
)
async_futures = handler.emit(dummy_log_record)
with pytest.raises(requests.exceptions.ConnectionError):
[future.result() for future in async_futures]
assert 'failed to emit log to external aggregator\nTraceback' in buff.getvalue()
# we should only log failures *periodically*, so causing *another*
# immediate failure shouldn't report a second ConnectionError
buff.truncate(0)
async_futures = handler.emit(dummy_log_record)
with pytest.raises(requests.exceptions.ConnectionError):
[future.result() for future in async_futures]
assert buff.getvalue() == ''
@pytest.mark.parametrize('message_type', ['logstash', 'splunk'])
def test_https_logging_handler_emit_without_cred(https_adapter, dummy_log_record,
message_type):
handler = HTTPSHandler(host='127.0.0.1', message_type=message_type)
handler.setFormatter(LogstashFormatter())
handler.session.mount('https://', https_adapter)
async_futures = handler.emit(dummy_log_record)
[future.result() for future in async_futures]
assert len(https_adapter.requests) == 1
request = https_adapter.requests[0]
assert request.url == 'https://127.0.0.1/'
assert request.method == 'POST'
if message_type == 'logstash':
# A username + password weren't used, so this header should be missing
assert 'Authorization' not in request.headers
if message_type == 'splunk':
assert request.headers['Authorization'] == 'Splunk None'
def test_https_logging_handler_emit_logstash_with_creds(https_adapter,
dummy_log_record):
handler = HTTPSHandler(host='127.0.0.1',
username='user', password='pass',
message_type='logstash')
handler.setFormatter(LogstashFormatter())
handler.session.mount('https://', https_adapter)
async_futures = handler.emit(dummy_log_record)
[future.result() for future in async_futures]
assert len(https_adapter.requests) == 1
request = https_adapter.requests[0]
assert request.headers['Authorization'] == 'Basic %s' % smart_str(base64.b64encode(b"user:pass"))
def test_https_logging_handler_emit_splunk_with_creds(https_adapter,
dummy_log_record):
handler = HTTPSHandler(host='127.0.0.1',
password='pass', message_type='splunk')
handler.setFormatter(LogstashFormatter())
handler.session.mount('https://', https_adapter)
async_futures = handler.emit(dummy_log_record)
[future.result() for future in async_futures]
assert len(https_adapter.requests) == 1
request = https_adapter.requests[0]
assert request.headers['Authorization'] == 'Splunk pass'
@pytest.mark.parametrize('payload, encoded_payload', [
('foobar', 'foobar'),
({'foo': 'bar'}, '{"foo": "bar"}'),
({u'测试键': u'测试值'}, '{"测试键": "测试值"}'),
])
def test_encode_payload_for_socket(payload, encoded_payload):
assert _encode_payload_for_socket(payload).decode('utf-8') == encoded_payload
def test_udp_handler_create_socket_at_init():
handler = UDPHandler(host='127.0.0.1', port=4399)
assert hasattr(handler, 'socket')
assert isinstance(handler.socket, socket.socket)
assert handler.socket.family == socket.AF_INET
assert handler.socket.type == socket.SOCK_DGRAM
def test_udp_handler_send(dummy_log_record):
handler = UDPHandler(host='127.0.0.1', port=4399)
handler.setFormatter(LogstashFormatter())
with mock.patch('awx.main.utils.handlers._encode_payload_for_socket', return_value="des") as encode_mock,\
mock.patch.object(handler, 'socket') as socket_mock:
handler.emit(dummy_log_record)
encode_mock.assert_called_once_with(handler.format(dummy_log_record))
socket_mock.sendto.assert_called_once_with("des", ('127.0.0.1', 4399))
def test_tcp_handler_send(fake_socket, dummy_log_record):
handler = TCPHandler(host='127.0.0.1', port=4399, tcp_timeout=5)
handler.setFormatter(LogstashFormatter())
with mock.patch('socket.socket', return_value=fake_socket) as sok_init_mock,\
mock.patch('select.select', return_value=([], [fake_socket], [])):
handler.emit(dummy_log_record)
sok_init_mock.assert_called_once_with(socket.AF_INET, socket.SOCK_STREAM)
fake_socket.connect.assert_called_once_with(('127.0.0.1', 4399))
fake_socket.setblocking.assert_called_once_with(0)
fake_socket.send.assert_called_once_with(handler.format(dummy_log_record))
fake_socket.close.assert_called_once()
def test_tcp_handler_return_if_socket_unavailable(fake_socket, dummy_log_record):
handler = TCPHandler(host='127.0.0.1', port=4399, tcp_timeout=5)
handler.setFormatter(LogstashFormatter())
with mock.patch('socket.socket', return_value=fake_socket) as sok_init_mock,\
mock.patch('select.select', return_value=([], [], [])):
handler.emit(dummy_log_record)
sok_init_mock.assert_called_once_with(socket.AF_INET, socket.SOCK_STREAM)
fake_socket.connect.assert_called_once_with(('127.0.0.1', 4399))
fake_socket.setblocking.assert_called_once_with(0)
assert not fake_socket.send.called
fake_socket.close.assert_called_once()
def test_tcp_handler_log_exception(fake_socket, dummy_log_record):
handler = TCPHandler(host='127.0.0.1', port=4399, tcp_timeout=5)
handler.setFormatter(LogstashFormatter())
with mock.patch('socket.socket', return_value=fake_socket) as sok_init_mock,\
mock.patch('select.select', return_value=([], [], [])),\
mock.patch('awx.main.utils.handlers.logger') as logger_mock:
fake_socket.connect.side_effect = Exception("foo")
handler.emit(dummy_log_record)
sok_init_mock.assert_called_once_with(socket.AF_INET, socket.SOCK_STREAM)
logger_mock.exception.assert_called_once()
fake_socket.close.assert_called_once()
assert not fake_socket.send.called

View File

@@ -8,7 +8,7 @@ def test_produce_supervisor_command(mocker):
mock_process.communicate = communicate_mock
Popen_mock = mocker.MagicMock(return_value=mock_process)
with mocker.patch.object(reload.subprocess, 'Popen', Popen_mock):
reload._supervisor_service_command("restart")
reload.supervisor_service_command("restart")
reload.subprocess.Popen.assert_called_once_with(
['supervisorctl', 'restart', 'tower-processes:*',],
stderr=-1, stdin=-1, stdout=-1)

View File

@@ -0,0 +1,100 @@
import urllib.parse as urlparse
from django.conf import settings
from awx.main.utils.reload import supervisor_service_command
def construct_rsyslog_conf_template(settings=settings):
tmpl = ''
parts = []
enabled = getattr(settings, 'LOG_AGGREGATOR_ENABLED')
host = getattr(settings, 'LOG_AGGREGATOR_HOST', '')
port = getattr(settings, 'LOG_AGGREGATOR_PORT', '')
protocol = getattr(settings, 'LOG_AGGREGATOR_PROTOCOL', '')
timeout = getattr(settings, 'LOG_AGGREGATOR_TCP_TIMEOUT', 5)
max_bytes = settings.MAX_EVENT_RES_DATA
parts.extend([
'$WorkDirectory /var/lib/awx/rsyslog',
f'$MaxMessageSize {max_bytes}',
'$IncludeConfig /var/lib/awx/rsyslog/conf.d/*.conf',
'module(load="imuxsock" SysSock.Use="off")',
'input(type="imuxsock" Socket="' + settings.LOGGING['handlers']['external_logger']['address'] + '" unlink="on")',
'template(name="awx" type="string" string="%rawmsg-after-pri%")',
])
if not enabled:
parts.append('action(type="omfile" file="/dev/null")') # rsyslog needs *at least* one valid action to start
tmpl = '\n'.join(parts)
return tmpl
if protocol.startswith('http'):
scheme = 'https'
# urlparse requires '//' to be provided if scheme is not specified
original_parsed = urlparse.urlsplit(host)
if (not original_parsed.scheme and not host.startswith('//')) or original_parsed.hostname is None:
host = '%s://%s' % (scheme, host) if scheme else '//%s' % host
parsed = urlparse.urlsplit(host)
host = parsed.hostname
try:
if parsed.port:
port = parsed.port
except ValueError:
port = settings.LOG_AGGREGATOR_PORT
# https://github.com/rsyslog/rsyslog-doc/blob/master/source/configuration/modules/omhttp.rst
ssl = 'on' if parsed.scheme == 'https' else 'off'
skip_verify = 'off' if settings.LOG_AGGREGATOR_VERIFY_CERT else 'on'
if not port:
port = 443 if parsed.scheme == 'https' else 80
params = [
'type="omhttp"',
f'server="{host}"',
f'serverport="{port}"',
f'usehttps="{ssl}"',
f'skipverifyhost="{skip_verify}"',
'action.resumeRetryCount="-1"',
'template="awx"',
'errorfile="/var/log/tower/rsyslog.err"',
f'action.resumeInterval="{timeout}"'
]
if parsed.path:
path = urlparse.quote(parsed.path[1:])
if parsed.query:
path = f'{path}?{urlparse.quote(parsed.query)}'
params.append(f'restpath="{path}"')
username = getattr(settings, 'LOG_AGGREGATOR_USERNAME', '')
password = getattr(settings, 'LOG_AGGREGATOR_PASSWORD', '')
if getattr(settings, 'LOG_AGGREGATOR_TYPE', None) == 'splunk':
# splunk has a weird authorization header <shrug>
if password:
# from omhttp docs:
# https://www.rsyslog.com/doc/v8-stable/configuration/modules/omhttp.html
# > Currently only a single additional header/key pair is
# > configurable, further development is needed to support
# > arbitrary header key/value lists.
params.append('httpheaderkey="Authorization"')
params.append(f'httpheadervalue="Splunk {password}"')
elif username:
params.append(f'uid="{username}"')
if password:
# you can only have a basic auth password if there's a username
params.append(f'pwd="{password}"')
params = ' '.join(params)
parts.extend(['module(load="omhttp")', f'action({params})'])
elif protocol and host and port:
parts.append(
f'action(type="omfwd" target="{host}" port="{port}" protocol="{protocol}" action.resumeRetryCount="-1" action.resumeInterval="{timeout}" template="awx")' # noqa
)
else:
parts.append('action(type="omfile" file="/dev/null")') # rsyslog needs *at least* one valid action to start
tmpl = '\n'.join(parts)
return tmpl
def reconfigure_rsyslog():
tmpl = construct_rsyslog_conf_template()
with open('/var/lib/awx/rsyslog/rsyslog.conf', 'w') as f:
f.write(tmpl + '\n')
supervisor_service_command(command='restart', service='awx-rsyslogd')
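The host/port derivation in `construct_rsyslog_conf_template` above can be easy to misread. The following is a minimal, standalone sketch of just that derivation (it simplifies away the `'//'` prefix branch, and `https_target` is an illustrative name, not an AWX function):

```python
import urllib.parse as urlparse

def https_target(host, default_port=None):
    """Sketch of the omhttp host/port derivation above: if urlsplit cannot
    find a hostname, re-parse with an explicit https scheme, then fall back
    to the scheme's default port when none is given anywhere."""
    parsed = urlparse.urlsplit(host)
    if not parsed.hostname:
        parsed = urlparse.urlsplit('https://' + host)
    port = parsed.port or default_port
    if not port:
        port = 443 if parsed.scheme == 'https' else 80
    return parsed.hostname, port, parsed.scheme == 'https'

print(https_target('logs.example.org'))         # ('logs.example.org', 443, True)
print(https_target('http://logs.example.org'))  # ('logs.example.org', 80, False)
print(https_target('logs.example.org:8088'))    # ('logs.example.org', 8088, True)
```

The real template builder additionally reads `LOG_AGGREGATOR_PORT` from settings as the fallback and quotes the URL path into `restpath`.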

View File

@@ -3,13 +3,13 @@
from copy import copy
import json
import time
import logging
import traceback
import socket
from datetime import datetime
from dateutil.tz import tzutc
from django.core.serializers.json import DjangoJSONEncoder
from django.conf import settings
@@ -91,18 +91,13 @@ class LogstashFormatterBase(logging.Formatter):
'processName': record.processName,
}
@classmethod
def format_timestamp(cls, time):
tstamp = datetime.utcfromtimestamp(time)
return tstamp.strftime("%Y-%m-%dT%H:%M:%S") + ".%03d" % (tstamp.microsecond / 1000) + "Z"
@classmethod
def format_exception(cls, exc_info):
return ''.join(traceback.format_exception(*exc_info)) if exc_info else ''
@classmethod
def serialize(cls, message):
return bytes(json.dumps(message), 'utf-8')
return json.dumps(message, cls=DjangoJSONEncoder) + '\n'
class LogstashFormatter(LogstashFormatterBase):
@@ -157,9 +152,6 @@ class LogstashFormatter(LogstashFormatterBase):
try:
data_for_log[key] = getattr(job_event, fd)
if fd in ['created', 'modified'] and data_for_log[key] is not None:
time_float = time.mktime(data_for_log[key].timetuple())
data_for_log[key] = self.format_timestamp(time_float)
except Exception as e:
data_for_log[key] = 'Exception `{}` producing field'.format(e)
@@ -231,10 +223,12 @@ class LogstashFormatter(LogstashFormatterBase):
return fields
def format(self, record):
stamp = datetime.utcfromtimestamp(record.created)
stamp = stamp.replace(tzinfo=tzutc())
message = {
# Field not included, but exist in related logs
# 'path': record.pathname
'@timestamp': self.format_timestamp(record.created),
'@timestamp': stamp,
'message': record.getMessage(),
'host': self.host,
@@ -250,4 +244,7 @@ class LogstashFormatter(LogstashFormatterBase):
if record.exc_info:
message.update(self.get_debug_fields(record))
if settings.LOG_AGGREGATOR_TYPE == 'splunk':
# splunk messages must have a top level "event" key
message = {'event': message}
return self.serialize(message)
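The Splunk branch above nests the whole formatted record under a top-level `event` key, as Splunk's HTTP Event Collector expects. A tiny illustration of the resulting payload shape (`wrap_for_splunk` is an illustrative helper name, not part of AWX):

```python
import json

def wrap_for_splunk(message):
    """Splunk's HTTP Event Collector requires the record under a top-level
    'event' key; other aggregator types receive the message as-is."""
    return {'event': message}

payload = wrap_for_splunk({'message': 'job finished', 'host': 'node1'})
print(json.dumps(payload))
# {"event": {"message": "job finished", "host": "node1"}}
```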

View File

@@ -3,404 +3,29 @@
# Python
import logging
import json
import os
import requests
import time
import threading
import socket
import select
from urllib import parse as urlparse
from concurrent.futures import ThreadPoolExecutor
from requests.exceptions import RequestException
import os.path
# Django
from django.conf import settings
# requests futures, a dependency used by these handlers
from requests_futures.sessions import FuturesSession
import cachetools
# AWX
from awx.main.utils.formatters import LogstashFormatter
class RSysLogHandler(logging.handlers.SysLogHandler):
append_nul = False
__all__ = ['BaseHTTPSHandler', 'TCPHandler', 'UDPHandler',
'AWXProxyHandler']
logger = logging.getLogger('awx.main.utils.handlers')
# Translation of parameter names to names in Django settings
# logging settings category, only those related to handler / log emission
PARAM_NAMES = {
'host': 'LOG_AGGREGATOR_HOST',
'port': 'LOG_AGGREGATOR_PORT',
'message_type': 'LOG_AGGREGATOR_TYPE',
'username': 'LOG_AGGREGATOR_USERNAME',
'password': 'LOG_AGGREGATOR_PASSWORD',
'indv_facts': 'LOG_AGGREGATOR_INDIVIDUAL_FACTS',
'tcp_timeout': 'LOG_AGGREGATOR_TCP_TIMEOUT',
'verify_cert': 'LOG_AGGREGATOR_VERIFY_CERT',
'protocol': 'LOG_AGGREGATOR_PROTOCOL'
}
def unused_callback(sess, resp):
pass
class LoggingConnectivityException(Exception):
pass
class VerboseThreadPoolExecutor(ThreadPoolExecutor):
last_log_emit = 0
def submit(self, func, *args, **kwargs):
def _wrapped(*args, **kwargs):
try:
return func(*args, **kwargs)
except Exception:
# If an exception occurs in a concurrent thread worker (like
# a ConnectionError or a read timeout), periodically log
# that failure.
#
# This approach isn't really thread-safe, so we could
# potentially log once per thread every 10 seconds, but it
# beats logging *every* failed HTTP request in a scenario where
# you've typo'd your log aggregator hostname.
now = time.time()
if now - self.last_log_emit > 10:
logger.exception('failed to emit log to external aggregator')
self.last_log_emit = now
raise
return super(VerboseThreadPoolExecutor, self).submit(_wrapped, *args,
**kwargs)
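The rate-limiting idea in `VerboseThreadPoolExecutor` — log at most one traceback per interval instead of one per failed request — can be sketched in isolation. This is a hedged illustration of the pattern only (`RateLimitedReporter` is an invented name), with the same deliberately non-thread-safe "good enough" check:

```python
import time

class RateLimitedReporter:
    """Invoke an emit callback at most once per interval. Like the executor
    above, the check is not thread-safe; worst case, each thread emits once
    per interval, which still beats logging every single failure."""
    def __init__(self, emit, interval=10):
        self.emit = emit
        self.interval = interval
        self.last_emit = 0.0

    def report(self, message):
        now = time.time()
        if now - self.last_emit > self.interval:
            self.emit(message)
            self.last_emit = now

seen = []
reporter = RateLimitedReporter(seen.append, interval=10)
for _ in range(5):
    reporter.report('failed to emit log to external aggregator')
print(seen)  # only one message despite five failures
```

In the real handler the callback is `logger.exception`, so the suppressed repeats are full tracebacks.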
class SocketResult:
'''
A class used as the return type of methods that send data over a socket;
it allows the object to be used in the same way as a requests-futures object
'''
def __init__(self, ok, reason=None):
self.ok = ok
self.reason = reason
def result(self):
return self
class BaseHandler(logging.Handler):
def __init__(self, host=None, port=None, indv_facts=None, **kwargs):
super(BaseHandler, self).__init__()
self.host = host
self.port = port
self.indv_facts = indv_facts
def _send(self, payload):
"""Actually send message to log aggregator.
"""
return payload
def _format_and_send_record(self, record):
if self.indv_facts:
return [self._send(json.loads(self.format(record)))]
return [self._send(self.format(record))]
def emit(self, record):
"""
Emit a log record. Returns a list of zero or more
implementation-specific objects for tests.
"""
def emit(self, msg):
if not settings.LOG_AGGREGATOR_ENABLED:
return
if not os.path.exists(settings.LOGGING['handlers']['external_logger']['address']):
return
try:
return self._format_and_send_record(record)
except (KeyboardInterrupt, SystemExit):
raise
except Exception:
self.handleError(record)
def _get_host(self, scheme='', hostname_only=False):
"""Return the host name of log aggregator.
"""
host = self.host or ''
# urlparse requires '//' to be provided if scheme is not specified
original_parsed = urlparse.urlsplit(host)
if (not original_parsed.scheme and not host.startswith('//')) or original_parsed.hostname is None:
host = '%s://%s' % (scheme, host) if scheme else '//%s' % host
parsed = urlparse.urlsplit(host)
if hostname_only:
return parsed.hostname
try:
port = parsed.port or self.port
except ValueError:
port = self.port
netloc = parsed.netloc if port is None else '%s:%s' % (parsed.hostname, port)
url_components = list(parsed)
url_components[1] = netloc
ret = urlparse.urlunsplit(url_components)
return ret.lstrip('/')
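The `'//'` prefixing in `_get_host` works around a `urlsplit` quirk: without a scheme or a leading `//`, the whole string is treated as a path and no hostname is found. A quick demonstration (hostnames here are illustrative):

```python
from urllib.parse import urlsplit

# Without a scheme or leading '//', everything lands in .path:
assert urlsplit('logs.example.org/inputs').hostname is None

# Prefixing '//' (or a scheme) makes the netloc parse correctly:
parsed = urlsplit('//logs.example.org:8080/inputs')
assert parsed.hostname == 'logs.example.org'
assert parsed.port == 8080
assert parsed.path == '/inputs'
```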
class BaseHTTPSHandler(BaseHandler):
'''
Originally derived from python-logstash library
Non-blocking requests are accomplished by FuturesSession, similar
to the loggly-python-handler library
'''
def _add_auth_information(self):
if self.message_type == 'logstash':
if not self.username:
# Logstash authentication not enabled
return
logstash_auth = requests.auth.HTTPBasicAuth(self.username, self.password)
self.session.auth = logstash_auth
elif self.message_type == 'splunk':
auth_header = "Splunk %s" % self.password
headers = {
"Authorization": auth_header,
"Content-Type": "application/json"
}
self.session.headers.update(headers)
def __init__(self, fqdn=False, message_type=None, username=None, password=None,
tcp_timeout=5, verify_cert=True, **kwargs):
self.fqdn = fqdn
self.message_type = message_type
self.username = username
self.password = password
self.tcp_timeout = tcp_timeout
self.verify_cert = verify_cert
super(BaseHTTPSHandler, self).__init__(**kwargs)
self.session = FuturesSession(executor=VerboseThreadPoolExecutor(
max_workers=2 # this is the default used by requests_futures
))
self._add_auth_information()
def _get_post_kwargs(self, payload_input):
if self.message_type == 'splunk':
# Splunk needs data nested under key "event"
if not isinstance(payload_input, dict):
payload_input = json.loads(payload_input)
payload_input = {'event': payload_input}
if isinstance(payload_input, dict):
payload_str = json.dumps(payload_input)
else:
payload_str = payload_input
kwargs = dict(data=payload_str, background_callback=unused_callback,
timeout=self.tcp_timeout)
if self.verify_cert is False:
kwargs['verify'] = False
return kwargs
def _send(self, payload):
"""See:
https://docs.python.org/3/library/concurrent.futures.html#future-objects
http://pythonhosted.org/futures/
"""
return self.session.post(self._get_host(scheme='https'),
**self._get_post_kwargs(payload))
def _encode_payload_for_socket(payload):
encoded_payload = payload
if isinstance(encoded_payload, dict):
encoded_payload = json.dumps(encoded_payload, ensure_ascii=False)
if isinstance(encoded_payload, str):
encoded_payload = encoded_payload.encode('utf-8')
return encoded_payload
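To make the encoding contract concrete: dicts become JSON text with non-ASCII characters preserved (`ensure_ascii=False`), any resulting or given `str` is UTF-8 encoded, and `bytes` pass through untouched. A self-contained mirror of the function, with sample outputs:

```python
import json

def encode_payload_for_socket(payload):
    """Standalone mirror of _encode_payload_for_socket above: dict -> JSON
    text (non-ASCII kept as-is), str -> UTF-8 bytes, bytes -> unchanged."""
    if isinstance(payload, dict):
        payload = json.dumps(payload, ensure_ascii=False)
    if isinstance(payload, str):
        payload = payload.encode('utf-8')
    return payload

print(encode_payload_for_socket({'foo': 'bar'}))  # b'{"foo": "bar"}'
print(encode_payload_for_socket('测试').decode('utf-8'))  # 测试
```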
class TCPHandler(BaseHandler):
def __init__(self, tcp_timeout=5, **kwargs):
self.tcp_timeout = tcp_timeout
super(TCPHandler, self).__init__(**kwargs)
def _send(self, payload):
payload = _encode_payload_for_socket(payload)
sok = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
sok.connect((self._get_host(hostname_only=True), self.port or 0))
sok.setblocking(0)
_, ready_to_send, _ = select.select([], [sok], [], float(self.tcp_timeout))
if len(ready_to_send) == 0:
ret = SocketResult(False, "Socket currently busy, failed to send message")
logger.warning(ret.reason)
else:
sok.send(payload)
ret = SocketResult(True) # success!
except Exception as e:
ret = SocketResult(False, "Error sending message from %s: %s" %
(TCPHandler.__name__,
' '.join(str(arg) for arg in e.args)))
logger.exception(ret.reason)
finally:
sok.close()
return ret
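The `select()`-based send in `TCPHandler._send` can be sketched on its own: connect, switch the socket to non-blocking, wait up to the timeout for writability, and only then send. This is an illustrative simplification (`send_with_timeout` is not an AWX function, and a single `send()` may write partially for large payloads):

```python
import select
import socket

def send_with_timeout(payload: bytes, host: str, port: int, timeout: float = 5.0) -> bool:
    """Connect, go non-blocking, and use select() to wait (up to timeout)
    for the socket to become writable before sending."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.connect((host, port))
        sock.setblocking(0)
        _, ready_to_send, _ = select.select([], [sock], [], timeout)
        if not ready_to_send:
            return False  # socket busy; the handler logs a warning and drops the message
        sock.send(payload)
        return True
    finally:
        sock.close()
```

The handler wraps the same outcome in a `SocketResult` so callers (and `perform_test`) can treat it like a futures result.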
class UDPHandler(BaseHandler):
message = "Cannot determine if UDP messages are received."
def __init__(self, **kwargs):
super(UDPHandler, self).__init__(**kwargs)
self.socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
def _send(self, payload):
payload = _encode_payload_for_socket(payload)
self.socket.sendto(payload, (self._get_host(hostname_only=True), self.port or 0))
return SocketResult(True, reason=self.message)
class AWXNullHandler(logging.NullHandler):
'''
The only addition here is accepting arbitrary __init__ params, because
the proxy handler does not (yet) work with arbitrary handler classes
'''
def __init__(self, *args, **kwargs):
super(AWXNullHandler, self).__init__()
HANDLER_MAPPING = {
'https': BaseHTTPSHandler,
'tcp': TCPHandler,
'udp': UDPHandler,
}
TTLCache = cachetools.TTLCache
if 'py.test' in os.environ.get('_', ''):
# don't cache settings in unit tests
class TTLCache(TTLCache):
def __getitem__(self, item):
raise KeyError()
class AWXProxyHandler(logging.Handler):
'''
Handler specific to the AWX external logging feature
Will dynamically create a handler specific to the configured
protocol, and will create a new one automatically on setting change
Managing parameters:
All parameters get their values from settings by default;
if a parameter was provided on init or set manually,
that value takes precedence.
Parameters match same parameters in the actualized handler classes.
'''
thread_local = threading.local()
_auditor = None
def __init__(self, **kwargs):
# TODO: process 'level' kwarg
super(AWXProxyHandler, self).__init__(**kwargs)
self._handler = None
self._old_kwargs = {}
@property
def auditor(self):
if not self._auditor:
self._auditor = logging.handlers.RotatingFileHandler(
filename='/var/log/tower/external.log',
maxBytes=1024 * 1024 * 50, # 50 MB
backupCount=5,
)
class WritableLogstashFormatter(LogstashFormatter):
@classmethod
def serialize(cls, message):
return json.dumps(message)
self._auditor.setFormatter(WritableLogstashFormatter())
return self._auditor
def get_handler_class(self, protocol):
return HANDLER_MAPPING.get(protocol, AWXNullHandler)
@cachetools.cached(cache=TTLCache(maxsize=1, ttl=3), key=lambda *args, **kw: 'get_handler')
def get_handler(self, custom_settings=None, force_create=False):
new_kwargs = {}
use_settings = custom_settings or settings
for field_name, setting_name in PARAM_NAMES.items():
val = getattr(use_settings, setting_name, None)
if val is None:
continue
new_kwargs[field_name] = val
if new_kwargs == self._old_kwargs and self._handler and (not force_create):
# avoids re-creating session objects, and other such things
return self._handler
self._old_kwargs = new_kwargs.copy()
# TODO: remove any kwargs not applicable to that particular handler
protocol = new_kwargs.pop('protocol', None)
HandlerClass = self.get_handler_class(protocol)
# cleanup old handler and make new one
if self._handler:
self._handler.close()
logger.debug('Creating external log handler due to startup or settings change.')
self._handler = HandlerClass(**new_kwargs)
if self.formatter:
# self.format(record) is called inside of emit method
# so not safe to assume this can be handled within self
self._handler.setFormatter(self.formatter)
return self._handler
@cachetools.cached(cache=TTLCache(maxsize=1, ttl=3), key=lambda *args, **kw: 'should_audit')
def should_audit(self):
return settings.LOG_AGGREGATOR_AUDIT
def emit(self, record):
if AWXProxyHandler.thread_local.enabled:
actual_handler = self.get_handler()
if self.should_audit():
self.auditor.setLevel(settings.LOG_AGGREGATOR_LEVEL)
self.auditor.emit(record)
return actual_handler.emit(record)
def perform_test(self, custom_settings):
"""
Tests logging connectivity for given settings module.
@raises LoggingConnectivityException
"""
handler = self.get_handler(custom_settings=custom_settings, force_create=True)
handler.setFormatter(LogstashFormatter())
logger = logging.getLogger(__file__)
fn, lno, func, _ = logger.findCaller()
record = logger.makeRecord('awx', 10, fn, lno,
'AWX Connection Test', tuple(),
None, func)
futures = handler.emit(record)
for future in futures:
try:
resp = future.result()
if not resp.ok:
if isinstance(resp, SocketResult):
raise LoggingConnectivityException(
'Socket error: {}'.format(resp.reason or '')
)
else:
raise LoggingConnectivityException(
': '.join([str(resp.status_code), resp.reason or ''])
)
except RequestException as e:
raise LoggingConnectivityException(str(e))
@classmethod
def disable(cls):
cls.thread_local.enabled = False
AWXProxyHandler.thread_local.enabled = True
return super(RSysLogHandler, self).emit(msg)
except ConnectionRefusedError:
# rsyslogd has gone to lunch; this generally means that it's just
# been restarted (due to a configuration change)
# unfortunately, we can't log that because...rsyslogd is down (and
# logging would just send us back down this code path)
pass
ColorHandler = logging.StreamHandler

View File

@@ -4,25 +4,24 @@
# Python
import subprocess
import logging
import os
# Django
from django.conf import settings
logger = logging.getLogger('awx.main.utils.reload')
def _supervisor_service_command(command, communicate=True):
def supervisor_service_command(command, service='*', communicate=True):
'''
example use pattern of supervisorctl:
# supervisorctl restart tower-processes:receiver tower-processes:factcacher
'''
group_name = 'tower-processes'
if settings.DEBUG:
group_name = 'awx-processes'
args = ['supervisorctl']
if settings.DEBUG:
args.extend(['-c', '/supervisor.conf'])
args.extend([command, '{}:*'.format(group_name)])
supervisor_config_path = os.getenv('SUPERVISOR_WEB_CONFIG_PATH', None)
if supervisor_config_path:
args.extend(['-c', supervisor_config_path])
args.extend([command, ':'.join(['tower-processes', service])])
logger.debug('Issuing command to {} services, args={}'.format(command, args))
supervisor_process = subprocess.Popen(args, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
@@ -30,15 +29,16 @@ def _supervisor_service_command(command, communicate=True):
restart_stdout, restart_err = supervisor_process.communicate()
restart_code = supervisor_process.returncode
if restart_code or restart_err:
logger.error('supervisorctl {} errored with exit code `{}`, stdout:\n{}stderr:\n{}'.format(
command, restart_code, restart_stdout.strip(), restart_err.strip()))
logger.error('supervisorctl {} {} errored with exit code `{}`, stdout:\n{}stderr:\n{}'.format(
command, service, restart_code, restart_stdout.strip(), restart_err.strip()))
else:
logger.info('supervisorctl {} finished, stdout:\n{}'.format(
command, restart_stdout.strip()))
logger.debug(
'supervisorctl {} {} succeeded'.format(command, service)
)
else:
logger.info('Submitted supervisorctl {} command, not waiting for result'.format(command))
def stop_local_services(communicate=True):
logger.warn('Stopping services on this node in response to user action')
_supervisor_service_command(command='stop', communicate=communicate)
supervisor_service_command(command='stop', communicate=communicate)

View File

@@ -93,7 +93,7 @@ class WebsocketTask():
try:
async with aiohttp.ClientSession(headers={'secret': secret_val},
timeout=timeout) as session:
async with session.ws_connect(uri, ssl=self.verify_ssl) as websocket:
async with session.ws_connect(uri, ssl=self.verify_ssl, heartbeat=20) as websocket:
self.stats.record_connection_established()
attempt = 0
await self.run_loop(websocket)
@@ -105,17 +105,14 @@ class WebsocketTask():
raise
except client_exceptions.ClientConnectorError as e:
logger.warn(f"Failed to connect to {self.remote_host}: '{e}'. Reconnecting ...")
self.stats.record_connection_lost()
self.start(attempt=attempt + 1)
except asyncio.TimeoutError:
logger.warn(f"Timeout while trying to connect to {self.remote_host}. Reconnecting ...")
self.stats.record_connection_lost()
self.start(attempt=attempt + 1)
except Exception as e:
# Early on, this is our canary. I'm not sure what exceptions we can really encounter.
logger.warn(f"Websocket broadcast client exception {type(e)} {e}")
self.stats.record_connection_lost()
self.start(attempt=attempt + 1)
self.stats.record_connection_lost()
self.start(attempt=attempt + 1)
def start(self, attempt=0):
self.async_task = self.event_loop.create_task(self.connect(attempt=attempt))

View File

@@ -144,6 +144,7 @@
changed_when: "'was installed successfully' in galaxy_result.stdout"
environment:
ANSIBLE_FORCE_COLOR: false
GIT_SSH_COMMAND: "ssh -o StrictHostKeyChecking=no"
when: roles_enabled|bool
tags:
@@ -165,6 +166,7 @@
environment:
ANSIBLE_FORCE_COLOR: false
ANSIBLE_COLLECTIONS_PATHS: "{{ collections_destination }}"
GIT_SSH_COMMAND: "ssh -o StrictHostKeyChecking=no"
when:
- "ansible_version.full is version_compare('2.8', '>=')"

View File

@@ -405,17 +405,6 @@ AWX_ISOLATED_PERIODIC_CHECK = 600
# Verbosity level for isolated node management tasks
AWX_ISOLATED_VERBOSITY = 0
# Memcached django cache configuration
# CACHES = {
# 'default': {
# 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
# 'LOCATION': '127.0.0.1:11211',
# 'TIMEOUT': 864000,
# 'KEY_PREFIX': 'tower_dev',
# }
# }
DEVSERVER_DEFAULT_ADDR = '0.0.0.0'
DEVSERVER_DEFAULT_PORT = '8013'
@@ -458,7 +447,7 @@ CELERYBEAT_SCHEDULE = {
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
'LOCATION': 'memcached:11211',
'LOCATION': 'unix:/var/run/memcached/memcached.sock'
},
}
@@ -1022,8 +1011,9 @@ LOGGING = {
'formatter': 'simple',
},
'external_logger': {
'class': 'awx.main.utils.handlers.AWXProxyHandler',
'class': 'awx.main.utils.handlers.RSysLogHandler',
'formatter': 'json',
'address': '/var/run/awx-rsyslog/rsyslog.sock',
'filters': ['external_log_enabled', 'dynamic_level_filter'],
},
'tower_warnings': {
@@ -1053,6 +1043,15 @@ LOGGING = {
'backupCount': 5,
'formatter':'dispatcher',
},
'wsbroadcast': {
# don't define a level here, it's set by settings.LOG_AGGREGATOR_LEVEL
'class': 'logging.handlers.RotatingFileHandler',
'filters': ['require_debug_false', 'dynamic_level_filter'],
'filename': os.path.join(LOG_ROOT, 'wsbroadcast.log'),
'maxBytes': 1024 * 1024 * 5, # 5 MB
'backupCount': 5,
'formatter':'simple',
},
'celery.beat': {
'class':'logging.StreamHandler',
'level': 'ERROR'
@@ -1140,6 +1139,9 @@ LOGGING = {
'awx.main.dispatch': {
'handlers': ['dispatcher'],
},
'awx.main.wsbroadcast': {
'handlers': ['wsbroadcast'],
},
'awx.isolated.manager.playbooks': {
'handlers': ['management_playbooks'],
'propagate': False
@@ -1229,7 +1231,6 @@ MIDDLEWARE = [
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'awx.main.middleware.ActivityStreamMiddleware',
'awx.sso.middleware.SocialAuthMiddleware',
'crum.CurrentRequestUserMiddleware',
'awx.main.middleware.URLModificationMiddleware',

View File

@@ -40,7 +40,6 @@ NOTEBOOK_ARGUMENTS = [
]
# print SQL queries in shell_plus
SHELL_PLUS = 'ipython'
SHELL_PLUS_PRINT_SQL = False
# show colored logs in the dev environment

View File

@@ -55,6 +55,7 @@ AWX_ISOLATED_USERNAME = 'awx'
LOGGING['handlers']['tower_warnings']['filename'] = '/var/log/tower/tower.log' # noqa
LOGGING['handlers']['callback_receiver']['filename'] = '/var/log/tower/callback_receiver.log' # noqa
LOGGING['handlers']['dispatcher']['filename'] = '/var/log/tower/dispatcher.log' # noqa
LOGGING['handlers']['wsbroadcast']['filename'] = '/var/log/tower/wsbroadcast.log' # noqa
LOGGING['handlers']['task_system']['filename'] = '/var/log/tower/task_system.log' # noqa
LOGGING['handlers']['management_playbooks']['filename'] = '/var/log/tower/management_playbooks.log' # noqa
LOGGING['handlers']['system_tracking_migrations']['filename'] = '/var/log/tower/tower_system_tracking_migrations.log' # noqa

View File

@@ -29,7 +29,7 @@
<div class="at-Row-container">
<div class="at-Row-container">
<at-row-item
header-value="{{ approval.summary_fields.source_workflow_job.name }}"
header-value="{{ approval.summary_fields.source_workflow_job.id }} - {{ approval.summary_fields.source_workflow_job.name }}"
header-state="workflowResults({id: {{approval.summary_fields.source_workflow_job.id}}})">
</at-row-item>
</div>

View File

@@ -92,6 +92,7 @@ export default [
var populateFromApi = function() {
SettingsService.getCurrentValues()
.then(function(data) {
$scope.logAggregatorEnabled = data.LOG_AGGREGATOR_ENABLED;
// these two values need to be unnested from the
// OAUTH2_PROVIDER key
data.ACCESS_TOKEN_EXPIRE_SECONDS = data
@@ -538,8 +539,11 @@ export default [
var payload = {};
payload[key] = $scope[key];
SettingsService.patchConfiguration(payload)
.then(function() {
.then(function(data) {
//TODO consider updating form values with returned data here
if (key === 'LOG_AGGREGATOR_ENABLED') {
$scope.logAggregatorEnabled = data.LOG_AGGREGATOR_ENABLED;
}
})
.catch(function(data) {
//Change back on unsuccessful update

View File

@@ -17,7 +17,7 @@ export default [
'ProcessErrors',
'ngToast',
'$filter',
function(
function (
$rootScope, $scope, $stateParams,
systemActivityStreamForm,
systemLoggingForm,
@@ -41,8 +41,8 @@ export default [
formTracker.setCurrentSystem(activeSystemForm);
}
var activeForm = function(tab) {
if(!_.get($scope.$parent, [formTracker.currentFormName(), '$dirty'])) {
var activeForm = function (tab) {
if (!_.get($scope.$parent, [formTracker.currentFormName(), '$dirty'])) {
systemVm.activeSystemForm = tab;
formTracker.setCurrentSystem(systemVm.activeSystemForm);
} else {
@@ -52,7 +52,7 @@ export default [
label: i18n._('Discard changes'),
"class": "btn Form-cancelButton",
"id": "formmodal-cancel-button",
onClick: function() {
onClick: function () {
$scope.$parent.vm.populateFromApi();
$scope.$parent[formTracker.currentFormName()].$setPristine();
systemVm.activeSystemForm = tab;
@@ -61,15 +61,15 @@ export default [
}
}, {
label: i18n._('Save changes'),
onClick: function() {
onClick: function () {
$scope.$parent.vm.formSave()
.then(function() {
$scope.$parent[formTracker.currentFormName()].$setPristine();
$scope.$parent.vm.populateFromApi();
systemVm.activeSystemForm = tab;
formTracker.setCurrentSystem(systemVm.activeSystemForm);
$('#FormModal-dialog').dialog('close');
});
.then(function () {
$scope.$parent[formTracker.currentFormName()].$setPristine();
$scope.$parent.vm.populateFromApi();
systemVm.activeSystemForm = tab;
formTracker.setCurrentSystem(systemVm.activeSystemForm);
$('#FormModal-dialog').dialog('close');
});
},
"class": "btn btn-primary",
"id": "formmodal-save-button"
@@ -80,9 +80,9 @@ export default [
};
var dropdownOptions = [
{label: i18n._('Misc. System'), value: 'misc'},
{label: i18n._('Activity Stream'), value: 'activity_stream'},
{label: i18n._('Logging'), value: 'logging'},
{ label: i18n._('Misc. System'), value: 'misc' },
{ label: i18n._('Activity Stream'), value: 'activity_stream' },
{ label: i18n._('Logging'), value: 'logging' },
];
var systemForms = [{
@@ -97,14 +97,14 @@ export default [
}];
var forms = _.map(systemForms, 'formDef');
_.each(forms, function(form) {
_.each(forms, function (form) {
var keys = _.keys(form.fields);
_.each(keys, function(key) {
if($scope.configDataResolve[key].type === 'choice') {
_.each(keys, function (key) {
if ($scope.configDataResolve[key].type === 'choice') {
// Create options for dropdowns
var optionsGroup = key + '_options';
$scope.$parent.$parent[optionsGroup] = [];
_.each($scope.configDataResolve[key].choices, function(choice){
_.each($scope.configDataResolve[key].choices, function (choice) {
$scope.$parent.$parent[optionsGroup].push({
name: choice[0],
label: choice[1],
@@ -121,7 +121,7 @@ export default [
function addFieldInfo(form, key) {
_.extend(form.fields[key], {
awPopOver: ($scope.configDataResolve[key].defined_in_file) ?
null: $scope.configDataResolve[key].help_text,
null : $scope.configDataResolve[key].help_text,
label: $scope.configDataResolve[key].label,
name: key,
toggleSource: key,
@@ -138,7 +138,7 @@ export default [
$scope.$parent.$parent.parseType = 'json';
_.each(systemForms, function(form) {
_.each(systemForms, function (form) {
generator.inject(form.formDef, {
id: form.id,
mode: 'edit',
@@ -150,37 +150,37 @@ export default [
var dropdownRendered = false;
$scope.$on('populated', function() {
$scope.$on('populated', function () {
populateLogAggregator(false);
});
$scope.$on('LOG_AGGREGATOR_TYPE_populated', function(e, data, flag) {
$scope.$on('LOG_AGGREGATOR_TYPE_populated', function (e, data, flag) {
populateLogAggregator(flag);
});
$scope.$on('LOG_AGGREGATOR_PROTOCOL_populated', function(e, data, flag) {
$scope.$on('LOG_AGGREGATOR_PROTOCOL_populated', function (e, data, flag) {
populateLogAggregator(flag);
});
function populateLogAggregator(flag){
function populateLogAggregator(flag) {
if($scope.$parent.$parent.LOG_AGGREGATOR_TYPE !== null) {
if ($scope.$parent.$parent.LOG_AGGREGATOR_TYPE !== null) {
$scope.$parent.$parent.LOG_AGGREGATOR_TYPE = _.find($scope.$parent.$parent.LOG_AGGREGATOR_TYPE_options, { value: $scope.$parent.$parent.LOG_AGGREGATOR_TYPE });
}
if($scope.$parent.$parent.LOG_AGGREGATOR_PROTOCOL !== null) {
if ($scope.$parent.$parent.LOG_AGGREGATOR_PROTOCOL !== null) {
$scope.$parent.$parent.LOG_AGGREGATOR_PROTOCOL = _.find($scope.$parent.$parent.LOG_AGGREGATOR_PROTOCOL_options, { value: $scope.$parent.$parent.LOG_AGGREGATOR_PROTOCOL });
}
if($scope.$parent.$parent.LOG_AGGREGATOR_LEVEL !== null) {
if ($scope.$parent.$parent.LOG_AGGREGATOR_LEVEL !== null) {
$scope.$parent.$parent.LOG_AGGREGATOR_LEVEL = _.find($scope.$parent.$parent.LOG_AGGREGATOR_LEVEL_options, { value: $scope.$parent.$parent.LOG_AGGREGATOR_LEVEL });
}
if(flag !== undefined){
if (flag !== undefined) {
dropdownRendered = flag;
}
if(!dropdownRendered) {
if (!dropdownRendered) {
dropdownRendered = true;
CreateSelect2({
element: '#configuration_logging_template_LOG_AGGREGATOR_TYPE',
@@ -193,33 +193,52 @@ export default [
}
}
$scope.$parent.vm.testLogging = function() {
Rest.setUrl("/api/v2/settings/logging/test/");
Rest.post($scope.$parent.vm.getFormPayload())
.then(() => {
ngToast.success({
content: `<i class="fa fa-check-circle
Toast-successIcon"></i>` +
i18n._('Log aggregator test successful.')
});
})
.catch(({data, status}) => {
if (status === 500) {
ngToast.danger({
content: '<i class="fa fa-exclamation-triangle Toast-successIcon"></i>' +
i18n._('Log aggregator test failed. <br> Detail: ') + $filter('sanitize')(data.error),
additionalClasses: "LogAggregator-failedNotification"
$scope.$watchGroup(['configuration_logging_template_form.$pending', 'configuration_logging_template_form.$dirty', '!logAggregatorEnabled'], (vals) => {
if (vals.some(val => val === true)) {
$scope.$parent.vm.disableTestButton = true;
$scope.$parent.vm.testTooltip = i18n._('Save and enable log aggregation before testing the log aggregator.');
} else {
$scope.$parent.vm.disableTestButton = false;
$scope.$parent.vm.testTooltip = i18n._('Send a test log message to the configured log aggregator.');
}
});
$scope.$parent.vm.testLogging = function () {
if (!$scope.$parent.vm.disableTestButton) {
$scope.$parent.vm.disableTestButton = true;
Rest.setUrl("/api/v2/settings/logging/test/");
Rest.post({})
.then(() => {
$scope.$parent.vm.disableTestButton = false;
ngToast.success({
dismissButton: false,
dismissOnTimeout: true,
content: `<i class="fa fa-check-circle
Toast-successIcon"></i>` +
i18n._('Log aggregator test sent successfully.')
});
} else {
ProcessErrors($scope, data, status, null,
{
hdr: i18n._('Error!'),
msg: i18n._('There was an error testing the ' +
'log aggregator. Returned status: ') +
status
})
.catch(({ data, status }) => {
$scope.$parent.vm.disableTestButton = false;
if (status === 400 || status === 500) {
ngToast.danger({
dismissButton: false,
dismissOnTimeout: true,
content: '<i class="fa fa-exclamation-triangle Toast-successIcon"></i>' +
i18n._('Log aggregator test failed. <br> Detail: ') + $filter('sanitize')(data.error),
additionalClasses: "LogAggregator-failedNotification"
});
}
});
} else {
ProcessErrors($scope, data, status, null,
{
hdr: i18n._('Error!'),
msg: i18n._('There was an error testing the ' +
'log aggregator. Returned status: ') +
status
});
}
});
}
};
angular.extend(systemVm, {

View File

@@ -75,10 +75,13 @@
class: 'Form-resetAll'
},
testLogging: {
ngClass: "{'Form-button--disabled': vm.disableTestButton}",
ngClick: 'vm.testLogging()',
label: i18n._('Test'),
class: 'btn-primary',
ngDisabled: 'configuration_logging_template_form.$invalid'
class: 'Form-primaryButton',
awToolTip: '{{vm.testTooltip}}',
dataTipWatch: 'vm.testTooltip',
dataPlacement: 'top',
},
cancel: {
ngClick: 'vm.formCancel()',

View File

@@ -657,10 +657,10 @@ function(SettingsUtils, i18n, $rootScope) {
query += '&credential_type__namespace=ssh&role_level=use_role';
break;
case 'scm_credential':
query += '&redential_type__namespace=scm&role_level=use_role';
query += '&credential_type__namespace=scm&role_level=use_role';
break;
case 'network_credential':
query += '&redential_type__namespace=net&role_level=use_role';
query += '&credential_type__namespace=net&role_level=use_role';
break;
case 'cloud_credential':
query += '&cloud=true&role_level=use_role';

View File

@@ -1690,6 +1690,9 @@ angular.module('FormGenerator', [GeneratorHelpers.name, 'Utilities', listGenerat
if (button.ngClick) {
html += this.attr(button, 'ngClick');
}
if (button.ngClass) {
html += this.attr(button, 'ngClass');
}
if (button.ngDisabled) {
ngDisabled = (button.ngDisabled===true) ? `${this.form.name}_form.$invalid || ${this.form.name}_form.$pending`: button.ngDisabled;
if (btn !== 'reset') {

View File

@@ -34,9 +34,9 @@
<div class="List-tableHeader--info col-md-4" base-path="unified_job_templates" collection="wf_maker_templates" dataset="wf_maker_template_dataset" column-sort="" column-field="info" column-iterator="wf_maker_template" column-no-sort="true" column-label="" column-custom-class="" query-set="wf_maker_template_queryset"></div>
</div>
</div>
<div ng-class="[template.success_class, {'List-tableRow--selected' : $stateParams['template_id'] == wf_maker_template.id}, {'List-tableRow--disabled': !wf_maker_template.summary_fields.user_capabilities.edit}]" id="{{ wf_maker_template.id }}" class="List-lookupLayout List-tableRow template_class" disable-row="{{ !wf_maker_template.summary_fields.user_capabilities.edit }}" ng-repeat="wf_maker_template in wf_maker_templates">
<div ng-class="[template.success_class, {'List-tableRow--selected' : $stateParams['template_id'] == wf_maker_template.id}, {'List-tableRow--disabled': !wf_maker_template.summary_fields.user_capabilities.start}]" id="{{ wf_maker_template.id }}" class="List-lookupLayout List-tableRow template_class" disable-row="{{ !wf_maker_template.summary_fields.user_capabilities.start }}" ng-repeat="wf_maker_template in wf_maker_templates">
<div class="List-centerEnd select-column">
<input type="radio" ng-model="wf_maker_template.checked" ng-value="1" ng-false-value="0" name="check_template_{{wf_maker_template.id}}" ng-click="selectTemplate(wf_maker_template)" ng-disabled="!wf_maker_template.summary_fields.user_capabilities.edit">
<input type="radio" ng-model="wf_maker_template.checked" ng-value="1" ng-false-value="0" name="check_template_{{wf_maker_template.id}}" ng-click="selectTemplate(wf_maker_template)" ng-disabled="!wf_maker_template.summary_fields.user_capabilities.start">
</div>
<div class="d-flex h-100">
<div class="List-tableCell name-column col-md-8" ng-click="selectTemplate(wf_maker_template)">

View File

@@ -14,6 +14,7 @@ Have questions about this document or anything not covered here? Feel free to re
* [Accessing the AWX web interface](#accessing-the-awx-web-interface)
* [AWX REST API Interaction](#awx-rest-api-interaction)
* [Handling API Errors](#handling-api-errors)
* [Forms](#forms)
* [Working with React](#working-with-react)
* [App structure](#app-structure)
* [Naming files](#naming-files)
@@ -30,6 +31,10 @@ Have questions about this document or anything not covered here? Feel free to re
- All code submissions are done through pull requests against the `devel` branch.
- If collaborating with someone else on the same branch, please use `--force-with-lease` instead of `--force` when pushing up code. This will prevent you from accidentally overwriting commits pushed by someone else. For more information, see https://git-scm.com/docs/git-push#git-push---force-with-leaseltrefnamegt
- We use a [code formatter](https://prettier.io/). Before adding a new commit or opening a PR, please apply the formatter using `npm run prettier`
- We adopt the following code style guide:
- functions should adopt camelCase
- constructors/classes should adopt PascalCase
- constants to be exported should adopt UPPERCASE
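A hedged illustration of the naming conventions above — the names here are hypothetical examples, not identifiers from the codebase:

```javascript
// Exported constants: UPPERCASE (hypothetical value).
const MAX_RETRIES = 3;

// Constructors/classes: PascalCase (hypothetical class).
class JobTemplate {}

// Functions: camelCase (hypothetical helper).
function buildQueryString(params) {
  return Object.entries(params)
    .map(([key, value]) => `${key}=${encodeURIComponent(value)}`)
    .join('&');
}
```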
## Setting up your development environment
@@ -76,7 +81,7 @@ Note that mixins can be chained. See the example below.
Example of a model using multiple mixins:
```
```javascript
import NotificationsMixin from '../mixins/Notifications.mixin';
import InstanceGroupsMixin from '../mixins/InstanceGroups.mixin';
@@ -91,7 +96,7 @@ export default Organizations;
Example of mocking a specific method for every test in a suite:
```
```javascript
import { OrganizationsAPI } from '../../../../src/api';
// Mocks out all available methods. Comparable to:
@@ -124,6 +129,9 @@ API requests can and will fail occasionally so they should include explicit erro
- **other errors** - Most errors will fall into the first two categories, but for miscellaneous actions like toggling notifications, deleting a list item, etc. we display an alert modal to notify the user that their requested action couldn't be performed.
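A minimal sketch of routing a failed request into one of these buckets. The error shape and helper name are assumptions for illustration, not the repo's actual utilities:

```javascript
// Hypothetical classifier mapping a failed API call to a UI response.
function classifyApiError(error, context) {
  const status = error && error.response ? error.response.status : null;
  if (context === 'initial-load') {
    // Content errors: replace the view with an error state.
    return { kind: 'contentError', status };
  }
  if (context === 'form-submit' && status === 400) {
    // Form submission errors: field-level detail lives in the response body.
    return { kind: 'formError', fields: error.response.data };
  }
  // Other errors (toggling notifications, deleting a list item, etc.):
  // notify the user via an alert modal.
  return { kind: 'alertModal', status };
}
```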
## Forms
Our forms should have a known, consistent, and fully-resolved starting state before it is possible for a user (via keyboard, mouse, screen reader, or automated test) to interact with them. If multiple network calls are needed to populate a form, resolve them all before displaying the form or showing a content error. When multiple requests are needed to create or update the resources represented by a form, resolve them all before transitioning the UI to a success or failure state.
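As a rough sketch of the first rule — with hypothetical fetcher arguments, not the actual component code — all requests settle before the form is considered ready:

```javascript
// Sketch: resolve every request before showing the form (hypothetical API).
async function loadFormData(fetchTemplate, fetchLabels, fetchInstanceGroups) {
  // Promise.all rejects on the first failure, so one catch in the caller
  // can show a content error instead of a half-populated form.
  const [template, labels, instanceGroups] = await Promise.all([
    fetchTemplate(),
    fetchLabels(),
    fetchInstanceGroups(),
  ]);
  return { template, labels, instanceGroups, isReady: true };
}
```

A component would keep the form hidden (or show a spinner) until `isReady` flips, and render a content error if the combined promise rejects.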
## Working with React
### App structure
@@ -164,7 +172,7 @@ Ideally, files should be named the same as the component they export, and tests
**File naming** - Since contexts export both consumer and provider (and potentially in withContext function form), the file can be simplified to be named after the consumer export. In other words, the file containing the `Network` context components would be named `Network.jsx`.
**Component naming and conventions** - In order to provide a consistent interface with react-router and lingui, as well as make their usage easier and less verbose, context components follow these conventions:
**Component naming and conventions** - In order to provide a consistent interface with react-router and [lingui](https://lingui.js.org/), as well as make their usage easier and less verbose, context components follow these conventions:
- Providers are wrapped in a component in the `FooProvider` format.
- The value prop of the provider should be pulled from state. This is recommended by the react docs, [here](https://reactjs.org/docs/context.html#caveats).
- The provider should also be able to accept its value by prop for testing.
@@ -262,7 +270,7 @@ We have several React contexts that wrap much of the app, including those from r
If you want to stub the value of a context, or assert actions taken on it, you can customize a contexts value by passing a second parameter to `mountWithContexts`. For example, this provides a custom value for the `Config` context:
```
```javascript
const config = {
custom_virtualenvs: ['foo', 'bar'],
};
@@ -301,7 +309,7 @@ The lingui library provides various React helpers for dealing with both marking
**Note:** We try to avoid the `I18n` consumer, `i18nMark` function, or `<Trans>` component lingui gives us access to in this repo. i18nMark does not actually replace the string in the UI (leading to the potential for untranslated bugs), and the other helpers are redundant. Settling on a consistent, single pattern helps us ease the mental overhead of the need to understand the ins and outs of the lingui API.
You can learn more about the ways lingui and its React helpers at [this link](https://lingui.js.org/tutorials/react-patterns.html).
You can learn more about lingui and its React helpers at [this link](https://lingui.js.org/tutorials/react-patterns.html).
### Setting up .po files to give to translation team

View File

@@ -11,7 +11,7 @@
* npm start
* visit `https://127.0.0.1:3001/`
**note:** These instructions assume you have the [awx](https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md#running-the-environment) development api server up and running at `localhost:8043`. You can use a different backend server with the `TAGET_HOST` and `TARGET_PORT` environment variables when starting the development server:
**note:** These instructions assume you have the [awx](https://github.com/ansible/awx/blob/devel/CONTRIBUTING.md#running-the-environment) development api server up and running at `localhost:8043`. You can use a different backend server with the `TARGET_HOST` and `TARGET_PORT` environment variables when starting the development server:
```shell
# use a non-default host and port when starting the development server

View File

@@ -2,7 +2,7 @@
## UX Considerations
Historically, the code that powers search in the AngularJS version of the AWX/Tower UI is very complex and prone to bugs. In order to reduce that complexity, we've made some UX desicions to help make the code easier to maintain.
Historically, the code that powers search in the AngularJS version of the AWX/Tower UI is very complex and prone to bugs. In order to reduce that complexity, we've made some UX decisions to help make the code easier to maintain.
**ALL query params namespaced and in url bar**

View File

@@ -4910,6 +4910,201 @@
}
}
},
"@jest/transform": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/@jest/transform/-/transform-25.1.0.tgz",
"integrity": "sha512-4ktrQ2TPREVeM+KxB4zskAT84SnmG1vaz4S+51aTefyqn3zocZUnliLLm5Fsl85I3p/kFPN4CRp1RElIfXGegQ==",
"dev": true,
"requires": {
"@babel/core": "^7.1.0",
"@jest/types": "^25.1.0",
"babel-plugin-istanbul": "^6.0.0",
"chalk": "^3.0.0",
"convert-source-map": "^1.4.0",
"fast-json-stable-stringify": "^2.0.0",
"graceful-fs": "^4.2.3",
"jest-haste-map": "^25.1.0",
"jest-regex-util": "^25.1.0",
"jest-util": "^25.1.0",
"micromatch": "^4.0.2",
"pirates": "^4.0.1",
"realpath-native": "^1.1.0",
"slash": "^3.0.0",
"source-map": "^0.6.1",
"write-file-atomic": "^3.0.0"
},
"dependencies": {
"ansi-styles": {
"version": "4.2.1",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.2.1.tgz",
"integrity": "sha512-9VGjrMsG1vePxcSweQsN20KY/c4zN0h9fLjqAbwbPfahM3t+NL+M9HC8xeXG2I8pX5NoamTGNuomEUFI7fcUjA==",
"dev": true,
"requires": {
"@types/color-name": "^1.1.1",
"color-convert": "^2.0.1"
}
},
"braces": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
"integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
"dev": true,
"requires": {
"fill-range": "^7.0.1"
}
},
"chalk": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-3.0.0.tgz",
"integrity": "sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==",
"dev": true,
"requires": {
"ansi-styles": "^4.1.0",
"supports-color": "^7.1.0"
}
},
"color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
"dev": true,
"requires": {
"color-name": "~1.1.4"
}
},
"color-name": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
"dev": true
},
"fill-range": {
"version": "7.0.1",
"resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
"integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
"dev": true,
"requires": {
"to-regex-range": "^5.0.1"
}
},
"graceful-fs": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.3.tgz",
"integrity": "sha512-a30VEBm4PEdx1dRB7MFK7BejejvCvBronbLjht+sHuGYj8PHs7M/5Z+rt5lw551vZ7yfTCj4Vuyy3mSJytDWRQ==",
"dev": true
},
"has-flag": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
"dev": true
},
"is-number": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz",
"integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==",
"dev": true
},
"jest-regex-util": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/jest-regex-util/-/jest-regex-util-25.1.0.tgz",
"integrity": "sha512-9lShaDmDpqwg+xAd73zHydKrBbbrIi08Kk9YryBEBybQFg/lBWR/2BDjjiSE7KIppM9C5+c03XiDaZ+m4Pgs1w==",
"dev": true
},
"micromatch": {
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.2.tgz",
"integrity": "sha512-y7FpHSbMUMoyPbYUSzO6PaZ6FyRnQOpHuKwbo1G+Knck95XVU4QAiKdGEnj5wwoS7PlOgthX/09u5iFJ+aYf5Q==",
"dev": true,
"requires": {
"braces": "^3.0.1",
"picomatch": "^2.0.5"
}
},
"supports-color": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.1.0.tgz",
"integrity": "sha512-oRSIpR8pxT1Wr2FquTNnGet79b3BWljqOuoW/h4oBhxJ/HUbX5nX6JSruTkvXDCFMwDPvsaTTbvMLKZWSy0R5g==",
"dev": true,
"requires": {
"has-flag": "^4.0.0"
}
},
"to-regex-range": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
"integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==",
"dev": true,
"requires": {
"is-number": "^7.0.0"
}
}
}
},
"@jest/types": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/@jest/types/-/types-25.1.0.tgz",
"integrity": "sha512-VpOtt7tCrgvamWZh1reVsGADujKigBUFTi19mlRjqEGsE8qH4r3s+skY33dNdXOwyZIvuftZ5tqdF1IgsMejMA==",
"dev": true,
"requires": {
"@types/istanbul-lib-coverage": "^2.0.0",
"@types/istanbul-reports": "^1.1.1",
"@types/yargs": "^15.0.0",
"chalk": "^3.0.0"
},
"dependencies": {
"ansi-styles": {
"version": "4.2.1",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.2.1.tgz",
"integrity": "sha512-9VGjrMsG1vePxcSweQsN20KY/c4zN0h9fLjqAbwbPfahM3t+NL+M9HC8xeXG2I8pX5NoamTGNuomEUFI7fcUjA==",
"dev": true,
"requires": {
"@types/color-name": "^1.1.1",
"color-convert": "^2.0.1"
}
},
"chalk": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-3.0.0.tgz",
"integrity": "sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==",
"dev": true,
"requires": {
"ansi-styles": "^4.1.0",
"supports-color": "^7.1.0"
}
},
"color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
"dev": true,
"requires": {
"color-name": "~1.1.4"
}
},
"color-name": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
"dev": true
},
"has-flag": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
"dev": true
},
"supports-color": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.1.0.tgz",
"integrity": "sha512-oRSIpR8pxT1Wr2FquTNnGet79b3BWljqOuoW/h4oBhxJ/HUbX5nX6JSruTkvXDCFMwDPvsaTTbvMLKZWSy0R5g==",
"dev": true,
"requires": {
"has-flag": "^4.0.0"
}
}
}
},
"@lingui/babel-plugin-extract-messages": {
"version": "2.7.4",
"resolved": "https://registry.npmjs.org/@lingui/babel-plugin-extract-messages/-/babel-plugin-extract-messages-2.7.4.tgz",
@@ -5096,51 +5291,50 @@
"dev": true
},
"@patternfly/patternfly": {
"version": "2.66.0",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.66.0.tgz",
"integrity": "sha512-fZMr2q9LZhVtKAEcDJ4rzcCGC6iN93mEQPoLlv2T9td5Hba1bLw8Bpgp5fdTm95Fv/++AY0PsdUPZUzh1cx7Sg=="
"version": "2.71.3",
"resolved": "https://registry.npmjs.org/@patternfly/patternfly/-/patternfly-2.71.3.tgz",
"integrity": "sha512-uTb9zAtPjTKB8aHmWdavEOrSMs+NL9XovMvWYL9R74zXbGnEMHEpibn7cNSu469u2JrxY6VsH7x44aOfdZpqpg=="
},
"@patternfly/react-core": {
"version": "3.140.11",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.140.11.tgz",
"integrity": "sha512-841DeN5BTuUS02JfVXAAVJYtWY0HWc4ewqMD32Xog2MAR/pn74jzjnQOSQr4LUyVrH5QufB68SK4Alm2+IUzSw==",
"version": "3.153.3",
"resolved": "https://registry.npmjs.org/@patternfly/react-core/-/react-core-3.153.3.tgz",
"integrity": "sha512-2ccnn/HPfEhZfj9gfKZJpWgzOA9O6QeCHjZGh41tx7Lz7iZGl9b/UdTmDsQUeYYuJ+0M8fxhYnQMKaDxfcqyOQ==",
"requires": {
"@patternfly/react-icons": "^3.15.3",
"@patternfly/react-styles": "^3.7.4",
"@patternfly/react-tokens": "^2.8.4",
"emotion": "^9.2.9",
"exenv": "^1.2.2",
"focus-trap-react": "^4.0.1",
"@patternfly/react-icons": "^3.15.15",
"@patternfly/react-styles": "^3.7.12",
"@patternfly/react-tokens": "^2.8.12",
"focus-trap": "4.0.2",
"react-dropzone": "9.0.0",
"tippy.js": "5.1.2"
},
"dependencies": {
"@patternfly/react-icons": {
"version": "3.15.4",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.15.4.tgz",
"integrity": "sha512-tOVirISoZDIn0bWYFctGN9B7Q8wQ19FaK4XIUD2sgIDRBzDbe9JWuqdef7ogJFF78eQnZNsWOci6nhvVCVF/zA==",
"version": "3.15.15",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.15.15.tgz",
"integrity": "sha512-oYOgY7fELe3gKbKB2KRUANpYPWkKkEGpmKdmXonNmNUlg0t/a8V68raVX8bTjXN9pwKsUKqNQW1R+xFibtt0Aw==",
"requires": {
"@fortawesome/free-brands-svg-icons": "^5.8.1"
}
},
"@patternfly/react-tokens": {
"version": "2.8.4",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.8.4.tgz",
"integrity": "sha512-GlLyutls0bG39Nwl/sv2FUkicwyRNrXQFso+e7Y4470+VOUtSsVSdQz+rTjgPxQ38olKPsSZdtEjqN9o2PbDiw=="
"version": "2.8.12",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.8.12.tgz",
"integrity": "sha512-QyuMaTizuSn9eESl6bcopGKKgFydocc/N8T7OGB6jARBt6gdIoQWcztdBabSIVz/YGoEDw6lKeoNfed8p6GynA=="
}
}
},
"@patternfly/react-icons": {
"version": "3.15.4",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.15.4.tgz",
"integrity": "sha512-tOVirISoZDIn0bWYFctGN9B7Q8wQ19FaK4XIUD2sgIDRBzDbe9JWuqdef7ogJFF78eQnZNsWOci6nhvVCVF/zA==",
"version": "3.15.15",
"resolved": "https://registry.npmjs.org/@patternfly/react-icons/-/react-icons-3.15.15.tgz",
"integrity": "sha512-oYOgY7fELe3gKbKB2KRUANpYPWkKkEGpmKdmXonNmNUlg0t/a8V68raVX8bTjXN9pwKsUKqNQW1R+xFibtt0Aw==",
"requires": {
"@fortawesome/free-brands-svg-icons": "^5.8.1"
}
},
"@patternfly/react-styles": {
"version": "3.7.4",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.7.4.tgz",
"integrity": "sha512-D+wu0OIfWVgxWNShQhTK9cadw+KdMCoBYR8gbWjV9Q1aCsCEV/aL/x1nMyyaUQ3c2dqizHhujDG4z9jUZCmCcw==",
"version": "3.7.12",
"resolved": "https://registry.npmjs.org/@patternfly/react-styles/-/react-styles-3.7.12.tgz",
"integrity": "sha512-vTKyC78oKlrS6VTQ3GPYevc17qgxj2Ono+SCDwoMyhUexPEyXRuZHLoZA1/MkJHvSCqJHGBageBAFcRq5wb0XQ==",
"requires": {
"camel-case": "^3.0.0",
"css": "^2.2.3",
@@ -5150,9 +5344,9 @@
}
},
"@patternfly/react-tokens": {
"version": "2.8.4",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.8.4.tgz",
"integrity": "sha512-GlLyutls0bG39Nwl/sv2FUkicwyRNrXQFso+e7Y4470+VOUtSsVSdQz+rTjgPxQ38olKPsSZdtEjqN9o2PbDiw=="
"version": "2.8.12",
"resolved": "https://registry.npmjs.org/@patternfly/react-tokens/-/react-tokens-2.8.12.tgz",
"integrity": "sha512-QyuMaTizuSn9eESl6bcopGKKgFydocc/N8T7OGB6jARBt6gdIoQWcztdBabSIVz/YGoEDw6lKeoNfed8p6GynA=="
},
"@sinonjs/commons": {
"version": "1.7.1",
@@ -5260,6 +5454,15 @@
"integrity": "sha512-l42BggppR6zLmpfU6fq9HEa2oGPEI8yrSPL3GITjfRInppYFahObbIQOQK3UGxEnyQpltZLaPe75046NOZQikw==",
"dev": true
},
"@types/yargs": {
"version": "15.0.4",
"resolved": "https://registry.npmjs.org/@types/yargs/-/yargs-15.0.4.tgz",
"integrity": "sha512-9T1auFmbPZoxHz0enUFlUuKRy3it01R+hlggyVUMtnCTQRunsQYifnSGb8hET4Xo8yiC0o0r1paW3ud5+rbURg==",
"dev": true,
"requires": {
"@types/yargs-parser": "*"
}
},
"@types/yargs-parser": {
"version": "15.0.0",
"resolved": "https://registry.npmjs.org/@types/yargs-parser/-/yargs-parser-15.0.0.tgz",
@@ -6210,6 +6413,14 @@
"resolved": "https://registry.npmjs.org/atob/-/atob-2.1.2.tgz",
"integrity": "sha512-Wm6ukoaOGJi/73p/cl2GvLjTI5JM1k/O14isD73YML8StrH/7/lRFgmg8nICZgD3bZZvjwCGxtMOD3wWNAu8cg=="
},
"attr-accept": {
"version": "1.1.3",
"resolved": "https://registry.npmjs.org/attr-accept/-/attr-accept-1.1.3.tgz",
"integrity": "sha512-iT40nudw8zmCweivz6j58g+RT33I4KbaIvRUhjNmDwO2WmsQUxFEZZYZ5w3vXe5x5MX9D7mfvA/XaLOZYFR9EQ==",
"requires": {
"core-js": "^2.5.0"
}
},
"aws-sign2": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/aws-sign2/-/aws-sign2-0.7.0.tgz",
@@ -6330,6 +6541,73 @@
}
}
},
"babel-jest": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/babel-jest/-/babel-jest-25.1.0.tgz",
"integrity": "sha512-tz0VxUhhOE2y+g8R2oFrO/2VtVjA1lkJeavlhExuRBg3LdNJY9gwQ+Vcvqt9+cqy71MCTJhewvTB7Qtnnr9SWg==",
"dev": true,
"requires": {
"@jest/transform": "^25.1.0",
"@jest/types": "^25.1.0",
"@types/babel__core": "^7.1.0",
"babel-plugin-istanbul": "^6.0.0",
"babel-preset-jest": "^25.1.0",
"chalk": "^3.0.0",
"slash": "^3.0.0"
},
"dependencies": {
"ansi-styles": {
"version": "4.2.1",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.2.1.tgz",
"integrity": "sha512-9VGjrMsG1vePxcSweQsN20KY/c4zN0h9fLjqAbwbPfahM3t+NL+M9HC8xeXG2I8pX5NoamTGNuomEUFI7fcUjA==",
"dev": true,
"requires": {
"@types/color-name": "^1.1.1",
"color-convert": "^2.0.1"
}
},
"chalk": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-3.0.0.tgz",
"integrity": "sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==",
"dev": true,
"requires": {
"ansi-styles": "^4.1.0",
"supports-color": "^7.1.0"
}
},
"color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
"dev": true,
"requires": {
"color-name": "~1.1.4"
}
},
"color-name": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
"dev": true
},
"has-flag": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
"dev": true
},
"supports-color": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.1.0.tgz",
"integrity": "sha512-oRSIpR8pxT1Wr2FquTNnGet79b3BWljqOuoW/h4oBhxJ/HUbX5nX6JSruTkvXDCFMwDPvsaTTbvMLKZWSy0R5g==",
"dev": true,
"requires": {
"has-flag": "^4.0.0"
}
}
}
},
"babel-loader": {
"version": "8.0.6",
"resolved": "https://registry.npmjs.org/babel-loader/-/babel-loader-8.0.6.tgz",
@@ -6407,6 +6685,15 @@
"test-exclude": "^6.0.0"
}
},
"babel-plugin-jest-hoist": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/babel-plugin-jest-hoist/-/babel-plugin-jest-hoist-25.1.0.tgz",
"integrity": "sha512-oIsopO41vW4YFZ9yNYoLQATnnN46lp+MZ6H4VvPKFkcc2/fkl3CfE/NZZSmnEIEsJRmJAgkVEK0R7Zbl50CpTw==",
"dev": true,
"requires": {
"@types/babel__traverse": "^7.0.6"
}
},
"babel-plugin-macros": {
"version": "2.4.2",
"resolved": "https://registry.npmjs.org/babel-plugin-macros/-/babel-plugin-macros-2.4.2.tgz",
@@ -6451,6 +6738,17 @@
}
}
},
"babel-preset-jest": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/babel-preset-jest/-/babel-preset-jest-25.1.0.tgz",
"integrity": "sha512-eCGn64olaqwUMaugXsTtGAM2I0QTahjEtnRu0ql8Ie+gDWAc1N6wqN0k2NilnyTunM69Pad7gJY7LOtwLimoFQ==",
"dev": true,
"requires": {
"@babel/plugin-syntax-bigint": "^7.0.0",
"@babel/plugin-syntax-object-rest-spread": "^7.0.0",
"babel-plugin-jest-hoist": "^25.1.0"
}
},
"babel-runtime": {
"version": "6.26.0",
"resolved": "https://registry.npmjs.org/babel-runtime/-/babel-runtime-6.26.0.tgz",
@@ -7898,9 +8196,9 @@
"dev": true
},
"cssom": {
"version": "0.3.4",
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.4.tgz",
"integrity": "sha512-+7prCSORpXNeR4/fUP3rL+TzqtiFfhMvTd7uEqMdgPvLPt4+uzFUeufx5RHjGTACCargg/DiEt/moMQmvnfkog=="
"version": "0.3.8",
"resolved": "https://registry.npmjs.org/cssom/-/cssom-0.3.8.tgz",
"integrity": "sha512-b0tGHbfegbhPJpxpiBPU2sCkigAqtM9O121le6bbOlgyV+NyGyCmVfJ6QW9eRjz8CpNfWEOYBIMIGRYkLwsIYg=="
},
"cssstyle": {
"version": "0.3.1",
@@ -7911,9 +8209,9 @@
}
},
"csstype": {
"version": "2.6.9",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.9.tgz",
"integrity": "sha512-xz39Sb4+OaTsULgUERcCk+TJj8ylkL4aSVDQiX/ksxbELSqwkgt4d4RD7fovIdgJGSuNYqwZEiVjYY5l0ask+Q=="
"version": "2.6.10",
"resolved": "https://registry.npmjs.org/csstype/-/csstype-2.6.10.tgz",
"integrity": "sha512-D34BqZU4cIlMCY93rZHbrq9pjTAQJ3U8S8rfBqjwHxkGPThWFjzZDQpgMJY0QViLxth6ZKYiwFBo14RdN44U/w=="
},
"currently-unhandled": {
"version": "0.4.1",
@@ -9751,11 +10049,6 @@
}
}
},
"exenv": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/exenv/-/exenv-1.2.2.tgz",
"integrity": "sha1-KueOhdmJQVhnCwPUe+wfA72Ru50="
},
"exit": {
"version": "0.1.2",
"resolved": "https://registry.npmjs.org/exit/-/exit-0.1.2.tgz",
@@ -10280,6 +10573,14 @@
"schema-utils": "^1.0.0"
}
},
"file-selector": {
"version": "0.1.12",
"resolved": "https://registry.npmjs.org/file-selector/-/file-selector-0.1.12.tgz",
"integrity": "sha512-Kx7RTzxyQipHuiqyZGf+Nz4vY9R1XGxuQl/hLoJwq+J4avk/9wxxgZyHKtbyIPJmbD4A66DWGYfyykWNpcYutQ==",
"requires": {
"tslib": "^1.9.0"
}
},
"fill-range": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/fill-range/-/fill-range-4.0.0.tgz",
@@ -10414,11 +10715,11 @@
}
},
"focus-trap": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/focus-trap/-/focus-trap-3.0.0.tgz",
"integrity": "sha512-jTFblf0tLWbleGjj2JZsAKbgtZTdL1uC48L8FcmSDl4c2vDoU4NycN1kgV5vJhuq1mxNFkw7uWZ1JAGlINWvyw==",
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/focus-trap/-/focus-trap-4.0.2.tgz",
"integrity": "sha512-HtLjfAK7Hp2qbBtLS6wEznID1mPT+48ZnP2nkHzgjpL4kroYHg0CdqJ5cTXk+UO5znAxF5fRUkhdyfgrhh8Lzw==",
"requires": {
"tabbable": "^3.1.0",
"tabbable": "^3.1.2",
"xtend": "^4.0.1"
},
"dependencies": {
@@ -10429,14 +10730,6 @@
}
}
},
"focus-trap-react": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/focus-trap-react/-/focus-trap-react-4.0.1.tgz",
"integrity": "sha512-UUZKVEn5cFbF6yUnW7lbXNW0iqN617ShSqYKgxctUvWw1wuylLtyVmC0RmPQNnJ/U+zoKc/djb0tZMs0uN/0QQ==",
"requires": {
"focus-trap": "^3.0.0"
}
},
"follow-redirects": {
"version": "1.5.9",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.5.9.tgz",
@@ -10625,8 +10918,7 @@
"ansi-regex": {
"version": "2.1.1",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"aproba": {
"version": "1.2.0",
@@ -10647,14 +10939,12 @@
"balanced-match": {
"version": "1.0.0",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"brace-expansion": {
"version": "1.1.11",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"balanced-match": "^1.0.0",
"concat-map": "0.0.1"
@@ -10669,20 +10959,17 @@
"code-point-at": {
"version": "1.1.0",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"concat-map": {
"version": "0.0.1",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"console-control-strings": {
"version": "1.1.0",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"core-util-is": {
"version": "1.0.2",
@@ -10799,8 +11086,7 @@
"inherits": {
"version": "2.0.3",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"ini": {
"version": "1.3.5",
@@ -10812,7 +11098,6 @@
"version": "1.0.0",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"number-is-nan": "^1.0.0"
}
@@ -10827,7 +11112,6 @@
"version": "3.0.4",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"brace-expansion": "^1.1.7"
}
@@ -10835,14 +11119,12 @@
"minimist": {
"version": "0.0.8",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"minipass": {
"version": "2.3.5",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"safe-buffer": "^5.1.2",
"yallist": "^3.0.0"
@@ -10861,7 +11143,6 @@
"version": "0.5.1",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"minimist": "0.0.8"
}
@@ -10942,8 +11223,7 @@
"number-is-nan": {
"version": "1.0.1",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"object-assign": {
"version": "4.1.1",
@@ -10955,7 +11235,6 @@
"version": "1.4.0",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"wrappy": "1"
}
@@ -11041,8 +11320,7 @@
"safe-buffer": {
"version": "5.1.2",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"safer-buffer": {
"version": "2.1.2",
@@ -11078,7 +11356,6 @@
"version": "1.0.2",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"code-point-at": "^1.0.0",
"is-fullwidth-code-point": "^1.0.0",
@@ -11098,7 +11375,6 @@
"version": "3.0.1",
"bundled": true,
"dev": true,
"optional": true,
"requires": {
"ansi-regex": "^2.0.0"
}
@@ -11142,14 +11418,12 @@
"wrappy": {
"version": "1.0.2",
"bundled": true,
"dev": true,
"optional": true
"dev": true
},
"yallist": {
"version": "3.0.3",
"bundled": true,
"dev": true,
"optional": true
"dev": true
}
}
},
@@ -11629,15 +11903,22 @@
"dev": true
},
"html-tokenize": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/html-tokenize/-/html-tokenize-2.0.0.tgz",
"integrity": "sha1-izqaXetHXK5qb5ZxYA0sIKspglE=",
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/html-tokenize/-/html-tokenize-2.0.1.tgz",
"integrity": "sha512-QY6S+hZ0f5m1WT8WffYN+Hg+xm/w5I8XeUcAq/ZYP5wVC8xbKi4Whhru3FtrAebD5EhBW8rmFzkDI6eCAuFe2w==",
"requires": {
"buffer-from": "~0.1.1",
"inherits": "~2.0.1",
"minimist": "~0.0.8",
"minimist": "~1.2.5",
"readable-stream": "~1.0.27-1",
"through2": "~0.4.1"
},
"dependencies": {
"minimist": {
"version": "1.2.5",
"resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz",
"integrity": "sha512-FM9nNUYrRBAELZQT3xeZQ7fmMOBg6nWNmJKTcgsJeaLstP/UODVpGsr5OhXhhXg6f+qtJ8uiZ+PUxkDWcgIXLw=="
}
}
},
"htmlparser2": {
@@ -14622,6 +14903,99 @@
"integrity": "sha512-/jsz0Y+V29w1chdXVygEKSz2nBoHoYqNShPe+QgxSNjAuP1i8+k4LbQNrfoliKej0P45sivkSCh7yiD6ubHS3w==",
"dev": true
},
"jest-haste-map": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/jest-haste-map/-/jest-haste-map-25.1.0.tgz",
"integrity": "sha512-/2oYINIdnQZAqyWSn1GTku571aAfs8NxzSErGek65Iu5o8JYb+113bZysRMcC/pjE5v9w0Yz+ldbj9NxrFyPyw==",
"dev": true,
"requires": {
"@jest/types": "^25.1.0",
"anymatch": "^3.0.3",
"fb-watchman": "^2.0.0",
"fsevents": "^2.1.2",
"graceful-fs": "^4.2.3",
"jest-serializer": "^25.1.0",
"jest-util": "^25.1.0",
"jest-worker": "^25.1.0",
"micromatch": "^4.0.2",
"sane": "^4.0.3",
"walker": "^1.0.7"
},
"dependencies": {
"anymatch": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/anymatch/-/anymatch-3.1.1.tgz",
"integrity": "sha512-mM8522psRCqzV+6LhomX5wgp25YVibjh8Wj23I5RPkPppSVSjyKD2A2mBJmWGa+KN7f2D6LNh9jkBCeyLktzjg==",
"dev": true,
"requires": {
"normalize-path": "^3.0.0",
"picomatch": "^2.0.4"
}
},
"braces": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/braces/-/braces-3.0.2.tgz",
"integrity": "sha512-b8um+L1RzM3WDSzvhm6gIz1yfTbBt6YTlcEKAvsmqCZZFw46z626lVj9j1yEPW33H5H+lBQpZMP1k8l+78Ha0A==",
"dev": true,
"requires": {
"fill-range": "^7.0.1"
}
},
"fill-range": {
"version": "7.0.1",
"resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz",
"integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==",
"dev": true,
"requires": {
"to-regex-range": "^5.0.1"
}
},
"fsevents": {
"version": "2.1.2",
"resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.1.2.tgz",
"integrity": "sha512-R4wDiBwZ0KzpgOWetKDug1FZcYhqYnUYKtfZYt4mD5SBz76q0KR4Q9o7GIPamsVPGmW3EYPPJ0dOOjvx32ldZA==",
"dev": true,
"optional": true
},
"graceful-fs": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.3.tgz",
"integrity": "sha512-a30VEBm4PEdx1dRB7MFK7BejejvCvBronbLjht+sHuGYj8PHs7M/5Z+rt5lw551vZ7yfTCj4Vuyy3mSJytDWRQ==",
"dev": true
},
"is-number": {
"version": "7.0.0",
"resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz",
"integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==",
"dev": true
},
"micromatch": {
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/micromatch/-/micromatch-4.0.2.tgz",
"integrity": "sha512-y7FpHSbMUMoyPbYUSzO6PaZ6FyRnQOpHuKwbo1G+Knck95XVU4QAiKdGEnj5wwoS7PlOgthX/09u5iFJ+aYf5Q==",
"dev": true,
"requires": {
"braces": "^3.0.1",
"picomatch": "^2.0.5"
}
},
"normalize-path": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz",
"integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==",
"dev": true
},
"to-regex-range": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
"integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==",
"dev": true,
"requires": {
"is-number": "^7.0.0"
}
}
}
},
"jest-jasmine2": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/jest-jasmine2/-/jest-jasmine2-25.1.0.tgz",
@@ -15941,6 +16315,12 @@
}
}
},
"jest-serializer": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/jest-serializer/-/jest-serializer-25.1.0.tgz",
"integrity": "sha512-20Wkq5j7o84kssBwvyuJ7Xhn7hdPeTXndnwIblKDR2/sy1SUm6rWWiG9kSCgJPIfkDScJCIsTtOKdlzfIHOfKA==",
"dev": true
},
"jest-snapshot": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/jest-snapshot/-/jest-snapshot-25.1.0.tgz",
@@ -16239,6 +16619,70 @@
}
}
},
"jest-util": {
"version": "25.1.0",
"resolved": "https://registry.npmjs.org/jest-util/-/jest-util-25.1.0.tgz",
"integrity": "sha512-7did6pLQ++87Qsj26Fs/TIwZMUFBXQ+4XXSodRNy3luch2DnRXsSnmpVtxxQ0Yd6WTipGpbhh2IFP1mq6/fQGw==",
"dev": true,
"requires": {
"@jest/types": "^25.1.0",
"chalk": "^3.0.0",
"is-ci": "^2.0.0",
"mkdirp": "^0.5.1"
},
"dependencies": {
"ansi-styles": {
"version": "4.2.1",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.2.1.tgz",
"integrity": "sha512-9VGjrMsG1vePxcSweQsN20KY/c4zN0h9fLjqAbwbPfahM3t+NL+M9HC8xeXG2I8pX5NoamTGNuomEUFI7fcUjA==",
"dev": true,
"requires": {
"@types/color-name": "^1.1.1",
"color-convert": "^2.0.1"
}
},
"chalk": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-3.0.0.tgz",
"integrity": "sha512-4D3B6Wf41KOYRFdszmDqMCGq5VV/uMAB273JILmO+3jAlh8X4qDtdtgCR3fxtbLEMzSx22QdhnDcJvu2u1fVwg==",
"dev": true,
"requires": {
"ansi-styles": "^4.1.0",
"supports-color": "^7.1.0"
}
},
"color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
"integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
"dev": true,
"requires": {
"color-name": "~1.1.4"
}
},
"color-name": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
"integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
"dev": true
},
"has-flag": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/has-flag/-/has-flag-4.0.0.tgz",
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
"dev": true
},
"supports-color": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.1.0.tgz",
"integrity": "sha512-oRSIpR8pxT1Wr2FquTNnGet79b3BWljqOuoW/h4oBhxJ/HUbX5nX6JSruTkvXDCFMwDPvsaTTbvMLKZWSy0R5g==",
"dev": true,
"requires": {
"has-flag": "^4.0.0"
}
}
}
},
"jest-validate": {
"version": "23.6.0",
"resolved": "https://registry.npmjs.org/jest-validate/-/jest-validate-23.6.0.tgz",
@@ -18927,6 +19371,15 @@
"reflect.ownkeys": "^0.2.0"
}
},
"prop-types-extra": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/prop-types-extra/-/prop-types-extra-1.1.1.tgz",
"integrity": "sha512-59+AHNnHYCdiC+vMwY52WmvP5dM3QLeoumYuEyceQDi9aEhtwN9zIQ2ZNo25sMyXnbh32h+P1ezDsUpUH3JAew==",
"requires": {
"react-is": "^16.3.2",
"warning": "^4.0.0"
}
},
"proxy-addr": {
"version": "2.0.4",
"resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.4.tgz",
@@ -19157,6 +19610,17 @@
}
}
},
"react-dropzone": {
"version": "9.0.0",
"resolved": "https://registry.npmjs.org/react-dropzone/-/react-dropzone-9.0.0.tgz",
"integrity": "sha512-wZ2o9B2qkdE3RumWhfyZT9swgJYJPeU5qHEcMU8weYpmLex1eeWX0CC32/Y0VutB+BBi2D+iePV/YZIiB4kZGw==",
"requires": {
"attr-accept": "^1.1.3",
"file-selector": "^0.1.8",
"prop-types": "^15.6.2",
"prop-types-extra": "^1.1.0"
}
},
"react-fast-compare": {
"version": "2.0.4",
"resolved": "https://registry.npmjs.org/react-fast-compare/-/react-fast-compare-2.0.4.tgz",
@@ -21817,8 +22281,7 @@
"tslib": {
"version": "1.9.3",
"resolved": "https://registry.npmjs.org/tslib/-/tslib-1.9.3.tgz",
"integrity": "sha512-4krF8scpejhaOgqzBEcGM7yDIEfi0/8+8zDRZhNZZ2kjmHJ4hv3zCbQWxoJGz1iw5U0Jl0nma13xzHXcncMavQ==",
"dev": true
"integrity": "sha512-4krF8scpejhaOgqzBEcGM7yDIEfi0/8+8zDRZhNZZ2kjmHJ4hv3zCbQWxoJGz1iw5U0Jl0nma13xzHXcncMavQ=="
},
"tty-browserify": {
"version": "0.0.0",
@@ -22274,6 +22737,14 @@
"makeerror": "1.0.x"
}
},
"warning": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/warning/-/warning-4.0.3.tgz",
"integrity": "sha512-rpJyN222KWIvHJ/F53XSZv0Zl/accqHR8et1kpaMTD/fLCRxtV8iX8czMzY7sVZupTI3zcUTg8eycS2kNF9l6w==",
"requires": {
"loose-envify": "^1.0.0"
}
},
"watchpack": {
"version": "1.6.0",
"resolved": "https://registry.npmjs.org/watchpack/-/watchpack-1.6.0.tgz",


@@ -58,10 +58,10 @@
},
"dependencies": {
"@lingui/react": "^2.7.2",
"@patternfly/patternfly": "^2.66.0",
"@patternfly/react-core": "^3.140.11",
"@patternfly/react-icons": "^3.15.4",
"@patternfly/react-tokens": "^2.8.4",
"@patternfly/patternfly": "^2.71.3",
"@patternfly/react-core": "^3.153.3",
"@patternfly/react-icons": "^3.15.15",
"@patternfly/react-tokens": "^2.8.12",
"ansi-to-html": "^0.6.11",
"axios": "^0.18.1",
"codemirror": "^5.47.0",


@@ -1,5 +1,9 @@
const SchedulesMixin = parent =>
class extends parent {
createSchedule(id, data) {
return this.http.post(`${this.baseUrl}${id}/schedules/`, data);
}
readSchedules(id, params) {
return this.http.get(`${this.baseUrl}${id}/schedules/`, { params });
}


@@ -5,6 +5,28 @@ class CredentialTypes extends Base {
super(http);
this.baseUrl = '/api/v2/credential_types/';
}
async loadAllTypes(
acceptableKinds = ['machine', 'cloud', 'net', 'ssh', 'vault']
) {
const pageSize = 200;
// The number of credential types a user can have is unlimited. In practice, users are
// unlikely to have more than one page of results at the maximum page size.
const {
data: { next, results },
} = await this.read({ page_size: pageSize });
let nextResults = [];
if (next) {
const { data } = await this.read({
page_size: pageSize,
page: 2,
});
nextResults = data.results;
}
return results
.concat(nextResults)
.filter(type => acceptableKinds.includes(type.kind));
}
}
export default CredentialTypes;
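The fetch-then-filter pagination pattern in `loadAllTypes` above can be sketched in isolation. The mock HTTP client below is illustrative only and stands in for the real AWX HTTP layer:

```javascript
// Minimal sketch of loadAllTypes: fetch the first page, fetch page 2 only if
// the API reports more results, then filter by acceptable credential kinds.
async function loadAllTypes(http, acceptableKinds = ['machine', 'cloud', 'net', 'ssh', 'vault']) {
  const pageSize = 200;
  const {
    data: { next, results },
  } = await http.get('/api/v2/credential_types/', { params: { page_size: pageSize } });
  let nextResults = [];
  if (next) {
    const { data } = await http.get('/api/v2/credential_types/', {
      params: { page_size: pageSize, page: 2 },
    });
    nextResults = data.results;
  }
  return results.concat(nextResults).filter(type => acceptableKinds.includes(type.kind));
}

// Illustrative mock: a single page containing two credential types.
const mockHttp = {
  get: async () => ({
    data: { results: [{ id: 1, kind: 'machine' }, { id: 2, kind: 'vault' }] },
  }),
};

loadAllTypes(mockHttp, ['machine']).then(types => console.log(types.length)); // prints 1
```

Because `next` is absent from the mock response, only one request is made; with a truthy `next` the sketch issues exactly one follow-up request for page 2, matching the comment in the class that more than two pages is not expected in practice.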


@@ -0,0 +1,65 @@
import CredentialTypes from './CredentialTypes';
const typesData = [{ id: 1, kind: 'machine' }, { id: 2, kind: 'cloud' }];
describe('CredentialTypesAPI', () => {
test('should load all types', async () => {
const getPromise = () =>
Promise.resolve({
data: {
results: typesData,
},
});
const mockHttp = { get: jest.fn(getPromise) };
const CredentialTypesAPI = new CredentialTypes(mockHttp);
const types = await CredentialTypesAPI.loadAllTypes();
expect(mockHttp.get).toHaveBeenCalledTimes(1);
expect(mockHttp.get.mock.calls[0]).toEqual([
`/api/v2/credential_types/`,
{ params: { page_size: 200 } },
]);
expect(types).toEqual(typesData);
});
test('should load all types (2 pages)', async () => {
const getPromise = () =>
Promise.resolve({
data: {
results: typesData,
next: 2,
},
});
const mockHttp = { get: jest.fn(getPromise) };
const CredentialTypesAPI = new CredentialTypes(mockHttp);
const types = await CredentialTypesAPI.loadAllTypes();
expect(mockHttp.get).toHaveBeenCalledTimes(2);
expect(mockHttp.get.mock.calls[0]).toEqual([
`/api/v2/credential_types/`,
{ params: { page_size: 200 } },
]);
expect(mockHttp.get.mock.calls[1]).toEqual([
`/api/v2/credential_types/`,
{ params: { page_size: 200, page: 2 } },
]);
expect(types).toHaveLength(4);
});
test('should filter by acceptable kinds', async () => {
const getPromise = () =>
Promise.resolve({
data: {
results: typesData,
},
});
const mockHttp = { get: jest.fn(getPromise) };
const CredentialTypesAPI = new CredentialTypes(mockHttp);
const types = await CredentialTypesAPI.loadAllTypes(['machine']);
expect(types).toEqual([typesData[0]]);
});
});

View File

@@ -4,11 +4,36 @@ class Hosts extends Base {
constructor(http) {
super(http);
this.baseUrl = '/api/v2/hosts/';
this.readFacts = this.readFacts.bind(this);
this.readAllGroups = this.readAllGroups.bind(this);
this.readGroupsOptions = this.readGroupsOptions.bind(this);
this.associateGroup = this.associateGroup.bind(this);
this.disassociateGroup = this.disassociateGroup.bind(this);
}
readFacts(id) {
return this.http.get(`${this.baseUrl}${id}/ansible_facts/`);
}
readAllGroups(id, params) {
return this.http.get(`${this.baseUrl}${id}/all_groups/`, { params });
}
readGroupsOptions(id) {
return this.http.options(`${this.baseUrl}${id}/groups/`);
}
associateGroup(id, groupId) {
return this.http.post(`${this.baseUrl}${id}/groups/`, { id: groupId });
}
disassociateGroup(id, group) {
return this.http.post(`${this.baseUrl}${id}/groups/`, {
id: group.id,
disassociate: true,
});
}
}
export default Hosts;


@@ -75,7 +75,7 @@ class JobTemplates extends SchedulesMixin(
return this.http.get(`${this.baseUrl}${id}/survey_spec/`);
}
updateSurvey(id, survey = null) {
updateSurvey(id, survey) {
return this.http.post(`${this.baseUrl}${id}/survey_spec/`, survey);
}


@@ -13,6 +13,10 @@ class Schedules extends Base {
readCredentials(resourceId, params) {
return this.http.get(`${this.baseUrl}${resourceId}/credentials/`, params);
}
readZoneInfo() {
return this.http.get(`${this.baseUrl}zoneinfo/`);
}
}
export default Schedules;


@@ -49,7 +49,21 @@ class WorkflowJobTemplates extends SchedulesMixin(NotificationsMixin(Base)) {
}
readAccessList(id, params) {
return this.http.get(`${this.baseUrl}${id}/access_list/`, { params });
return this.http.get(`${this.baseUrl}${id}/access_list/`, {
params,
});
}
readSurvey(id) {
return this.http.get(`${this.baseUrl}${id}/survey_spec/`);
}
updateSurvey(id, survey) {
return this.http.post(`${this.baseUrl}${id}/survey_spec/`, survey);
}
destroySurvey(id) {
return this.http.delete(`${this.baseUrl}${id}/survey_spec/`);
}
}


@@ -25,7 +25,16 @@ class AnsibleSelect extends React.Component {
}
render() {
const { id, data, i18n, isValid, onBlur, value, className } = this.props;
const {
id,
data,
i18n,
isValid,
onBlur,
value,
className,
isDisabled,
} = this.props;
return (
<FormSelect
@@ -36,6 +45,7 @@ class AnsibleSelect extends React.Component {
aria-label={i18n._(t`Select Input`)}
isValid={isValid}
className={className}
isDisabled={isDisabled}
>
{data.map(option => (
<FormSelectOption
@@ -62,6 +72,7 @@ AnsibleSelect.defaultProps = {
isValid: true,
onBlur: () => {},
className: '',
isDisabled: false,
};
AnsibleSelect.propTypes = {
@@ -72,6 +83,7 @@ AnsibleSelect.propTypes = {
onChange: func.isRequired,
value: oneOfType([string, number]).isRequired,
className: string,
isDisabled: bool,
};
export { AnsibleSelect as _AnsibleSelect };


@@ -3,7 +3,7 @@ import { useHistory } from 'react-router-dom';
import { withI18n } from '@lingui/react';
import { t } from '@lingui/macro';
import { Button, Modal } from '@patternfly/react-core';
import OptionsList from '@components/Lookup/shared/OptionsList';
import OptionsList from '@components/OptionsList';
import useRequest from '@util/useRequest';
import { getQSConfig, parseQueryString } from '@util/qs';
import useSelected from '@util/useSelected';


@@ -3,7 +3,7 @@ import { act } from 'react-dom/test-utils';
import { mountWithContexts, waitForElement } from '@testUtils/enzymeHelpers';
import AssociateModal from './AssociateModal';
import mockHosts from '../shared/data.hosts.json';
import mockHosts from './data.hosts.json';
jest.mock('@api');


@@ -0,0 +1,393 @@
{
"count": 3,
"results": [
{
"id": 2,
"type": "host",
"url": "/api/v2/hosts/2/",
"related": {
"created_by": "/api/v2/users/10/",
"modified_by": "/api/v2/users/19/",
"variable_data": "/api/v2/hosts/2/variable_data/",
"groups": "/api/v2/hosts/2/groups/",
"all_groups": "/api/v2/hosts/2/all_groups/",
"job_events": "/api/v2/hosts/2/job_events/",
"job_host_summaries": "/api/v2/hosts/2/job_host_summaries/",
"activity_stream": "/api/v2/hosts/2/activity_stream/",
"inventory_sources": "/api/v2/hosts/2/inventory_sources/",
"smart_inventories": "/api/v2/hosts/2/smart_inventories/",
"ad_hoc_commands": "/api/v2/hosts/2/ad_hoc_commands/",
"ad_hoc_command_events": "/api/v2/hosts/2/ad_hoc_command_events/",
"insights": "/api/v2/hosts/2/insights/",
"ansible_facts": "/api/v2/hosts/2/ansible_facts/",
"inventory": "/api/v2/inventories/2/",
"last_job": "/api/v2/jobs/236/",
"last_job_host_summary": "/api/v2/job_host_summaries/2202/"
},
"summary_fields": {
"inventory": {
"id": 2,
"name": " Inventory 1 Org 0",
"description": "",
"has_active_failures": false,
"total_hosts": 33,
"hosts_with_active_failures": 0,
"total_groups": 4,
"has_inventory_sources": false,
"total_inventory_sources": 0,
"inventory_sources_with_failures": 0,
"organization_id": 2,
"kind": ""
},
"last_job": {
"id": 236,
"name": " Job Template 1 Project 0",
"description": "",
"finished": "2020-02-26T03:15:21.471439Z",
"status": "successful",
"failed": false,
"job_template_id": 18,
"job_template_name": " Job Template 1 Project 0"
},
"last_job_host_summary": {
"id": 2202,
"failed": false
},
"created_by": {
"id": 10,
"username": "user-3",
"first_name": "",
"last_name": ""
},
"modified_by": {
"id": 19,
"username": "all",
"first_name": "",
"last_name": ""
},
"user_capabilities": {
"edit": true,
"delete": true
},
"groups": {
"count": 2,
"results": [
{
"id": 1,
"name": " Group 1 Inventory 0"
},
{
"id": 2,
"name": " Group 2 Inventory 0"
}
]
},
"recent_jobs": [
{
"id": 236,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-26T03:15:21.471439Z"
},
{
"id": 232,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T21:20:33.593789Z"
},
{
"id": 229,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T16:19:46.364134Z"
},
{
"id": 228,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T16:18:54.138363Z"
},
{
"id": 225,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T15:55:32.247652Z"
}
]
},
"created": "2020-02-24T15:10:58.922179Z",
"modified": "2020-02-26T21:52:43.428530Z",
"name": ".host-000001.group-00000.dummy",
"description": "",
"inventory": 2,
"enabled": false,
"instance_id": "",
"variables": "",
"has_active_failures": false,
"has_inventory_sources": false,
"last_job": 236,
"last_job_host_summary": 2202,
"insights_system_id": null,
"ansible_facts_modified": null
},
{
"id": 3,
"type": "host",
"url": "/api/v2/hosts/3/",
"related": {
"created_by": "/api/v2/users/11/",
"modified_by": "/api/v2/users/1/",
"variable_data": "/api/v2/hosts/3/variable_data/",
"groups": "/api/v2/hosts/3/groups/",
"all_groups": "/api/v2/hosts/3/all_groups/",
"job_events": "/api/v2/hosts/3/job_events/",
"job_host_summaries": "/api/v2/hosts/3/job_host_summaries/",
"activity_stream": "/api/v2/hosts/3/activity_stream/",
"inventory_sources": "/api/v2/hosts/3/inventory_sources/",
"smart_inventories": "/api/v2/hosts/3/smart_inventories/",
"ad_hoc_commands": "/api/v2/hosts/3/ad_hoc_commands/",
"ad_hoc_command_events": "/api/v2/hosts/3/ad_hoc_command_events/",
"insights": "/api/v2/hosts/3/insights/",
"ansible_facts": "/api/v2/hosts/3/ansible_facts/",
"inventory": "/api/v2/inventories/2/",
"last_job": "/api/v2/jobs/236/",
"last_job_host_summary": "/api/v2/job_host_summaries/2195/"
},
"summary_fields": {
"inventory": {
"id": 2,
"name": " Inventory 1 Org 0",
"description": "",
"has_active_failures": false,
"total_hosts": 33,
"hosts_with_active_failures": 0,
"total_groups": 4,
"has_inventory_sources": false,
"total_inventory_sources": 0,
"inventory_sources_with_failures": 0,
"organization_id": 2,
"kind": ""
},
"last_job": {
"id": 236,
"name": " Job Template 1 Project 0",
"description": "",
"finished": "2020-02-26T03:15:21.471439Z",
"status": "successful",
"failed": false,
"job_template_id": 18,
"job_template_name": " Job Template 1 Project 0"
},
"last_job_host_summary": {
"id": 2195,
"failed": false
},
"created_by": {
"id": 11,
"username": "user-4",
"first_name": "",
"last_name": ""
},
"modified_by": {
"id": 1,
"username": "admin",
"first_name": "",
"last_name": ""
},
"user_capabilities": {
"edit": true,
"delete": true
},
"groups": {
"count": 2,
"results": [
{
"id": 1,
"name": " Group 1 Inventory 0"
},
{
"id": 2,
"name": " Group 2 Inventory 0"
}
]
},
"recent_jobs": [
{
"id": 236,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-26T03:15:21.471439Z"
},
{
"id": 232,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T21:20:33.593789Z"
},
{
"id": 229,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T16:19:46.364134Z"
},
{
"id": 228,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T16:18:54.138363Z"
},
{
"id": 225,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T15:55:32.247652Z"
}
]
},
"created": "2020-02-24T15:10:58.945113Z",
"modified": "2020-02-27T03:43:43.635871Z",
"name": ".host-000002.group-00000.dummy",
"description": "",
"inventory": 2,
"enabled": false,
"instance_id": "",
"variables": "",
"has_active_failures": false,
"has_inventory_sources": false,
"last_job": 236,
"last_job_host_summary": 2195,
"insights_system_id": null,
"ansible_facts_modified": null
},
{
"id": 4,
"type": "host",
"url": "/api/v2/hosts/4/",
"related": {
"created_by": "/api/v2/users/12/",
"modified_by": "/api/v2/users/1/",
"variable_data": "/api/v2/hosts/4/variable_data/",
"groups": "/api/v2/hosts/4/groups/",
"all_groups": "/api/v2/hosts/4/all_groups/",
"job_events": "/api/v2/hosts/4/job_events/",
"job_host_summaries": "/api/v2/hosts/4/job_host_summaries/",
"activity_stream": "/api/v2/hosts/4/activity_stream/",
"inventory_sources": "/api/v2/hosts/4/inventory_sources/",
"smart_inventories": "/api/v2/hosts/4/smart_inventories/",
"ad_hoc_commands": "/api/v2/hosts/4/ad_hoc_commands/",
"ad_hoc_command_events": "/api/v2/hosts/4/ad_hoc_command_events/",
"insights": "/api/v2/hosts/4/insights/",
"ansible_facts": "/api/v2/hosts/4/ansible_facts/",
"inventory": "/api/v2/inventories/2/",
"last_job": "/api/v2/jobs/236/",
"last_job_host_summary": "/api/v2/job_host_summaries/2192/"
},
"summary_fields": {
"inventory": {
"id": 2,
"name": " Inventory 1 Org 0",
"description": "",
"has_active_failures": false,
"total_hosts": 33,
"hosts_with_active_failures": 0,
"total_groups": 4,
"has_inventory_sources": false,
"total_inventory_sources": 0,
"inventory_sources_with_failures": 0,
"organization_id": 2,
"kind": ""
},
"last_job": {
"id": 236,
"name": " Job Template 1 Project 0",
"description": "",
"finished": "2020-02-26T03:15:21.471439Z",
"status": "successful",
"failed": false,
"job_template_id": 18,
"job_template_name": " Job Template 1 Project 0"
},
"last_job_host_summary": {
"id": 2192,
"failed": false
},
"created_by": {
"id": 12,
"username": "user-5",
"first_name": "",
"last_name": ""
},
"modified_by": {
"id": 1,
"username": "admin",
"first_name": "",
"last_name": ""
},
"user_capabilities": {
"edit": true,
"delete": true
},
"groups": {
"count": 2,
"results": [
{
"id": 1,
"name": " Group 1 Inventory 0"
},
{
"id": 2,
"name": " Group 2 Inventory 0"
}
]
},
"recent_jobs": [
{
"id": 236,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-26T03:15:21.471439Z"
},
{
"id": 232,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T21:20:33.593789Z"
},
{
"id": 229,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T16:19:46.364134Z"
},
{
"id": 228,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T16:18:54.138363Z"
},
{
"id": 225,
"name": " Job Template 1 Project 0",
"status": "successful",
"finished": "2020-02-25T15:55:32.247652Z"
}
]
},
"created": "2020-02-24T15:10:58.962312Z",
"modified": "2020-02-27T03:43:45.528882Z",
"name": ".host-000003.group-00000.dummy",
"description": "",
"inventory": 2,
"enabled": false,
"instance_id": "",
"variables": "",
"has_active_failures": false,
"has_inventory_sources": false,
"last_job": 236,
"last_job_host_summary": 2192,
"insights_system_id": null,
"ansible_facts_modified": null
}
]
}


@@ -0,0 +1 @@
export { default } from './AssociateModal';


@@ -7,7 +7,8 @@ import {
BreadcrumbItem,
BreadcrumbHeading,
} from '@patternfly/react-core';
import { Link, Route, withRouter } from 'react-router-dom';
import { Link, Route, useRouteMatch } from 'react-router-dom';
import styled from 'styled-components';
const PageSection = styled(PFPageSection)`
@@ -21,18 +22,16 @@ const Breadcrumbs = ({ breadcrumbConfig }) => {
return (
<PageSection variant={light}>
<Breadcrumb>
<Route
path="/:path"
render={props => (
<Crumb breadcrumbConfig={breadcrumbConfig} {...props} />
)}
/>
<Route path="/:path">
<Crumb breadcrumbConfig={breadcrumbConfig} />
</Route>
</Breadcrumb>
</PageSection>
);
};
const Crumb = ({ breadcrumbConfig, match }) => {
const Crumb = ({ breadcrumbConfig }) => {
const match = useRouteMatch();
const crumb = breadcrumbConfig[match.url];
let crumbElement = (
@@ -54,12 +53,9 @@ const Crumb = ({ breadcrumbConfig, match }) => {
return (
<Fragment>
{crumbElement}
<Route
path={`${match.url}/:path`}
render={props => (
<Crumb breadcrumbConfig={breadcrumbConfig} {...props} />
)}
/>
<Route path={`${match.url}/:path`}>
<Crumb breadcrumbConfig={breadcrumbConfig} />
</Route>
</Fragment>
);
};
@@ -72,4 +68,4 @@ Crumb.propTypes = {
breadcrumbConfig: PropTypes.objectOf(PropTypes.string).isRequired,
};
export default withRouter(Breadcrumbs);
export default Breadcrumbs;


@@ -0,0 +1 @@
export { default } from './DisassociateButton';


@@ -9,6 +9,7 @@ import {
CardBody as PFCardBody,
Expandable as PFExpandable,
} from '@patternfly/react-core';
import getErrorMessage from './getErrorMessage';
const Card = styled(PFCard)`
background-color: var(--pf-global--BackgroundColor--200);
@@ -52,14 +53,7 @@ class ErrorDetail extends Component {
renderNetworkError() {
const { error } = this.props;
const { response } = error;
let message = '';
if (response?.data) {
message =
typeof response.data === 'string'
? response.data
: response.data?.detail;
}
const message = getErrorMessage(response);
return (
<Fragment>
@@ -67,7 +61,17 @@ class ErrorDetail extends Component {
{response?.config?.method.toUpperCase()} {response?.config?.url}{' '}
<strong>{response?.status}</strong>
</CardBody>
<CardBody>{message}</CardBody>
<CardBody>
{Array.isArray(message) ? (
<ul>
{message.map(m => (
<li key={m}>{m}</li>
))}
</ul>
) : (
message
)}
</CardBody>
</Fragment>
);
}


@@ -21,4 +21,25 @@ describe('ErrorDetail', () => {
);
expect(wrapper).toHaveLength(1);
});
test('testing errors', () => {
const wrapper = mountWithContexts(
<ErrorDetail
error={
new Error({
response: {
config: {
method: 'patch',
},
data: {
project: ['project error'],
inventory: ['inventory error'],
},
},
})
}
/>
);
wrapper.find('Expandable').prop('onToggle')();
wrapper.update();
});
});


@@ -0,0 +1,15 @@
export default function getErrorMessage(response) {
if (!response.data) {
return null;
}
if (typeof response.data === 'string') {
return response.data;
}
if (response.data.detail) {
return response.data.detail;
}
return Object.values(response.data).reduce(
(acc, currentValue) => acc.concat(currentValue),
[]
);
}


@@ -0,0 +1,60 @@
import getErrorMessage from './getErrorMessage';
describe('getErrorMessage', () => {
test('should return data string', () => {
const response = {
data: 'error response',
};
expect(getErrorMessage(response)).toEqual('error response');
});
test('should return detail string', () => {
const response = {
data: {
detail: 'detail string',
},
};
expect(getErrorMessage(response)).toEqual('detail string');
});
test('should return an array of strings', () => {
const response = {
data: {
project: ['project error response'],
},
};
expect(getErrorMessage(response)).toEqual(['project error response']);
});
test('should consolidate error messages from multiple keys into an array', () => {
const response = {
data: {
project: ['project error response'],
inventory: ['inventory error response'],
organization: ['org error response'],
},
};
expect(getErrorMessage(response)).toEqual([
'project error response',
'inventory error response',
'org error response',
]);
});
test('should handle no response.data', () => {
const response = {};
expect(getErrorMessage(response)).toEqual(null);
});
test('should consolidate multiple error messages from multiple keys into an array', () => {
const response = {
data: {
project: ['project error response'],
inventory: [
'inventory error response',
'another inventory error response',
],
},
};
expect(getErrorMessage(response)).toEqual([
'project error response',
'inventory error response',
'another inventory error response',
]);
});
});

Some files were not shown because too many files have changed in this diff.