Logreduce

Count  Filename
(each file was compared against its counterpart in the baseline builds; for job-output.* the baselines were f30f641, 40d2f9e, 90a3ee0, fa1049d and ce6008e)

    3  logs/quickstart_install.txt.gz
    1  logs/subnode-2/var/log/pacemaker/bundles/rabbitmq-bundle-0/pacemaker.log.txt.gz
    1  logs/subnode-2/var/log/audit/audit.log.1.gz
    9  job-output.txt.gz
    1  logs/subnode-2/var/log/extra/docker/docker_allinfo.log.txt.gz
    2  logs/undercloud/var/log/mistral/engine.log.txt.gz
    2  logs/undercloud/var/log/mistral/executor.log.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/nova_scheduler/log/nova/nova-compute.log.txt.gz
    1  logs/subnode-2/var/log/containers/nova/nova-compute.log.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/nova_metadata/log/nova/nova-compute.log.txt.gz
    2  logs/subnode-2/var/log/containers/neutron/neutron-openvswitch-agent.log.txt.gz
    9  job-output.json.gz
    1  logs/undercloud/home/zuul/skip_file.gz
    3  logs/subnode-2/var/log/secure.txt.gz
    1  logs/subnode-2/var/log/openvswitch/ovs-vswitchd.log.txt.gz
    1  logs/subnode-2/openvswitch/ovs-vswitchd.txt.gz
   56  logs/subnode-2/var/log/journal.txt.gz
   42  logs/subnode-2/var/log/messages.txt.gz
    2  logs/syslog.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/cinder_scheduler/docker_info.log.txt.gz
    1  logs/subnode-2/etc/puppet/hieradata/service_names.json.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/galera-bundle-docker-0/docker_info.log.txt.gz
    1  logs/delorean_logs/a7/7d/a77d26728da9220ffdb5b95f75ac5cd6cb56b0b0_dev/root.log.txt.gz
    1  logs/delorean_logs/a7/7d/a77d26728da9220ffdb5b95f75ac5cd6cb56b0b0_dev/rpmbuild.log.txt.gz
    4  logs/subnode-2/pip2-freeze.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/nova_virtlogd/stdout.log.txt.gz
    1  logs/subnode-2/var/log/config-data/sensu/etc/sensu/conf.d/client.json.txt.gz
    2  logs/subnode-2/var/log/extra/pip.txt.gz
   16  logs/subnode-2/var/log/cluster/corosync.log.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/nova_compute/stdout.log.txt.gz
    2  logs/undercloud/var/log/secure.txt.gz
    1  logs/undercloud/home/zuul/tempest-setup.sh.txt.gz
    1  logs/subnode-2/var/log/config-data/neutron/etc/neutron/plugins/ml2/ml2_conf.ini.txt.gz
    2  logs/subnode-2/var/log/extra/docker/containers/neutron_ovs_agent/docker_info.log.txt.gz
    2  logs/subnode-2/var/log/extra/rpm-list.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/aodh_api/log/httpd/aodh_wsgi_access.log.txt.gz
    1  logs/subnode-2/var/log/containers/httpd/aodh-api/aodh_wsgi_access.log.txt.gz
    5  logs/undercloud/var/log/journal.txt.gz
    1  logs/subnode-2/var/log/dmesg.txt.gz
    1  logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz
    5  logs/subnode-2/rpm-qa.txt.gz
    2  logs/subnode-2/var/log/extra/yum-list-installed.txt.gz
    1  logs/undercloud/var/log/neutron/dhcp-agent.log.txt.gz
    1  logs/undercloud/var/log/neutron/openvswitch-agent.log.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/neutron_l3_agent/docker_info.log.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/neutron_metadata_agent/docker_info.log.txt.gz
    2  logs/subnode-2/syslog.txt.gz
    5  logs/undercloud/var/log/messages.txt.gz
    1  logs/subnode-2/var/log/extra/docker/containers/neutron_dhcp/docker_info.log.txt.gz
    3  logs/undercloud/var/log/heat/heat-engine.log.txt.gz
    1  logs/log-size.txt.gz
    1  logs/subnode-2/var/log/pacemaker/bundles/rabbitmq-bundle-0/rabbitmq/rabbit@centos-7-rax-iad-0000787869.log.txt.gz
    1  logs/devstack-gate-cleanup-host.txt
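The per-file excerpts below use one scored line per anomaly, in the form `score | line-number: content`, where the score is the anomaly distance (0.000 is a perfect baseline match, 1.000 has no close baseline line). A minimal sketch of filtering such a report by score, assuming exactly this plain-text layout (the regex and threshold are illustrative, not part of the logreduce tool itself):

```python
import re

# "<score> | <line-no>: <content>" -- content may be empty for blank source lines
LINE_RE = re.compile(r"^(?P<score>[01]\.\d{3}) \| (?P<lineno>\d+):(?: (?P<content>.*))?$")

def parse_scored_line(line):
    """Return (score, lineno, content), or None for filename headers and blanks."""
    m = LINE_RE.match(line)
    if m is None:
        return None
    return float(m.group("score")), int(m.group("lineno")), m.group("content") or ""

def anomalies(lines, threshold=0.5):
    """Yield only the parsed lines whose anomaly score reaches the threshold."""
    for line in lines:
        parsed = parse_scored_line(line)
        if parsed and parsed[0] >= threshold:
            yield parsed
```

With a threshold of 0.5, this keeps lines such as the `fatal: [undercloud]: FAILED!` entries while dropping the zero-scored context lines around them.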

logs/quickstart_install.txt.gz
0.000 | 3019: TASK [validate-tempest : Tempest failed if rc code is not 0] *******************
0.000 | 3020: task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-status.yml:2
0.000 | 3021: Wednesday 08 November 2017 21:24:55 +0000 (0:00:00.678) 1:49:36.171 ****
0.211 | 3022: ok: [undercloud] => {"ansible_facts": {"tempest_status": "failed"}, "changed": false}
0.000 | 3023:

0.000 | 3035: task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-status.yml:16
0.000 | 3036: Wednesday 08 November 2017 21:24:55 +0000 (0:00:00.107) 1:49:36.696 ****
0.000 | 3037: ok: [undercloud] => {
0.270 | 3038: "tempest_status": "failed"
0.000 | 3039: }

0.000 | 3083: TASK [validate-tempest : Exit with tempest result code if configured] **********
0.000 | 3084: task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-results.yml:60
0.000 | 3085: Wednesday 08 November 2017 21:25:06 +0000 (0:00:02.655) 1:49:47.506 ****
0.693 | 3086: fatal: [undercloud]: FAILED! => {"changed": true, "cmd": "tail -10 tempest_output.log; exit 1", "delta": "0:00:00.006487", "end": "2017-11-08 21:25:06.982109", "failed": true, "rc": 1, "start": "2017-11-08 21:25:06.975622", "stderr": "", "stdout": "2017-11-08 21:24:31 | ==============
0.693 | 3086: 2017-11-08 21:24:31 | - Worker 0 (2 tests) => 0:10:29.829658
0.693 | 3086: 2017-11-08 21:24:31 | - Worker 1 (1 tests) => 0:06:13.937103
0.693 | 3086: 2017-11-08 21:24:31 |
0.693 | 3086: 2017-11-08 21:24:31 | Slowest Tests:
0.693 | 3086: 2017-11-08 21:24:31 | Test id Runtime (s)
0.693 | 3086: 2017-11-08 21:24:31 | ------------------------------------------------------------------------------------------------------------------------------------------------------ -----------
0.693 | 3086: 2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_autoscaling 614.010
0.693 | 3086: 2017-11-08 21:24:31 | tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,volume] 373.937
0.693 | 3086: 2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_aodh_gnocchi_threshold_alarm 15.819", ], "warnings": []}
0.000 | 3087:

logs/subnode-2/var/log/pacemaker/bundles/rabbitmq-bundle-0/pacemaker.log.txt.gz
0.000 | 0616: Nov 08 21:13:35 [13] centos-7-rax-iad-0000787869 pacemaker_remoted: info: check_sbd_timeout: Watchdog functionality is consistent: (null) delay exceeds timeout of -1ms
0.000 | 0617: Nov 08 21:13:36 [13] centos-7-rax-iad-0000787869 pacemaker_remoted: notice: check_sbd_timeout: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)
0.000 | 0618: Nov 08 21:13:36 [13] centos-7-rax-iad-0000787869 pacemaker_remoted: info: check_sbd_timeout: Watchdog functionality is consistent: (null) delay exceeds timeout of -1ms
0.690 | 0619: Nov 08 21:13:36 [13] centos-7-rax-iad-0000787869 pacemaker_remoted: warning: qb_ipcs_event_sendv: new_event_notification (13-32500-17): Broken pipe (32)
0.640 | 0620: Nov 08 21:13:36 [13] centos-7-rax-iad-0000787869 pacemaker_remoted: warning: send_client_notify: Notification of client proxy-cib_rw-32500-23c89608/23c89608-4ae0-4282-889e-ee895be4abc5 failed
0.000 | 0621: Nov 08 21:13:51 [13] centos-7-rax-iad-0000787869 pacemaker_remoted: notice: check_sbd_timeout: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)

logs/subnode-2/var/log/audit/audit.log.1.gz
0.000 | 27416: type=ADD_GROUP msg=audit(1510174482.002:26725): pid=67502 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:groupadd_t:s0 msg='op=add-group id=165 exe="/usr/sbin/groupadd" hostname=? addr=? terminal=? res=success'
0.000 | 27417: type=GRP_MGMT msg=audit(1510174482.006:26726): pid=67502 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:groupadd_t:s0 msg='op=add-shadow-group id=165 exe="/usr/sbin/groupadd" hostname=? addr=? terminal=? res=success'
0.000 | 27418: type=ADD_USER msg=audit(1510174482.022:26727): pid=67507 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:useradd_t:s0 msg='op=add-user id=165 exe="/usr/sbin/useradd" hostname=? addr=? terminal=? res=success'
0.404 | 27419: type=USER_MGMT msg=audit(1510174482.023:26728): pid=67507 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:useradd_t:s0 msg='op=add-user-to-group grp="nobody" acct="cinder" exe="/usr/sbin/useradd" hostname=? addr=? terminal=? res=success'
0.450 | 27420: type=USER_MGMT msg=audit(1510174482.023:26729): pid=67507 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:useradd_t:s0 msg='op=add-user-to-group grp="cinder" acct="cinder" exe="/usr/sbin/useradd" hostname=? addr=? terminal=? res=success'
0.441 | 27421: type=USER_MGMT msg=audit(1510174482.023:26730): pid=67507 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:useradd_t:s0 msg='op=add-to-shadow-group grp="nobody" acct="cinder" exe="/usr/sbin/useradd" hostname=? addr=? terminal=? res=success'
0.473 | 27422: type=USER_MGMT msg=audit(1510174482.023:26731): pid=67507 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:useradd_t:s0 msg='op=add-to-shadow-group grp="cinder" acct="cinder" exe="/usr/sbin/useradd" hostname=? addr=? terminal=? res=success'
0.000 | 27423: type=USER_AUTH msg=audit(1510174486.985:26732): pid=67619 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:spc_t:s0 msg='op=PAM:authentication grantors=pam_rootok acct="rabbitmq" exe="/usr/bin/su" hostname=? addr=? terminal=? res=success'

job-output.txt.gz
0.000 | 9701: 2017-11-08 21:24:55.056701 | primary | TASK [validate-tempest : Tempest failed if rc code is not 0] *******************
0.000 | 9702: 2017-11-08 21:24:55.056770 | primary | task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-status.yml:2
0.000 | 9703: 2017-11-08 21:24:55.092181 | primary | Wednesday 08 November 2017 21:24:55 +0000 (0:00:00.678) 1:49:36.171 ****
0.228 | 9704: 2017-11-08 21:24:55.154558 | primary | ok: [undercloud] => {"ansible_facts": {"tempest_status": "failed"}, "changed": false}
0.000 | 9705: 2017-11-08 21:24:55.184038 | primary |

0.000 | 9717: 2017-11-08 21:24:55.591499 | primary | task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-status.yml:16
0.000 | 9718: 2017-11-08 21:24:55.617123 | primary | Wednesday 08 November 2017 21:24:55 +0000 (0:00:00.107) 1:49:36.696 ****
0.000 | 9719: 2017-11-08 21:24:55.675629 | primary | ok: [undercloud] => {
0.315 | 9720: 2017-11-08 21:24:55.675727 | primary | "tempest_status": "failed"
0.000 | 9721: 2017-11-08 21:24:55.675755 | primary | }

0.000 | 9765: 2017-11-08 21:25:06.404556 | primary | TASK [validate-tempest : Exit with tempest result code if configured] **********
0.000 | 9766: 2017-11-08 21:25:06.404632 | primary | task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-results.yml:60
0.000 | 9767: 2017-11-08 21:25:06.427296 | primary | Wednesday 08 November 2017 21:25:06 +0000 (0:00:02.655) 1:49:47.506 ****
0.662 | 9768: 2017-11-08 21:25:07.006472 | primary | fatal: [undercloud]: FAILED! => {"changed": true, "cmd": "tail -10 tempest_output.log; exit 1", "delta": "0:00:00.006487", "end": "2017-11-08 21:25:06.982109", "failed": true, "rc": 1, "start": "2017-11-08 21:25:06.975622", "stderr": "", "stdout": "2017-11-08 21:24:31 | ==============
0.662 | 9768: 2017-11-08 21:24:31 | - Worker 0 (2 tests) => 0:10:29.829658
0.662 | 9768: 2017-11-08 21:24:31 | - Worker 1 (1 tests) => 0:06:13.937103
0.662 | 9768: 2017-11-08 21:24:31 |
0.662 | 9768: 2017-11-08 21:24:31 | Slowest Tests:
0.662 | 9768: 2017-11-08 21:24:31 | Test id Runtime (s)
0.662 | 9768: 2017-11-08 21:24:31 | ------------------------------------------------------------------------------------------------------------------------------------------------------ -----------
0.662 | 9768: 2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_autoscaling 614.010
0.662 | 9768: 2017-11-08 21:24:31 | tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,volume] 373.937
0.662 | 9768: 2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_aodh_gnocchi_threshold_alarm 15.819", ], "warnings": []}
0.000 | 9769: 2017-11-08 21:25:07.028920 | primary |

0.000 | 9817: 2017-11-08 21:25:07.127128 | primary | /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/install-built-repo/tasks/install_built_repo.yml:1
0.000 | 9818: 2017-11-08 21:25:07.363433 | primary | +(./toci_quickstart.sh:72): exit_value=2
0.000 | 9819: 2017-11-08 21:25:07.364478 | primary | +(./toci_quickstart.sh:74): [[ 2 == 0 ]]
0.293 | 9820: 2017-11-08 21:25:07.364590 | primary | +(./toci_quickstart.sh:74): echo 'Playbook run failed'
0.441 | 9821: 2017-11-08 21:25:07.364623 | primary | Playbook run failed
0.000 | 9822: 2017-11-08 21:25:07.364699 | primary | +(./toci_quickstart.sh:78): sed -i 's/hosts: all:!localhost/hosts: all:!localhost:!127.0.0.2/' /home/zuul/workspace/.quickstart/playbooks/collect-logs.yml

0.000 | 9867: 2017-11-08 21:32:19.486095 | primary | +(./toci_quickstart.sh:115): echo 'Quickstart completed.'
0.000 | 9868: 2017-11-08 21:32:19.486126 | primary | Quickstart completed.
0.000 | 9869: 2017-11-08 21:32:19.486160 | primary | +(./toci_quickstart.sh:116): exit 2
0.577 | 9870: 2017-11-08 21:32:19.505871 | primary | ERROR: the main setup script run by this job failed - exit code: 2
0.584 | 9871: 2017-11-08 21:32:19.506688 | primary | please look at the relevant log files to determine the root cause
0.353 | 9872: 2017-11-08 21:32:19.506765 | primary | Running devstack worlddump.py
0.000 | 9873: 2017-11-08 21:32:25.275933 | primary | Cleaning up host

0.097 | 9900: 2017-11-08 21:32:46.147326 | primary | ">f.st...... etc/cinder/cinder.conf.txt.gz",
0.138 | 9901: 2017-11-08 21:32:46.147365 | primary | ">f.st...... etc/cinder/policy.json.txt.gz",
0.156 | 9902: 2017-11-08 21:32:46.147405 | primary | ">f.st...... etc/cinder/rootwrap.conf.txt.gz",
0.218 | 9903: 2017-11-08 21:32:46.147452 | primary | ".d..t...... etc/cinder/rootwrap.d/",
0.157 | 9904: 2017-11-08 21:32:46.147496 | primary | ">f+++++++++ etc/cinder/rootwrap.d/os-brick.filters",
0.148 | 9905: 2017-11-08 21:32:46.147531 | primary | "cd+++++++++ etc/cinder/volumes/",
0.435 | 9906: 2017-11-08 21:32:46.147564 | primary | "cd+++++++++ glusterfs/",
0.000 | 9907: 2017-11-08 21:32:46.147595 | primary | "cd+++++++++ libvirt/",

0.000 | 9915: 2017-11-08 21:32:46.147895 | primary | "cd+++++++++ sudoers.d/",
0.000 | 9916: 2017-11-08 21:32:46.147932 | primary | ">f+++++++++ sudoers.d/50_stack_sh.txt.gz",
0.000 | 9917: 2017-11-08 21:32:46.147971 | primary | ">f+++++++++ sudoers.d/51_tempest_sh.txt.gz",
0.204 | 9918: 2017-11-08 21:32:46.148007 | primary | ">f+++++++++ sudoers.d/cinder.txt.gz",
0.000 | 9919: 2017-11-08 21:32:46.148047 | primary | ">f+++++++++ sudoers.d/jenkins-sudo-grep.txt.gz",

0.000 | 9925: 2017-11-08 21:32:46.148273 | primary | }
0.000 | 9926: 2017-11-08 21:32:47.210073 | primary | Generating static files at /home/zuul/workspace/logs/ara...
0.000 | 9927: 2017-11-08 21:32:50.366120 | primary | Done.
0.616 | 9928: 2017-11-08 21:32:52.315138 | primary | *** FAILED with status: 2
0.435 | 9929: 2017-11-08 21:32:53.182868 | primary | ERROR
0.000 | 9930: 2017-11-08 21:32:53.188767 | primary | {
0.663 | 9931: 2017-11-08 21:32:53.188873 | primary | "delta": "2:09:39.844394",
0.000 | 9932: 2017-11-08 21:32:53.188963 | primary | "end": "2017-11-08 21:32:52.391061",
0.319 | 9933: 2017-11-08 21:32:53.189048 | primary | "failed": true,
0.000 | 9934: 2017-11-08 21:32:53.189131 | primary | "rc": 2,
0.298 | 9935: 2017-11-08 21:32:53.189214 | primary | "start": "2017-11-08 19:23:12.546667"
0.000 | 9936: 2017-11-08 21:32:53.189295 | primary | }

0.000 | 14372: 2017-11-08 21:33:07.987586 | primary | >f+++++++++ logs/subnode-2/var/log/audit/audit.log.1.gz
0.000 | 14373: 2017-11-08 21:33:07.987661 | primary | >f+++++++++ logs/subnode-2/var/log/audit/audit.log.txt.gz
0.101 | 14374: 2017-11-08 21:33:07.987737 | primary | cd+++++++++ logs/subnode-2/var/log/cinder/
0.326 | 14375: 2017-11-08 21:33:07.987813 | primary | >f+++++++++ logs/subnode-2/var/log/cinder/backup.log.txt.gz
0.091 | 14376: 2017-11-08 21:33:07.987889 | primary | >f+++++++++ logs/subnode-2/var/log/cinder/volume.log.txt.gz

logs/subnode-2/var/log/extra/docker/docker_allinfo.log.txt.gz
0.000 | 0285: Debug Mode (client): false
0.000 | 0286: Debug Mode (server): false
0.000 | 0287: Registry: https://index.docker.io/v1/
1.000 | 0288: WARNING: bridge-nf-call-iptables is disabled
0.000 | 0289: WARNING: bridge-nf-call-ip6tables is disabled

logs/undercloud/var/log/mistral/engine.log.txt.gz
0.000 | 5582: urViPtmOyn3zNVZjnlDJa2RyKnJsIu7B70bHx77m2+spaKnkuAZuwlsNjKPDO39i
0.000 | 5583: FjlbHYuLNHE1Rqp3sNiXOqrj5uQLbMbvuIpncAif7M/PUOGvvUfdbOVlblbrhkXX
0.000 | 5584: UttOdcIuVCwFKIgjxqEAgzFhbzn+svGf618CeoIkUQmM5QDmwTgH+KAVosAlGitq
1.000 | 5585: kGyiB+3z/kpMYhAB+Gbjw1VFQocnFQWvtjVF6MN7uhuljkyI+5VsMAXBtkgJDPhU
0.000 | 5586: Kz57hFTtTJDZt5WV1D31uKmQ76t+Xs6uCcxyt+d6CYLnOkAWNGATxN0PiL0OTPkq

0.000 | 5597: Be506xYS6qZVDsTA0HQr29FpLiGtH2RHju4P3GOeGkfGTSVRNhJbEPhyoHFyPYfP
0.000 | 5598: lmeAk9AN/88nLjbAZyuNtszTFZQMnx30mvwZ64cjw0+tQYxJ7m0CggEBAO8yCGNl
0.000 | 5599: CfWH9nwPOd/ZTRHjC+ZjrnFunaUjsEu5OJhlnvrigac1nzFlLmJJ43rxgsHMkV60
1.000 | 5600: Txm7/W+qsqwN9NuQ6Ktyf9DeOy8/MRghQZzwcBfO+nIgEhz+pfH4sXOlXqMYCI6W
0.000 | 5601: USI1v7WvY8epy4k8zotbQp/vuv834kBFK5Uz8VRZUfGsg60aRGySCTVRGaaSlZWt

logs/undercloud/var/log/mistral/executor.log.txt.gz
0.000 | 4607: urViPtmOyn3zNVZjnlDJa2RyKnJsIu7B70bHx77m2+spaKnkuAZuwlsNjKPDO39i
0.000 | 4608: FjlbHYuLNHE1Rqp3sNiXOqrj5uQLbMbvuIpncAif7M/PUOGvvUfdbOVlblbrhkXX
0.000 | 4609: UttOdcIuVCwFKIgjxqEAgzFhbzn+svGf618CeoIkUQmM5QDmwTgH+KAVosAlGitq
1.000 | 4610: kGyiB+3z/kpMYhAB+Gbjw1VFQocnFQWvtjVF6MN7uhuljkyI+5VsMAXBtkgJDPhU
0.000 | 4611: Kz57hFTtTJDZt5WV1D31uKmQ76t+Xs6uCcxyt+d6CYLnOkAWNGATxN0PiL0OTPkq

0.000 | 4622: Be506xYS6qZVDsTA0HQr29FpLiGtH2RHju4P3GOeGkfGTSVRNhJbEPhyoHFyPYfP
0.000 | 4623: lmeAk9AN/88nLjbAZyuNtszTFZQMnx30mvwZ64cjw0+tQYxJ7m0CggEBAO8yCGNl
0.000 | 4624: CfWH9nwPOd/ZTRHjC+ZjrnFunaUjsEu5OJhlnvrigac1nzFlLmJJ43rxgsHMkV60
1.000 | 4625: Txm7/W+qsqwN9NuQ6Ktyf9DeOy8/MRghQZzwcBfO+nIgEhz+pfH4sXOlXqMYCI6W
0.000 | 4626: USI1v7WvY8epy4k8zotbQp/vuv834kBFK5Uz8VRZUfGsg60aRGySCTVRGaaSlZWt

logs/subnode-2/var/log/extra/docker/containers/nova_scheduler/log/nova/nova-compute.log.txt.gz
0.000 | 1428: <source bridge="br-int"/>
0.000 | 1429: <target dev="tap66b0ab75-0c"/>
0.000 | 1430: <virtualport type="openvswitch">
1.000 | 1431: <parameters interfaceid="66b0ab75-0cd5-4b4d-bb0c-af39371071cc"/>
0.000 | 1432: </virtualport>

logs/subnode-2/var/log/containers/nova/nova-compute.log.txt.gz
0.000 | 1428: <source bridge="br-int"/>
0.000 | 1429: <target dev="tap66b0ab75-0c"/>
0.000 | 1430: <virtualport type="openvswitch">
1.000 | 1431: <parameters interfaceid="66b0ab75-0cd5-4b4d-bb0c-af39371071cc"/>
0.000 | 1432: </virtualport>

logs/subnode-2/var/log/extra/docker/containers/nova_metadata/log/nova/nova-compute.log.txt.gz
0.000 | 1428: <source bridge="br-int"/>
0.000 | 1429: <target dev="tap66b0ab75-0c"/>
0.000 | 1430: <virtualport type="openvswitch">
1.000 | 1431: <parameters interfaceid="66b0ab75-0cd5-4b4d-bb0c-af39371071cc"/>
0.000 | 1432: </virtualport>

logs/subnode-2/var/log/containers/neutron/neutron-openvswitch-agent.log.txt.gz
0.000 | 2540: 2017-11-08 21:14:13.353 113692 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:110
0.000 | 2541: 2017-11-08 21:14:13.354 113692 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn command(idx=0): DbGetCommand(column=other_config, table=Port, record=tap66b0ab75-0c) do_commit /usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:84
0.000 | 2542: 2017-11-08 21:14:13.354 113692 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:110
0.502 | 2543: 2017-11-08 21:14:13.356 113692 DEBUG neutron.agent.linux.openvswitch_firewall.firewall [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] Creating flow rules for port 66b0ab75-0cd5-4b4d-bb0c-af39371071cc that is port 15 in OVS add_flows_from_rules /usr/lib/python2.7/site-packages/neutron/agent/linux/openvswitch_firewall/firewall.py:1010
0.563 | 2544: 2017-11-08 21:14:13.356 113692 DEBUG neutron.agent.linux.openvswitch_firewall.firewall [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] RULGEN: Rules generated for flow {'ethertype': u'IPv6', 'direction': u'egress'} are [{'priority': 70, 'dl_type': 34525, 'actions': 'resubmit(,73)', 'reg_port': 15, 'table': 72}] add_flows_from_rules /usr/lib/python2.7/site-packages/neutron/agent/linux/openvswitch_firewall/firewall.py:1014
0.000 | 2545: 2017-11-08 21:14:13.357 113692 DEBUG neutron.agent.linux.openvswitch_firewall.firewall [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] RULGEN: Rules generated for flow {'ethertype': u'IPv4', 'direction': u'egress'} are [{'priority': 70, 'dl_type': 2048, 'actions': 'resubmit(,73)', 'reg_port': 15, 'table': 72}] add_flows_from_rules /usr/lib/python2.7/site-packages/neutron/agent/linux/openvswitch_firewall/firewall.py:1014
0.563 | 2546: 2017-11-08 21:14:13.357 113692 DEBUG neutron.agent.linux.openvswitch_firewall.firewall [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] RULGEN: Rules generated for flow {'ethertype': u'IPv4', 'direction': u'ingress', 'source_ip_prefix': '0.0.0.0/0', 'protocol': 1} are [{'dl_type': 2048, 'reg_port': 15, 'actions': 'output:15', 'priority': 70, 'nw_proto': 1, 'table': 82}] add_flows_from_rules /usr/lib/python2.7/site-packages/neutron/agent/linux/openvswitch_firewall/firewall.py:1014
0.563 | 2547: 2017-11-08 21:14:13.358 113692 DEBUG neutron.agent.linux.openvswitch_firewall.firewall [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] RULGEN: Rules generated for flow {'direction': u'ingress', 'protocol': 6, 'ethertype': u'IPv4', 'port_range_max': 22, 'source_ip_prefix': '0.0.0.0/0', 'port_range_min': 22} are [{'dl_type': 2048, 'reg_port': 15, 'nw_proto': 6, 'tcp_dst': '0x0016', 'table': 82, 'actions': 'output:15', 'priority': 70}] add_flows_from_rules /usr/lib/python2.7/site-packages/neutron/agent/linux/openvswitch_firewall/firewall.py:1014
0.000 | 2548: 2017-11-08 21:14:13.359 113692 DEBUG neutron.agent.linux.utils [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] Running command (rootwrap daemon): ['ovs-ofctl', 'add-flows', '-O', 'OpenFlow10', 'br-int', '-'] execute_rootwrap_daemon /usr/lib/python2.7/site-packages/neutron/agent/linux/utils.py:108

0.000 | 3318: 2017-11-08 21:19:43.398 113692 DEBUG neutron.plugins.ml2.drivers.openvswitch.agent.extension_drivers.qos_driver [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] delete_bandwidth_limit_ingress was received for port 0c771f1c-e3b9-4811-b9aa-ddb2a9799f24 but vif_port was not found. It seems that port is already deleted delete_bandwidth_limit_ingress /usr/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/extension_drivers/qos_driver.py:79
0.000 | 3319: 2017-11-08 21:19:43.398 113692 INFO neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] port_unbound(): net_uuid None not managed by VLAN manager
0.000 | 3320: 2017-11-08 21:19:43.399 113692 INFO neutron.agent.securitygroups_rpc [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] Remove device filter for ['0c771f1c-e3b9-4811-b9aa-ddb2a9799f24']
0.530 | 3321: 2017-11-08 21:19:43.400 113692 DEBUG neutron.agent.linux.openvswitch_firewall.firewall [req-8a760281-5495-43e0-957e-de04bc82ad1f - - - - -] Port 0c771f1c-e3b9-4811-b9aa-ddb2a9799f24 is not handled by the firewall. _remove_egress_no_port_security /usr/lib/python2.7/site-packages/neutron/agent/linux/openvswitch_firewall/firewall.py:677
0.000 | 3322: 2017-11-08 21:19:43.401 113692 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn command(idx=0): ListPortsCommand(bridge=br-int) do_commit /usr/lib/python2.7/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:84

job-output.json.gz
0.000 | 96242: "TASK [validate-tempest : Tempest failed if rc code is not 0] *******************",
0.000 | 96243: "task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-status.yml:2",
0.000 | 96244: "Wednesday 08 November 2017 21:24:55 +0000 (0:00:00.678) 1:49:36.171 **** ",
0.267 | 96245: "ok: [undercloud] => {\"ansible_facts\": {\"tempest_status\": \"failed\"}, \"changed\": false}",
0.000 | 96246: "",

0.000 | 96258: "task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-status.yml:16",
0.000 | 96259: "Wednesday 08 November 2017 21:24:55 +0000 (0:00:00.107) 1:49:36.696 **** ",
0.000 | 96260: "ok: [undercloud] => {",
0.341 | 96261: " \"tempest_status\": \"failed\"",
0.000 | 96262: "}",

0.000 | 96306: "TASK [validate-tempest : Exit with tempest result code if configured] **********",
0.000 | 96307: "task path: /home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/validate-tempest/tasks/tempest-results.yml:60",
0.000 | 96308: "Wednesday 08 November 2017 21:25:06 +0000 (0:00:02.655) 1:49:47.506 **** ",
0.618 | 96309: "fatal: [undercloud]: FAILED! => {\"changed\": true, \"cmd\": \"tail -10 tempest_output.log; exit 1\", \"delta\": \"0:00:00.006487\", \"end\": \"2017-11-08 21:25:06.982109\", \"failed\": true, \"rc\": 1, \"start\": \"2017-11-08 21:25:06.975622\", \"stderr\": \"\", \"stdout\": \"2017-11-08 21:24:31 | ==============\
0.618 | 96309: 2017-11-08 21:24:31 | - Worker 0 (2 tests) => 0:10:29.829658\
0.618 | 96309: 2017-11-08 21:24:31 | - Worker 1 (1 tests) => 0:06:13.937103\
0.618 | 96309: 2017-11-08 21:24:31 | \
0.618 | 96309: 2017-11-08 21:24:31 | Slowest Tests:\
0.618 | 96309: 2017-11-08 21:24:31 | Test id Runtime (s)\
0.618 | 96309: 2017-11-08 21:24:31 | ------------------------------------------------------------------------------------------------------------------------------------------------------ -----------\
0.618 | 96309: 2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_autoscaling 614.010\
0.618 | 96309: 2017-11-08 21:24:31 | tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,volume] 373.937\
0.618 | 96309: 2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_aodh_gnocchi_threshold_alarm 15.819\", \"stdout_lines\": [\"2017-11-08 21:24:31 | ==============\", \"2017-11-08 21:24:31 | - Worker 0 (2 tests) => 0:10:29.829658\", \"2017-11-08 21:24:31 | - Worker 1 (1 tests) => 0:06:13.937103\", \"2017-11-08 21:24:31 | \", \"2017-11-08 21:24:31 | Slowest Tests:\", \"2017-11-08 21:24:31 | Test id Runtime (s)\", \"2017-11-08 21:24:31 | ------------------------------------------------------------------------------------------------------------------------------------------------------ -----------\", \"2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_autoscaling 614.010\", \"2017-11-08 21:24:31 | tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern[compute,id-557cd2c2-4eb8-4dce-98be-f86765ff311b,image,volume] 373.937\", \"2017-11-08 21:24:31 | ceilometer.tests.tempest.scenario.test_telemetry_integration.TestTelemetryIntegration.test_aodh_gnocchi_threshold_alarm 15.819\"], \"warnings\": []}",
0.000 | 96310: "",

0.000 | 96358: "/home/zuul/workspace/.quickstart/usr/local/share/ansible/roles/install-built-repo/tasks/install_built_repo.yml:1 ",
0.000 | 96359: "+(./toci_quickstart.sh:72): exit_value=2",
0.000 | 96360: "+(./toci_quickstart.sh:74): [[ 2 == 0 ]]",
0.297 | 96361: "+(./toci_quickstart.sh:74): echo 'Playbook run failed'",
0.461 | 96362: "Playbook run failed",
0.000 | 96363: "+(./toci_quickstart.sh:78): sed -i 's/hosts: all:!localhost/hosts: all:!localhost:!127.0.0.2/' /home/zuul/workspace/.quickstart/playbooks/collect-logs.yml",

0.000 | 96408: "+(./toci_quickstart.sh:115): echo 'Quickstart completed.'",
0.000 | 96409: "Quickstart completed.",
0.000 | 96410: "+(./toci_quickstart.sh:116): exit 2",
0.589 | 96411: "ERROR: the main setup script run by this job failed - exit code: 2",
0.689 | 96412: " please look at the relevant log files to determine the root cause",
0.349 | 96413: "Running devstack worlddump.py",
0.000 | 96414: "Cleaning up host",

0.000 | 96456: " \"cd+++++++++ sudoers.d/\", ",
0.000 | 96457: " \">f+++++++++ sudoers.d/50_stack_sh.txt.gz\", ",
0.000 | 96458: " \">f+++++++++ sudoers.d/51_tempest_sh.txt.gz\", ",
0.267 | 96459: " \">f+++++++++ sudoers.d/cinder.txt.gz\", ",
0.000 | 96460: " \">f+++++++++ sudoers.d/jenkins-sudo-grep.txt.gz\", ",

0.000 | 96466: "}",
0.000 | 96467: "Generating static files at /home/zuul/workspace/logs/ara...",
0.000 | 96468: "Done.",
0.678 | 96469: "*** FAILED with status: 2"
0.000 | 96470: ],

0.000 | 98874: ">f+++++++++ logs/subnode-2/etc/ceph/rbdmap.gz",
0.000 | 98875: "cd+++++++++ logs/subnode-2/etc/ci/",
0.000 | 98876: ">f+++++++++ logs/subnode-2/etc/ci/mirror_info.sh.txt.gz",
0.220 | 98877: "cd+++++++++ logs/subnode-2/etc/cinder/",
0.098 | 98878: ">f+++++++++ logs/subnode-2/etc/cinder/api-paste.ini.txt.gz",

0.000 | 101079: ">f+++++++++ logs/subnode-2/var/log/audit/audit.log.1.gz",
0.000 | 101080: ">f+++++++++ logs/subnode-2/var/log/audit/audit.log.txt.gz",
0.116 | 101081: "cd+++++++++ logs/subnode-2/var/log/cinder/",
0.300 | 101082: ">f+++++++++ logs/subnode-2/var/log/cinder/backup.log.txt.gz",
0.096 | 101083: ">f+++++++++ logs/subnode-2/var/log/cinder/volume.log.txt.gz",

logs/undercloud/home/zuul/skip_file.gz
0.000 | 0060: tempest.api.identity.admin.v3.test_roles.RolesV3TestJSON.test_implied_roles_create_check_show_delete
0.000 | 0061: # test is failing, it is being investigated
0.000 | 0062: # https://bugs.launchpad.net/tripleo/+bug/1714660
0.597 | 0063: tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_mtu_sized_frames

logs/subnode-2/var/log/secure.txt.gz
0.000 | 2386: Nov 8 20:54:35 centos-7-rax-iad-0000787869 su: pam_unix(su:session): session closed for user rabbitmq
0.137 | 2387: Nov 8 20:54:42 centos-7-rax-iad-0000787869 groupadd[67502]: group added to /etc/group: name=cinder, GID=165
0.159 | 2388: Nov 8 20:54:42 centos-7-rax-iad-0000787869 groupadd[67502]: group added to /etc/gshadow: name=cinder
0.255 | 2389: Nov 8 20:54:42 centos-7-rax-iad-0000787869 groupadd[67502]: new group: name=cinder, GID=165
0.297 | 2390: Nov 8 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: new user: name=cinder, UID=165, GID=165, home=/var/lib/cinder, shell=/sbin/nologin
0.162 | 2391: Nov 8 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to group 'nobody'
0.290 | 2392: Nov 8 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to group 'cinder'
0.125 | 2393: Nov 8 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to shadow group 'nobody'
0.346 | 2394: Nov 8 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to shadow group 'cinder'
0.000 | 2395: Nov 8 20:54:46 centos-7-rax-iad-0000787869 su: pam_systemd(su:session): Failed to connect to system bus: No such file or directory

0.000 | 2850: Nov 8 21:06:11 centos-7-rax-iad-0000787869 su: pam_unix(su:session): session closed for user rabbitmq
0.124 | 2851: Nov 8 21:06:16 centos-7-rax-iad-0000787869 groupadd[109211]: group added to /etc/group: name=ceph, GID=167
0.145 | 2852: Nov 8 21:06:16 centos-7-rax-iad-0000787869 groupadd[109211]: group added to /etc/gshadow: name=ceph
0.236 | 2853: Nov 8 21:06:16 centos-7-rax-iad-0000787869 groupadd[109211]: new group: name=ceph, GID=167
0.276 | 2854: Nov 8 21:06:16 centos-7-rax-iad-0000787869 useradd[109215]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin
0.000 | 2855: Nov 8 21:06:23 centos-7-rax-iad-0000787869 su: pam_systemd(su:session): Failed to connect to system bus: No such file or directory

0.011 | 5155: Nov 8 21:27:15 centos-7-rax-iad-0000787869 sudo: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/sbin/pcs stonith show --full
0.000 | 5156: Nov 8 21:27:16 centos-7-rax-iad-0000787869 sudo: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.016 | 5157: Nov 8 21:27:16 centos-7-rax-iad-0000787869 sudo: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/sbin/crm_verify -L -VVVVVV
0.444 | 5158: Nov 8 21:27:16 centos-7-rax-iad-0000787869 sudo: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/ceph status
0.000 | 5159: Nov 8 21:27:16 centos-7-rax-iad-0000787869 sudo: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/facter

logs/subnode-2/var/log/openvswitch/ovs-vswitchd.log.txt.gz
0.000 | 0142: 2017-11-08T21:14:22.001Z|00142|connmgr|INFO|br-int<->unix: 54 flow_mods in the last 0 s (54 adds)
0.000 | 0143: 2017-11-08T21:14:24.581Z|00143|connmgr|INFO|br-tun<->tcp:127.0.0.1:6633: 2 flow_mods in the 1 s starting 53 s ago (2 adds)
0.000 | 0144: 2017-11-08T21:14:28.623Z|00144|connmgr|INFO|br-int<->tcp:127.0.0.1:6633: 10 flow_mods in the 52 s starting 59 s ago (10 deletes)
0.428 | 0145: 2017-11-08T21:19:46.803Z|00145|ofproto|WARN|br-int: cannot get STP status on nonexistent port 15
0.428 | 0146: 2017-11-08T21:19:46.803Z|00146|ofproto|WARN|br-int: cannot get RSTP status on nonexistent port 15
0.000 | 0147: 2017-11-08T21:19:46.805Z|00147|bridge|INFO|bridge br-int: deleted interface tap66b0ab75-0c on port 15
0.293 | 0148: 2017-11-08T21:19:46.815Z|00148|bridge|WARN|could not open network device tap66b0ab75-0c (No such device)
0.000 | 0149: 2017-11-08T21:19:47.592Z|00149|connmgr|INFO|br-int<->unix: 1 flow_mods in the last 0 s (1 deletes)

logs/subnode-2/openvswitch/ovs-vswitchd.txt.gz
0.000 | 0142: 2017-11-08T21:14:22.001Z|00142|connmgr|INFO|br-int<->unix: 54 flow_mods in the last 0 s (54 adds)
0.000 | 0143: 2017-11-08T21:14:24.581Z|00143|connmgr|INFO|br-tun<->tcp:127.0.0.1:6633: 2 flow_mods in the 1 s starting 53 s ago (2 adds)
0.000 | 0144: 2017-11-08T21:14:28.623Z|00144|connmgr|INFO|br-int<->tcp:127.0.0.1:6633: 10 flow_mods in the 52 s starting 59 s ago (10 deletes)
0.428 | 0145: 2017-11-08T21:19:46.803Z|00145|ofproto|WARN|br-int: cannot get STP status on nonexistent port 15
0.428 | 0146: 2017-11-08T21:19:46.803Z|00146|ofproto|WARN|br-int: cannot get RSTP status on nonexistent port 15
0.000 | 0147: 2017-11-08T21:19:46.805Z|00147|bridge|INFO|bridge br-int: deleted interface tap66b0ab75-0c on port 15
0.293 | 0148: 2017-11-08T21:19:46.815Z|00148|bridge|WARN|could not open network device tap66b0ab75-0c (No such device)
0.000 | 0149: 2017-11-08T21:19:47.592Z|00149|connmgr|INFO|br-int<->unix: 1 flow_mods in the last 0 s (1 deletes)

logs/subnode-2/var/log/journal.txt.gz
0.000 | 7826: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: changed: [localhost] => (item={'key': u'step_1', 'value': {u'mysql_image_tag': {u'start_order': 2, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-mariadb:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'mysql_data_ownership': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'chown', u'-R', u'mysql:', u'/var/lib/mysql'], u'user': u'root', u'volumes': [u'/var/lib/mysql:/var/lib/mysql'], u'net': u'host', u'detach': False}, u'memcached_init_logs': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-memcached:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'source /etc/sysconfig/memcached; touch /var/log/memcached.log && chown ${USER} /var/log/memcached.log'], u'user': u'root', u'volumes': [u'/var/lib/config-data/memcached/etc/sysconfig/memcached:/etc/sysconfig/memcached:ro', u'/var/log/containers/memcached:/var/log/'], u'detach': False, u'privileged': False}, u'redis_image_tag': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-redis:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-redis:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-redis:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro',
0.000 | 7827: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'mysql_bootstrap': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS', u'KOLLA_BOOTSTRAP=True', u'KOLLA_KUBERNETES=True', u'DB_MAX_TIMEOUT=60', u'DB_CLUSTERCHECK_PASSWORD=AfXTmCmAbpj8qFgpMZFdGhZym', u'DB_ROOT_PASSWORD=HDARRY0WrF'], u'command': [u'bash', u'-ecx', u'if [ -e /var/lib/mysql/mysql ]; then exit 0; fi
0.000 | 7827: echo -e "\
0.000 | 7827: [mysqld]\
0.000 | 7827: wsrep_provider=none" >> /etc/my.cnf
0.000 | 7827: sudo -u mysql -E kolla_start
0.000 | 7827: mysqld_safe --skip-networking --wsrep-on=OFF &
0.000 | 7827: timeout ${DB_MAX_TIMEOUT} /bin/bash -c \'until mysqladmin -uroot -p"${DB_ROOT_PASSWORD}" ping 2>/dev/null; do sleep 1; done\'
0.000 | 7827: mysql -uroot -p"${DB_ROOT_PASSWORD}" -e "CREATE USER \'clustercheck\'@\'localhost\' IDENTIFIED BY \'${DB_CLUSTERCHECK_PASSWORD}\';"
0.000 | 7827: mysql -uroot -p"${DB_ROOT_PASSWORD}" -e "GRANT PROCESS ON *.* TO \'clustercheck\'@\'localhost\' WITH GRANT OPTION;"
0.000 | 7827: timeout ${DB_MAX_TIMEOUT} mysqladmin -uroot -p"${DB_ROOT_PASSWORD}" shutdown'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/lib/kolla/config_files/mysql.json:/var/lib/kolla/config_files/config.json', u'/var/lib/config-data/puppet-generated/mysql/:/var/lib/kolla/config_files/src:ro', u'/var/lib/mysql:/var/lib/mysql'], u'net': u'host', u'detach': False}, u'haproxy_image_tag': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos
0.000 | 7828: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: -binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'rabbitmq_image_tag': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'rabbitmq_bootstrap': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS', u'KOLLA_BOOTSTRAP=True', u'RABBITMQ_CLUSTER_COOKIE=6X4AV5uqrpsuRbEp23FL'], u'volumes': [u'/var/lib/kolla/config_files/rabbitmq.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/rabbitmq/:/var/lib/kolla/config_files/src:ro', u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/var/lib/rabbitmq:/var/lib/rabbitmq'], u'net': u'host', u'privileged': False}, u'memcached': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-memcached:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'
0.365 | 7829: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: command': [u'/bin/bash', u'-c', u'source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS >> /var/log/memcached.log 2>&1'], u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/lib/config-data/memcached/etc/sysconfig/memcached:/etc/sysconfig/memcached:ro', u'/var/log/containers/memcached:/var/log/'], u'net': u'host', u'privileged': False, u'restart': u'always'}}})
0.000 | 7830: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: changed: [localhost] => (item={'key': u'step_3', 'value': {u'nova_placement': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-placement-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-placement:/var/log/httpd', u'/var/lib/kolla/config_files/nova_placement.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/nova_placement/:/var/lib/kolla/config_files/src:ro', u'', u''], u'net': u'host', u'restart': u'always'}, u'nova_db_sync': {u'start_order': 3, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': u"/usr/bin/bootstrap_host_exec nova_api su nova -s /bin/bash -c '/usr/bin/nova-manage db sync'", u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd', u'/var/lib/config-data/nova/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-

0.000 | 7838: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: /containers/cinder:/var/log/cinder', u'/var/log/containers/httpd/cinder-api:/var/log/httpd'], u'net': u'host', u'detach': False, u'privileged': False}, u'nova_api_map_cell0': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': u"/usr/bin/bootstrap_host_exec nova_api su nova -s /bin/bash -c '/usr/bin/nova-manage cell_v2 map_cell0'", u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd', u'/var/lib/config-data/nova/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-data/nova/etc/nova/:/etc/nova/:ro'], u'net': u'host', u'detach': False}, u'glance_api_db_sync': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-glance-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_BOOTSTRAP=True', u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'command': u"/usr/bin/bootstrap_host_exec glance_api su glance -s /bin/bash -c '/usr/local/bin/kolla_start'", u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/glance:/var/log/glance', u
0.000 | 7839: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: '/var/log/containers/httpd/glance-api:/var/log/httpd', u'/var/lib/kolla/config_files/glance_api.json:/var/lib/kolla/config_files/config.json', u'/var/lib/config-data/puppet-generated/glance_api/:/var/lib/kolla/config_files/src:ro', u'/etc/ceph:/var/lib/kolla/config_files/src-ceph:ro', u''], u'net': u'host', u'detach': False, u'privileged': False}, u'nova_api_create_default_cell': {u'start_order': 2, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': u"/usr/bin/bootstrap_host_exec nova_api su nova -s /bin/bash -c '/usr/bin/nova-manage cell_v2 create_cell --name=default'", u'exit_codes': [0, 2], u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd', u'/var/lib/config-data/nova/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-data/nova/etc/nova/:/etc/nova/:ro'], u'net': u'host', u'detach': False, u'user': u'root'}, u'neutron_db_sync': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/usr/bin/bootstrap_host_exec', u'neutron_api', u'neutron-db-manage', u'upgrade', u'heads'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls
0.000 | 7840: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: /cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/neutron:/var/log/neutron', u'/var/log/containers/httpd/neutron-api:/var/log/httpd', u'/var/lib/config-data/neutron/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-data/neutron/etc/neutron:/etc/neutron:ro', u'/var/lib/config-data/neutron/usr/share/neutron:/usr/share/neutron:ro'], u'net': u'host', u'detach': False, u'privileged': False}, u'nova_virtlogd': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-libvirt:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'pid': u'host', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/nova_libvirt/:/var/lib/kolla/config_files/src:ro', u'/lib/modules:/lib/modules:ro', u'/dev:/dev', u'/run:/run', u'/sys/fs/cgroup:/sys/fs/cgroup', u'/var/lib/nova:/var/lib/nova:shared', u'/var/run/libvirt:/var/run/libvirt', u'/var/lib/libvirt:/var/lib/libvirt', u'/etc/libvirt/qemu:/etc/libvirt/qemu:ro', u'/var/log/libvirt/qemu:/var/log/libvirt/qemu'], u'net': u'host', u'privileged': True, u'restart': u'always'}, u'sensu_client': {u'healthcheck': {u'test': u'/openstack/healthcheck'}, u'image': 
u'192.168.24.1:8787/tripleomaster/centos-binary-sensu-client:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/ho
0.269 | 7841: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: sts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw', u'/var/lib/kolla/config_files/sensu-client.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/sensu/:/var/lib/kolla/config_files/src:ro', u'/var/log/containers/sensu:/var/log/sensu:rw'], u'net': u'host', u'privileged': True, u'restart': u'always'}, u'keystone_bootstrap': {u'action': u'exec', u'start_order': 3, u'command': [u'keystone', u'/usr/bin/bootstrap_host_exec', u'keystone', u'keystone-manage', u'bootstrap', u'--bootstrap-password', u'fkUCscvWTfR2ydGbkvyzg8Tas'], u'user': u'root'}}})
0.000 | 7842: Nov 08 20:37:13 centos-7-rax-iad-0000787869 os-collect-config[5169]: changed: [localhost] => (item={'key': u'step_2', 'value': {u'gnocchi_init_log': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R gnocchi:gnocchi /var/log/gnocchi'], u'user': u'root', u'volumes': [u'/var/log/containers/gnocchi:/var/log/gnocchi', u'/var/log/containers/httpd/gnocchi-api:/var/log/httpd']}, u'cinder_scheduler_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-cinder-scheduler:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R cinder:cinder /var/log/cinder'], u'privileged': False, u'volumes': [u'/var/log/containers/cinder:/var/log/cinder'], u'user': u'root'}, u'neutron_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R neutron:neutron /var/log/neutron'], u'privileged': False, u'volumes': [u'/var/log/containers/neutron:/var/log/neutron', u'/var/log/containers/httpd/neutron-api:/var/log/httpd'], u'user': u'root'}, u'nova_api_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R nova:nova /var/log/nova'], u'privileged': False, u'volumes': [u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd'], u'user': u'root'}, u'congress_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-congress-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R congress:congress /var/log/congress'], u'privileged': False, u'volumes': [u'/var/log/containers/congress:/var/log/congress'], u'user': u'root'}, u'clustercheck': {u'start_order': 1, u'image': 
u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRAT
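The container definitions logged above are Python dict reprs (note the `u''` strings) of the paunch/kolla startup config. A minimal sketch of how such a blob can be parsed safely for inspection, using `ast.literal_eval` so nothing is executed; the `nova_virtlogd` fragment below is a trimmed stand-in shaped like the real entries, not the full logged config:

```python
import ast

# Trimmed stand-in for one of the container definitions logged above
# (same shape: dict repr with u'' strings, booleans, nested lists).
blob = ("{u'nova_virtlogd': {u'start_order': 0, u'net': u'host', "
        "u'privileged': True, u'volumes': [u'/dev:/dev', u'/run:/run']}}")

# literal_eval parses Python literals without evaluating code,
# which is the safe way to re-read these logged reprs.
config = ast.literal_eval(blob)
for name, svc in config.items():
    print(name, svc.get('net'), len(svc.get('volumes', [])))
```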

0.002 | 19051: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: g: hiera(): Looking up ntp::service_name in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::service_provider in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::stepout in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tinker in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tos in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tos_minclock in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tos_minsane in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tos_floor in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tos_ceiling in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::tos_cohort in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::udlc in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::udlc_stratum in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::ntpsigndsocket in JSON backend\",
0.002 | 19051: \"Debug: hiera(): Looking up ntp::authprov in JSON backend\",
0.002 | 19051: \"Debug: importing '/etc/puppet/modules/ntp/manifests/install.pp' in environment production\",
0.002 | 19051: \"Debug: Automatically imported ntp::install from ntp/install into production\",
0.002 | 19051: \"Debug: importing '/etc/puppet/modules/ntp/manifests/config.pp' in environment production\",
0.002 | 19051: \"Debug: Automatically imported ntp::config from ntp/config into production\",
0.002 | 19051: \"Debug: Scope(Class[Ntp::Config]): Retrieving template ntp/ntp.conf.erb\",
0.002 | 19051: \"Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.erb]: Bound template variables for /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds\",
0.002 | 19051: \"Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.erb]: Interpolated template /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds\",
0.002 | 19051: \"Debug: importing '/etc/puppet/modules/ntp/manifests/service.pp' in environment production\",
0.002 | 19051: \"Debug: Automatically imported ntp::service from ntp/service into production\",
0.002 | 19051: \"Debug: import
0.002 | 19052: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: ing '/etc/puppet/modules/tripleo/manifests/profile/base/snmp.pp' in environment production\",
0.002 | 19052: \"Debug: Automatically imported tripleo::profile::base::snmp from tripleo/profile/base/snmp into production\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_config in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_password in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_user in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::step in JSON backend\",
0.002 | 19052: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/sshd.pp' in environment production\",
0.002 | 19052: \"Debug: Automatically imported tripleo::profile::base::sshd from tripleo/profile/base/sshd into production\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::bannertext in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::motd in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::options in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::port in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up ssh:server::options in JSON backend\",
0.002 | 19052: \"Debug: importing '/etc/puppet/modules/ssh/manifests/init.pp' in environment production\",
0.002 | 19052: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server.pp' in environment production\",
0.002 | 19052: \"Debug: Automatically imported ssh::server from ssh/server into production\",
0.002 | 19052: \"Debug: importing '/etc/puppet/modules/ssh/manifests/params.pp' in environment production\",
0.002 | 19052: \"Debug: Automatically imported ssh::params from ssh/params into production\",
0.002 | 19052: \"Debug: hiera(): Looking up ssh::server::ensure in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up ssh::server::validate_sshd_file in JSON backend\",
0.002 | 19052: \"Debug: hiera(): Looking up ssh::server::use_augeas in JSON backend\",
0.005 | 19053: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]:
0.005 | 19053: \"Debug: hiera(): Looking up ssh::server::options_absent in JSON backend\",
0.005 | 19053: \"Debug: hiera(): Looking up ssh::server::match_block in JSON backend\",
0.005 | 19053: \"Debug: hiera(): Looking up ssh::server::use_issue_net in JSON backend\",
0.005 | 19053: \"Debug: hiera(): Looking up ssh::server::options in JSON backend\",
0.005 | 19053: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/install.pp' in environment production\",
0.005 | 19053: \"Debug: Automatically imported ssh::server::install from ssh/server/install into production\",
0.005 | 19053: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/config.pp' in environment production\",
0.005 | 19053: \"Debug: Automatically imported ssh::server::config from ssh/server/config into production\",
0.005 | 19053: \"Debug: importing '/etc/puppet/modules/concat/manifests/init.pp' in environment production\",
0.005 | 19053: \"Debug: importing '/etc/puppet/modules/stdlib/manifests/init.pp' in environment production\",
0.005 | 19053: \"Debug: Automatically imported concat from concat into production\",
0.005 | 19053: \"Debug: Scope(Class[Ssh::Server::Config]): Retrieving template ssh/sshd_config.erb\",
0.005 | 19053: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Bound template variables for /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.005 | 19053: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Interpolated template /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.005 | 19053: \"Debug: importing '/etc/puppet/modules/concat/manifests/fragment.pp' in environment production\",
0.005 | 19053: \"Debug: Automatically imported concat::fragment from concat/fragment into production\",
0.005 | 19053: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/service.pp' in environment production\",
0.005 | 19053: \"Debug: Automatically imported ssh::server::service from ssh/server/service into production\",
0.005 | 19053: \"Debug: hiera(): Looking up ssh::server::service::ensure in JSON backend\",
0.005 | 19053: \"Debug: hiera(): Looki
0.231 | 19054: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: ng up ssh::server::service::enable in JSON backend\",
0.231 | 19054: \"Debug: importing '/etc/puppet/modules/timezone/manifests/init.pp' in environment production\",
0.231 | 19054: \"Debug: Automatically imported timezone from timezone into production\",
0.231 | 19054: \"Debug: importing '/etc/puppet/modules/timezone/manifests/params.pp' in environment production\",
0.231 | 19054: \"Debug: Automatically imported timezone::params from timezone/params into production\",
0.231 | 19054: \"Debug: hiera(): Looking up timezone::ensure in JSON backend\",
0.231 | 19054: \"Debug: hiera(): Looking up timezone::timezone in JSON backend\",
0.231 | 19054: \"Debug: hiera(): Looking up timezone::hwutc in JSON backend\",
0.231 | 19054: \"Debug: hiera(): Looking up timezone::autoupgrade in JSON backend\",
0.231 | 19054: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup/ceph.pp' in environment production\",
0.231 | 19054: \"Debug: Automatically imported tripleo::profile::base::cinder::backup::ceph from tripleo/profile/base/cinder/backup/ceph into production\",
0.231 | 19054: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::ceph::step in JSON backend\",
0.231 | 19054: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup.pp' in environment production\",
0.231 | 19054: \"Debug: Automatically imported tripleo::profile::base::cinder::backup from tripleo/profile/base/cinder/backup into production\",
0.231 | 19054: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::step in JSON backend\",
0.231 | 19054: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder.pp' in environment production\",
0.231 | 19054: \"Debug: Automatically imported tripleo::profile::base::cinder from tripleo/profile/base/cinder into production\",
0.231 | 19054: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::bootstrap_node in JSON backend\",
0.231 | 19054: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::cinder_enable_db_purge in JSON backend\",
0.231 | 19054: \"Debug: hiera(): Looking up tripleo::profile::base
0.074 | 19055: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: ::cinder::step in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_proto in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_hosts in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_password in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_port in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_username in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_proto in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_hosts in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_password in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_port in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_username in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_use_ssl in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up bootstrap_nodeid in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up messaging_rpc_service_name in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up rabbitmq_node_names in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up cinder::rabbit_password in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up cinder::rabbit_port in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up cinder::rabbit_userid in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up messaging_notify_service_name in JSON backend\",
0.074 | 19055: \"Debug: hiera(): Looking up cinder::rabbit_use_ssl in JSON backend\",
0.074 | 19055: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production\",
0.074 | 19055: \"Debug: Auto
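Each "hiera(): Looking up … in JSON backend" line above is a key lookup against JSON data files. A toy illustration of that lookup pattern, assuming a single flat data dict (real hiera walks a hierarchy of files and merges results; this is not hiera's implementation):

```python
import json

# One flat JSON document standing in for the hierarchy of data files
# that hiera's JSON backend would consult.
data = json.loads('{"ntp::tinker": true, "ntp::stepout": 5}')

def lookup(key, default=None):
    """Return the value for a fully-qualified key, or the default."""
    return data.get(key, default)

print(lookup('ntp::tinker'))        # present in the data
print(lookup('ntp::udlc', False))   # missing, falls back to default
```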

0.000 | 19200: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: ss\",
0.000 | 19200: \"2017-11-08 20:42:53,913 INFO: 15363 -- Finished processing puppet configs for congress\",
0.000 | 19200: \"2017-11-08 20:42:53,914 INFO: 15363 -- Starting configuration of ceilometer using image 192.168.24.1:8787/tripleomaster/centos-binary-ceilometer-central:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19200: \"2017-11-08 20:42:53,914 INFO: 15363 -- Removing container: docker-puppet-ceilometer\",
0.000 | 19200: \"2017-11-08 20:42:53,944 INFO: 15363 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-ceilometer-central:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19200: \"2017-11-08 20:42:56,462 INFO: 15361 -- Removing container: docker-puppet-heat_api_cfn\",
0.000 | 19200: \"2017-11-08 20:42:56,520 INFO: 15361 -- Finished processing puppet configs for heat_api_cfn\",
0.000 | 19200: \"2017-11-08 20:42:56,521 INFO: 15361 -- Starting configuration of haproxy using image 192.168.24.1:8787/tripleomaster/centos-binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19200: \"2017-11-08 20:42:56,521 INFO: 15361 -- Removing container: docker-puppet-haproxy\",
0.000 | 19200: \"2017-11-08 20:42:56,554 INFO: 15361 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19200: \"2017-11-08 20:42:57,738 INFO: 15362 -- Removing container: docker-puppet-heat_api\",
0.000 | 19200: \"2017-11-08 20:42:57,793 INFO: 15362 -- Finished processing puppet configs for heat_api\",
0.000 | 19200: \"2017-11-08 20:42:57,793 INFO: 15362 -- Starting configuration of neutron using image 192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19200: \"2017-11-08 20:42:57,794 INFO: 15362 -- Removing container: docker-puppet-neutron\",
0.000 | 19200: \"2017-11-08 20:42:57,821 INFO: 15362 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19200: \"2017-11-08 20:43:15,436 INFO: 15363 -- Removing container: do
0.000 | 19201: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: cker-puppet-ceilometer\",
0.000 | 19201: \"2017-11-08 20:43:15,477 INFO: 15363 -- Finished processing puppet configs for ceilometer\",
0.000 | 19201: \"2017-11-08 20:43:15,477 INFO: 15363 -- Starting configuration of rabbitmq using image 192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19201: \"2017-11-08 20:43:15,478 INFO: 15363 -- Removing container: docker-puppet-rabbitmq\",
0.000 | 19201: \"2017-11-08 20:43:15,507 INFO: 15363 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19201: \"2017-11-08 20:43:20,586 INFO: 15361 -- Removing container: docker-puppet-haproxy\",
0.000 | 19201: \"2017-11-08 20:43:20,629 INFO: 15361 -- Finished processing puppet configs for haproxy\",
0.000 | 19201: \"2017-11-08 20:43:25,073 INFO: 15362 -- Removing container: docker-puppet-neutron\",
0.000 | 19201: \"2017-11-08 20:43:25,111 INFO: 15362 -- Finished processing puppet configs for neutron\",
0.000 | 19201: \"2017-11-08 20:43:25,111 INFO: 15362 -- Starting configuration of cinder using image 192.168.24.1:8787/tripleomaster/centos-binary-cinder-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19201: \"2017-11-08 20:43:25,112 INFO: 15362 -- Removing container: docker-puppet-cinder\",
0.000 | 19201: \"2017-11-08 20:43:25,138 INFO: 15362 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-cinder-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 19201: \"2017-11-08 20:43:44,298 INFO: 15363 -- Removing container: docker-puppet-rabbitmq\",
0.000 | 19201: \"2017-11-08 20:43:44,338 INFO: 15363 -- Finished processing puppet configs for rabbitmq\",
0.000 | 19201: \"2017-11-08 20:44:01,362 INFO: 15362 -- Removing container: docker-puppet-cinder\",
0.000 | 19201: \"2017-11-08 20:44:01,414 INFO: 15362 -- Finished processing puppet configs for cinder\"
0.000 | 19201: ],
0.000 | 19201: \"failed_when_result\": false
0.000 | 19201: }
0.000 | 19201:
0.000 | 19201: TASK [Check if /var/lib/hashed-tripleo-config/docker-container-startup-config-step_1.json exists] ***
0.000 | 19201: ok: [localhost]
0.000 | 19201:
0.000 | 19201: TASK
0.000 | 19202: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: [Start containers for step 1] *********************************************
0.000 | 19202: ok: [localhost]
0.000 | 19202:
0.000 | 19202: TASK [debug] *******************************************************************
0.000 | 19202: ok: [localhost] => {
0.000 | 19202: \"(outputs.stderr|default('')).split('\
0.000 | 19202: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 19202: \"stdout: \",
0.000 | 19202: \"stderr: \",
0.000 | 19202: \"stdout: 4f06c7de1a596d6b7a8450aa433baa20188c9b81a7813320fbb4c2225052e14d\",
0.000 | 19202: \"\",
0.000 | 19202: \"stdout: Installing MariaDB/MySQL system tables in '/var/lib/mysql' ...\",
0.000 | 19202: \"OK\",
0.000 | 19202: \"Filling help tables...\",
0.000 | 19202: \"Creating OpenGIS required SP-s...\",
0.000 | 19202: \"To start mysqld at boot time you have to copy\",
0.000 | 19202: \"support-files/mysql.server to the right place for your system\",
0.000 | 19202: \"PLEASE REMEMBER TO SET A PASSWORD FOR THE MariaDB root USER !\",
0.000 | 19202: \"To do so, start the server, then issue the following commands:\",
0.000 | 19202: \"'/usr/bin/mysqladmin' -u root password 'new-password'\",
0.000 | 19202: \"'/usr/bin/mysqladmin' -u root -h centos-7-rax-iad-0000787869 password 'new-password'\",
0.000 | 19202: \"Alternatively you can run:\",
0.000 | 19202: \"'/usr/bin/mysql_secure_installation'\",
0.000 | 19202: \"which will also give you the option of removing the test\",
0.000 | 19202: \"databases and anonymous user created by default. This is\",
0.000 | 19202: \"strongly recommended for production servers.\",
0.000 | 19202: \"See the MariaDB Knowledgebase at http://mariadb.com/kb or the\",
0.000 | 19202: \"MySQL manual for more instructions.\",
0.000 | 19202: \"You can start the MariaDB daemon with:\",
0.000 | 19202: \"cd '/usr' ; /usr/bin/mysqld_safe --datadir='/var/lib/mysql'\",
0.000 | 19202: \"You can test the MariaDB daemon with mysql-test-run.pl\",
0.000 | 19202: \"cd '/usr/mysql-test' ; perl mysql-test-run.pl\",
0.000 | 19202: \"Please report any problems at http://mariadb.org/jira\",
0.000 | 19202: \"The latest information about MariaDB is available at http://mariadb.org/.\",
0.000 | 19202: \"You can find additional information about the MySQL part at:\",
0.000 | 19202: \"http://dev.mysql.c
0.431 | 19203: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: om\",
0.431 | 19203: \"Consider joining MariaDB's strong and vibrant community:\",
0.431 | 19203: \"https://mariadb.org/get-involved/\",
0.431 | 19203: \"171108 20:44:18 mysqld_safe Logging to '/var/log/mariadb/mariadb.log'.\",
0.431 | 19203: \"171108 20:44:18 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql\",
0.431 | 19203: \"spawn mysql_secure_installation\\r\",
0.431 | 19203: \"\\r\",
0.431 | 19203: \"NOTE: RUNNING ALL PARTS OF THIS SCRIPT IS RECOMMENDED FOR ALL MariaDB\\r\",
0.431 | 19203: \" SERVERS IN PRODUCTION USE! PLEASE READ EACH STEP CAREFULLY!\\r\",
0.431 | 19203: \"In order to log into MariaDB to secure it, we'll need the current\\r\",
0.431 | 19203: \"password for the root user. If you've just installed MariaDB, and\\r\",
0.431 | 19203: \"you haven't set the root password yet, the password will be blank,\\r\",
0.431 | 19203: \"so you should just press enter here.\\r\",
0.431 | 19203: \"Enter current password for root (enter for none): \\r\",
0.431 | 19203: \"OK, successfully used password, moving on...\\r\",
0.431 | 19203: \"Setting the root password ensures that nobody can log into the MariaDB\\r\",
0.431 | 19203: \"root user without the proper authorisation.\\r\",
0.431 | 19203: \"Set root password? [Y/n] y\\r\",
0.431 | 19203: \"New password: \\r\",
0.431 | 19203: \"Re-enter new password: \\r\",
0.431 | 19203: \"Password updated successfully!\\r\",
0.431 | 19203: \"Reloading privilege tables..\\r\",
0.431 | 19203: \" ... Success!\\r\",
0.431 | 19203: \"By default, a MariaDB installation has an anonymous user, allowing anyone\\r\",
0.431 | 19203: \"to log into MariaDB without having to have a user account created for\\r\",
0.431 | 19203: \"them. This is intended only for testing, and to make the installation\\r\",
0.431 | 19203: \"go a bit smoother. You should remove them before moving into a\\r\",
0.431 | 19203: \"production environment.\\r\",
0.431 | 19203: \"Remove anonymous users? [Y/n] y\\r\",
0.431 | 19203: \"Normally, root should only be allowed to connect from 'localhost'. This\\r\",
0.431 | 19203: \"ensures that someone cannot guess at the root password from the network.\\r\",
0.431 | 19203: \"Disallow root l
0.464 | 19204: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: ogin remotely? [Y/n] n\\r\",
0.464 | 19204: \" ... skipping.\\r\",
0.464 | 19204: \"By default, MariaDB comes with a database named 'test' that anyone can\\r\",
0.464 | 19204: \"access. This is also intended only for testing, and should be removed\\r\",
0.464 | 19204: \"before moving into a production environment.\\r\",
0.464 | 19204: \"Remove test database and access to it? [Y/n] y\\r\",
0.464 | 19204: \" - Dropping test database...\\r\",
0.464 | 19204: \" - Removing privileges on test database...\\r\",
0.464 | 19204: \"Reloading the privilege tables will ensure that all changes made so far\\r\",
0.464 | 19204: \"will take effect immediately.\\r\",
0.464 | 19204: \"Reload privilege tables now? [Y/n] y\\r\",
0.464 | 19204: \"Cleaning up...\\r\",
0.464 | 19204: \"All done! If you've completed all of the above steps, your MariaDB\\r\",
0.464 | 19204: \"installation should now be secure.\\r\",
0.464 | 19204: \"Thanks for using MariaDB!\\r\",
0.464 | 19204: \"171108 20:44:21 mysqld_safe mysqld from pid file /var/lib/mysql/mariadb.pid ended\",
0.464 | 19204: \"171108 20:44:22 mysqld_safe Logging to '/var/log/mariadb/mariadb.log'.\",
0.464 | 19204: \"171108 20:44:22 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql\",
0.464 | 19204: \"mysqld is alive\",
0.464 | 19204: \"171108 20:44:25 mysqld_safe mysqld from pid file /var/lib/mysql/mariadb.pid ended\",
0.464 | 19204: \"stderr: + '[' -e /var/lib/mysql/mysql ']'\",
0.464 | 19204: \"+ echo -e '\\\
0.464 | 19204: [mysqld]\\\
0.464 | 19204: wsrep_provider=none'\",
0.464 | 19204: \"+ sudo -u mysql -E kolla_start\",
0.464 | 19204: \"+ sudo -E kolla_set_configs\",
0.464 | 19204: \"INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json\",
0.464 | 19204: \"INFO:__main__:Validating config file\",
0.464 | 19204: \"INFO:__main__:Kolla config strategy set to: COPY_ALWAYS\",
0.464 | 19204: \"INFO:__main__:Copying service configuration files\",
0.464 | 19204: \"INFO:__main__:Copying /dev/null to /etc/libqb/force-filesystem-sockets\",
0.464 | 19204: \"INFO:__main__:Setting permission for /etc/libqb/force-filesystem-sockets\",
0.464 | 19204: \"INFO:__main__:Deleting /etc/my.cnf.d/galera.cnf\",
0.464 | 19204: \"INFO:__main
0.185 | 19205: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: __:Copying /var/lib/kolla/config_files/src/etc/my.cnf.d/galera.cnf to /etc/my.cnf.d/galera.cnf\",
0.185 | 19205: \"INFO:__main__:Copying /var/lib/kolla/config_files/src/etc/sysconfig/clustercheck to /etc/sysconfig/clustercheck\",
0.185 | 19205: \"INFO:__main__:Copying /var/lib/kolla/config_files/src/root/.my.cnf to /root/.my.cnf\",
0.185 | 19205: \"INFO:__main__:Writing out command to execute\",
0.185 | 19205: \"++ cat /run_command\",
0.185 | 19205: \"+ CMD=/usr/sbin/pacemaker_remoted\",
0.185 | 19205: \"+ ARGS=\",
0.185 | 19205: \"+ [[ ! -n '' ]]\",
0.185 | 19205: \"+ . kolla_extend_start\",
0.185 | 19205: \"++ [[ ! -d /var/log/kolla/mariadb ]]\",
0.185 | 19205: \"++ mkdir -p /var/log/kolla/mariadb\",
0.185 | 19205: \"+++ stat -c %a /var/log/kolla/mariadb\",
0.185 | 19205: \"++ [[ 2755 != \\\\7\\\\5\\\\5 ]]\",
0.185 | 19205: \"++ chmod 755 /var/log/kolla/mariadb\",
0.185 | 19205: \"++ [[ -n 0 ]]\",
0.185 | 19205: \"++ mysql_install_db\",
0.185 | 19205: \"2017-11-08 20:44:05 140530918357184 [Warning] option 'open_files_limit': unsigned value 18446744073709551615 adjusted to 4294967295\",
0.185 | 19205: \"2017-11-08 20:44:05 140530918357184 [Note] /usr/libexec/mysqld (mysqld 10.1.20-MariaDB) starting as process 46 ...\",
0.185 | 19205: \"2017-11-08 20:44:10 140217886963904 [Warning] option 'open_files_limit': unsigned value 18446744073709551615 adjusted to 4294967295\",
0.185 | 19205: \"2017-11-08 20:44:10 140217886963904 [Note] /usr/libexec/mysqld (mysqld 10.1.20-MariaDB) starting as process 75 ...\",
0.185 | 19205: \"2017-11-08 20:44:14 140639425128640 [Warning] option 'open_files_limit': unsigned value 18446744073709551615 adjusted to 4294967295\",
0.185 | 19205: \"2017-11-08 20:44:14 140639425128640 [Note] /usr/libexec/mysqld (mysqld 10.1.20-MariaDB) starting as process 105 ...\",
0.185 | 19205: \"++ bootstrap_db\",
0.185 | 19205: \"++ TIMEOUT=60\",
0.185 | 19205: \"++ mysqld_safe --wsrep-new-cluster --skip-networking --wsrep-on=OFF --pid-file=/var/lib/mysql/mariadb.pid\",
0.185 | 19205: \"++ [[ ! -S /var/lib/mysql/mysql.sock ]]\",
0.185 | 19205: \"++ [[ ! -S /var/run/mysqld/mysqld.sock ]]\",
0.185 | 19205: \"++ [[ 60 -gt 0 ]]\",
0.185 | 19205: \"
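The `bootstrap_db` trace above launches `mysqld_safe` and then polls for the server's unix socket with `TIMEOUT=60`. A sketch of that wait loop in Python, under the simplifying assumption that `os.path.exists` stands in for the script's stricter `[[ -S ... ]]` socket test; the paths are placeholders:

```python
import os
import time

def wait_for_socket(path, timeout=60, interval=1.0):
    """Poll for a path until it appears or the timeout elapses,
    mirroring the [[ ! -S ... ]] loop in the kolla bootstrap trace.
    (Uses exists() rather than an is-a-socket check, a simplification.)"""
    while timeout > 0:
        if os.path.exists(path):
            return True
        time.sleep(interval)
        timeout -= interval
    return False

print(wait_for_socket('/tmp', timeout=1))  # existing path: True
```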

0.068 | 19759: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::rabbit_use_ssl in JSON backend",
0.115 | 19760: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production",
0.143 | 19761: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported tripleo::profile::pacemaker::cinder::backup from tripleo/profile/pacemaker/cinder/backup into production",
0.220 | 19762: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::backup::bootstrap_node in JSON backend",
0.135 | 19763: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::backup::step in JSON backend",

0.064 | 19765: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder_backup_short_bootstrap_node_name in JSON backend",
0.110 | 19766: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/volume.pp' in environment production",
0.132 | 19767: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported tripleo::profile::pacemaker::cinder::volume from tripleo/profile/pacemaker/cinder/volume into production",
0.225 | 19768: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::volume::bootstrap_node in JSON backend",
0.128 | 19769: Nov 08 20:44:28 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::volume::step in JSON backend",

0.000 | 25287: Nov 08 20:48:30 centos-7-rax-iad-0000787869 pengine[12541]: notice: Calculated transition 3, saving inputs in /var/lib/pacemaker/pengine/pe-input-3.bz2
0.000 | 25288: Nov 08 20:48:30 centos-7-rax-iad-0000787869 crmd[12542]: notice: Transition 3 (Complete=0, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): Complete
0.000 | 25289: Nov 08 20:48:30 centos-7-rax-iad-0000787869 crmd[12542]: notice: State transition S_TRANSITION_ENGINE -> S_IDLE
0.207 | 25290: Nov 08 20:48:30 centos-7-rax-iad-0000787869 puppet-user[43039]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Property[cinder-backup-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-backup-role]/ensure) created
0.000 | 25291: Nov 08 20:48:33 centos-7-rax-iad-0000787869 crmd[12542]: notice: State transition S_IDLE -> S_POLICY_ENGINE

0.028 | 29760: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: property create: property set --node centos-7-rax-iad-0000787869 cinder-backup-role=true -> ",
0.191 | 29761: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Property[cinder-backup-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-backup-role]/ensure: created",
0.165 | 29762: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Property[cinder-backup-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-backup-role]: The container Pacemaker::Property[cinder-backup-role-node-property] will propagate my refresh event",
0.228 | 29763: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Pacemaker::Property[cinder-backup-role-node-property]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Backup] will propagate my refresh event",
0.261 | 29764: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Backup]: The container Stage[main] will propagate my refresh event",
0.000 | 29765: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-43039-13u66pr returned ",

0.027 | 29771: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: property create: property set --node centos-7-rax-iad-0000787869 cinder-volume-role=true -> ",
0.175 | 29772: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Property[cinder-volume-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-volume-role]/ensure: created",
0.154 | 29773: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Property[cinder-volume-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-volume-role]: The container Pacemaker::Property[cinder-volume-role-node-property] will propagate my refresh event",
0.211 | 29774: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Pacemaker::Property[cinder-volume-role-node-property]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Volume] will propagate my refresh event",
0.243 | 29775: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Volume]: The container Stage[main] will propagate my refresh event",
0.000 | 29776: Nov 08 20:53:19 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Prefetching iptables resources for firewall",

0.000 | 33125: Nov 08 20:54:36 centos-7-rax-iad-0000787869 redis(redis)[67280]: DEBUG: redis_client: '/usr/bin/redis-cli' -s '/var/run/redis/redis.sock' info
0.000 | 33126: Nov 08 20:54:36 centos-7-rax-iad-0000787869 yum[66409]: Installed: python2-pyasn1-modules-0.1.9-7.el7.noarch
0.000 | 33127: Nov 08 20:54:36 centos-7-rax-iad-0000787869 pacemaker_remoted[46554]: notice: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)
0.206 | 33128: Nov 08 20:54:36 centos-7-rax-iad-0000787869 yum[66409]: Installed: python-kmod-0.9-4.el7.x86_64
0.000 | 33129: Nov 08 20:54:36 centos-7-rax-iad-0000787869 pacemaker_remoted[56234]: notice: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)

0.092 | 33182: Nov 08 20:54:42 centos-7-rax-iad-0000787869 groupadd[67502]: new group: name=cinder, GID=165
0.130 | 33183: Nov 08 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: new user: name=cinder, UID=165, GID=165, home=/var/lib/cinder, shell=/sbin/nologin
0.055 | 33184: Nov 08 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to group 'nobody'
0.303 | 33185: Nov 08 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to group 'cinder'
0.040 | 33186: Nov 08 20:54:42 centos-7-rax-iad-0000787869 useradd[67507]: add 'cinder' to shadow group 'nobody'

0.000 | 33206: Nov 08 20:54:42 centos-7-rax-iad-0000787869 systemd[1]: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 33207: Nov 08 20:54:42 centos-7-rax-iad-0000787869 systemd[1]: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 33208: Nov 08 20:54:42 centos-7-rax-iad-0000787869 yum[66409]: Installed: 1:openstack-cinder-12.0.0-0.20171107135501.fb27334.el7.centos.noarch
0.333 | 33209: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]/ensure) created
0.240 | 33210: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) Scheduling refresh of Anchor[cinder::install::end]
0.261 | 33211: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) Scheduling refresh of Anchor[cinder::service::end]
0.266 | 33212: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) Scheduling refresh of Anchor[keystone::service::end]
0.000 | 33213: Nov 08 20:54:43 centos-7-rax-iad-0000787869 systemd-journal[353]: Suppressed 12713 messages from /system.slice/os-collect-config.service
0.331 | 33214: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) The container Class[Cinder] will propagate my refresh event
0.158 | 33215: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::install::end]) Triggered 'refresh' from 1 events
0.179 | 33216: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::install::end]) Scheduling refresh of Anchor[cinder::service::begin]
0.274 | 33217: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::install::end]) The container Class[Cinder::Deps] will propagate my refresh event
0.359 | 33218: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]) Nothing to manage: no ensure and the resource doesn't exist
0.359 | 33219: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 33220: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure) created
0.227 | 33221: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]) Scheduling refresh of Anchor[cinder::config::end]
0.267 | 33222: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]) The container Class[Cinder] will propagate my refresh event
0.000 | 33223: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure) created
0.225 | 33224: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]) Scheduling refresh of Anchor[cinder::config::end]
0.265 | 33225: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]) The container Class[Cinder] will propagate my refresh event
0.000 | 33226: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure) created
0.225 | 33227: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]) Scheduling refresh of Anchor[cinder::config::end]
0.265 | 33228: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]) The container Class[Cinder] will propagate my refresh event
0.366 | 33229: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]) Nothing to manage: no ensure and the resource doesn't exist
0.366 | 33230: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 33231: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/host]/ensure) created
0.276 | 33232: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/host]) Scheduling refresh of Anchor[cinder::config::end]
0.313 | 33233: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/host]) The container Class[Cinder] will propagate my refresh event
0.354 | 33234: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 33235: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure) created
0.184 | 33236: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]) Scheduling refresh of Anchor[cinder::config::end]
0.224 | 33237: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]) The container Class[Cinder] will propagate my refresh event
0.000 | 33238: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure) created
0.198 | 33239: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]) Scheduling refresh of Anchor[cinder::config::end]
0.217 | 33240: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]) The container Class[Cinder::Glance] will propagate my refresh event
0.000 | 33241: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure) created
0.198 | 33242: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]) Scheduling refresh of Anchor[cinder::config::end]
0.217 | 33243: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]) The container Class[Cinder::Glance] will propagate my refresh event
0.382 | 33244: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]) Nothing to manage: no ensure and the resource doesn't exist
0.383 | 33245: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]) Nothing to manage: no ensure and the resource doesn't exist
0.380 | 33246: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]) Nothing to manage: no ensure and the resource doesn't exist
0.379 | 33247: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_request_timeout]) Nothing to manage: no ensure and the resource doesn't exist
0.212 | 33248: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (Class[Cinder::Glance]) The container Stage[main] will propagate my refresh event
0.037 | 33249: Nov 08 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: Executing: '/usr/bin/rpm -q openstack-tacker --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}
0.037 | 33249: '

0.000 | 33288: Nov 08 20:54:48 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Tacker::Server/Package[tacker-server]) Scheduling refresh of Anchor[keystone::service::end]
0.000 | 33289: Nov 08 20:54:48 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Tacker::Server/Package[tacker-server]) The container Class[Tacker::Server] will propagate my refresh event
0.172 | 33290: Nov 08 20:54:48 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::end]) Triggered 'refresh' from 2 events
0.288 | 33291: Nov 08 20:54:48 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::end]) The container Class[Cinder::Deps] will propagate my refresh event
0.000 | 33292: Nov 08 20:54:48 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Tacker::Deps/Anchor[tacker::install::end]) Triggered 'refresh' from 1 events

0.000 | 33326: Nov 08 20:54:50 centos-7-rax-iad-0000787869 puppet-user[64795]: Executing: '/usr/bin/rpm -q MySQL-python --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}
0.000 | 33326: '
0.000 | 33327: Nov 08 20:54:50 centos-7-rax-iad-0000787869 pacemaker_remoted[46554]: notice: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)
0.190 | 33328: Nov 08 20:54:50 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]) Nothing to manage: no ensure and the resource doesn't exist
0.210 | 33329: Nov 08 20:54:50 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 33330: Nov 08 20:54:50 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]/ensure) created

0.179 | 33353: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]) Scheduling refresh of Anchor[cinder::config::end]
0.193 | 33354: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]) The container Oslo::Db[cinder_config] will propagate my refresh event
0.189 | 33355: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]) Nothing to manage: no ensure and the resource doesn't exist
0.264 | 33356: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Db[cinder_config]) The container Class[Cinder::Db] will propagate my refresh event
0.128 | 33357: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (Class[Cinder::Db]) The container Stage[main] will propagate my refresh event
0.000 | 33358: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]/ensure) created
0.198 | 33359: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]) Scheduling refresh of Anchor[cinder::config::end]
0.207 | 33360: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]) The container Oslo::Log[cinder_config] will propagate my refresh event
0.191 | 33361: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]) Nothing to manage: no ensure and the resource doesn't exist
0.191 | 33362: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]) Nothing to manage: no ensure and the resource doesn't exist
0.284 | 33363: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 33364: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]/ensure) created

0.191 | 33379: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]) Nothing to manage: no ensure and the resource doesn't exist
0.191 | 33380: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]) Nothing to manage: no ensure and the resource doesn't exist
0.191 | 33381: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]) Nothing to manage: no ensure and the resource doesn't exist
0.226 | 33382: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Log[cinder_config]) The container Class[Cinder::Logging] will propagate my refresh event
0.117 | 33383: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (Class[Cinder::Logging]) The container Stage[main] will propagate my refresh event

0.182 | 33417: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]) Nothing to manage: no ensure and the resource doesn't exist
0.182 | 33418: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_key_file]) Nothing to manage: no ensure and the resource doesn't exist
0.179 | 33419: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]) Nothing to manage: no ensure and the resource doesn't exist
0.207 | 33420: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Messaging::Rabbit[cinder_config]) The container Class[Cinder] will propagate my refresh event
0.176 | 33421: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]) Nothing to manage: no ensure and the resource doesn't exist
0.176 | 33422: Nov 08 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/server_request_prefix]) Nothing to manage: no ensure and the resource doesn't exist
0.201 | 33423: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]) Nothing to manage: no ensure and the resource doesn't exist
0.176 | 33424: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]) Nothing to manage: no ensure and the resource doesn't exist

0.161 | 33455: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]) Scheduling refresh of Anchor[cinder::config::end]
0.157 | 33456: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]) The container Oslo::Messaging::Default[cinder_config] will propagate my refresh event
0.196 | 33457: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Messaging::Default[cinder_config]) The container Class[Cinder] will propagate my refresh event
0.296 | 33458: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 33459: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]/ensure) created
0.151 | 33460: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]) Scheduling refresh of Anchor[cinder::config::end]
0.156 | 33461: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]) The container Oslo::Concurrency[cinder_config] will propagate my refresh event
0.354 | 33462: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Concurrency[cinder_config]) The container Class[Cinder] will propagate my refresh event
0.000 | 33463: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (Class[Cinder]) The container Stage[main] will propagate my refresh event

0.000 | 33467: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]/ensure) created
0.143 | 33468: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]) Scheduling refresh of Anchor[cinder::config::end]
0.151 | 33469: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]) The container Oslo::Messaging::Notifications[cinder_config] will propagate my refresh event
0.207 | 33470: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]) Nothing to manage: no ensure and the resource doesn't exist
0.177 | 33471: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::config::end]) Triggered 'refresh' from 22 events
0.195 | 33472: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::config::end]) Scheduling refresh of Anchor[cinder::service::begin]
0.288 | 33473: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::config::end]) The container Class[Cinder::Deps] will propagate my refresh event
0.256 | 33474: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Messaging::Notifications[cinder_config]) The container Class[Cinder::Ceilometer] will propagate my refresh event
0.221 | 33475: Nov 08 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (Class[Cinder::Ceilometer]) The container Stage[main] will propagate my refresh event
0.000 | 33476: Nov 08 20:54:53 centos-7-rax-iad-0000787869 puppet-user[64795]: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-q2s22q returned

0.000 | 33665: Nov 08 20:54:55 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Tacker::Deps/Anchor[tacker::config::end]) The container Class[Tacker::Deps] will propagate my refresh event
0.000 | 33666: Nov 08 20:54:55 centos-7-rax-iad-0000787869 puppet-user[64795]: (Oslo::Db[tacker_config]) The container Class[Tacker::Db] will propagate my refresh event
0.158 | 33667: Nov 08 20:54:55 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Triggered 'refresh' from 2 events
0.274 | 33668: Nov 08 20:54:55 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) The container Class[Cinder::Deps] will propagate my refresh event
0.122 | 33669: Nov 08 20:54:55 centos-7-rax-iad-0000787869 puppet-user[64795]: (Class[Cinder::Deps]) The container Stage[main] will propagate my refresh event

0.025 | 37661: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up ntp::servers in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::service_enable in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::service_ensure in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::service_manage in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::service_name in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::service_provider in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::stepout in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tinker in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tos in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tos_minclock in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tos_minsane in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tos_floor in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tos_ceiling in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::tos_cohort in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::udlc in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::udlc_stratum in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::ntpsigndsocket in JSON backend",
0.025 | 37661: "Debug: hiera(): Looking up ntp::authprov in JSON backend",
0.025 | 37661: "Debug: importing '/etc/puppet/modules/ntp/manifests/install.pp' in environment production",
0.025 | 37661: "Debug: Automatically imported ntp::install from ntp/install into production",
0.025 | 37661: "Debug: importing '/etc/puppet/modules/ntp/manifests/config.pp' in environment production",
0.025 | 37661: "Debug: Automatically imported ntp::config from ntp/config into production",
0.025 | 37661: "Debug: Scope(Class[Ntp::Config]): Retrieving template ntp/ntp.conf.erb",
0.025 | 37661: "Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.erb]: Bound template variables for /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds",
0.025 | 37661: "Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.e
0.025 | 37662: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: rb]: Interpolated template /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds",
0.025 | 37662: "Debug: importing '/etc/puppet/modules/ntp/manifests/service.pp' in environment production",
0.025 | 37662: "Debug: Automatically imported ntp::service from ntp/service into production",
0.025 | 37662: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/snmp.pp' in environment production",
0.025 | 37662: "Debug: Automatically imported tripleo::profile::base::snmp from tripleo/profile/base/snmp into production",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_config in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_password in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_user in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::snmp::step in JSON backend",
0.025 | 37662: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/sshd.pp' in environment production",
0.025 | 37662: "Debug: Automatically imported tripleo::profile::base::sshd from tripleo/profile/base/sshd into production",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::sshd::bannertext in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::sshd::motd in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::sshd::options in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up tripleo::profile::base::sshd::port in JSON backend",
0.025 | 37662: "Debug: hiera(): Looking up ssh:server::options in JSON backend",
0.025 | 37662: "Debug: importing '/etc/puppet/modules/ssh/manifests/init.pp' in environment production",
0.025 | 37662: "Debug: importing '/etc/puppet/modules/ssh/manifests/server.pp' in environment production",
0.025 | 37662: "Debug: Automatically imported ssh::server from ssh/server into production",
0.025 | 37662: "Debug: importing '/etc/puppet/modules/ssh/manifests/params.pp' in environment production",
0.025 | 37662: "Debug: Automa
0.021 | 37663: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: tically imported ssh::params from ssh/params into production",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::ensure in JSON backend",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::validate_sshd_file in JSON backend",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::use_augeas in JSON backend",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::options_absent in JSON backend",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::match_block in JSON backend",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::use_issue_net in JSON backend",
0.021 | 37663: "Debug: hiera(): Looking up ssh::server::options in JSON backend",
0.021 | 37663: "Debug: importing '/etc/puppet/modules/ssh/manifests/server/install.pp' in environment production",
0.021 | 37663: "Debug: Automatically imported ssh::server::install from ssh/server/install into production",
0.021 | 37663: "Debug: importing '/etc/puppet/modules/ssh/manifests/server/config.pp' in environment production",
0.021 | 37663: "Debug: Automatically imported ssh::server::config from ssh/server/config into production",
0.021 | 37663: "Debug: importing '/etc/puppet/modules/concat/manifests/init.pp' in environment production",
0.021 | 37663: "Debug: importing '/etc/puppet/modules/stdlib/manifests/init.pp' in environment production",
0.021 | 37663: "Debug: Automatically imported concat from concat into production",
0.021 | 37663: "Debug: Scope(Class[Ssh::Server::Config]): Retrieving template ssh/sshd_config.erb",
0.021 | 37663: "Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Bound template variables for /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds",
0.021 | 37663: "Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Interpolated template /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds",
0.021 | 37663: "Debug: importing '/etc/puppet/modules/concat/manifests/fragment.pp' in environment production",
0.021 | 37663: "Debug: Automatically imported concat::fragment from concat/fragment into production",
0.021 | 37663: "Debug: impor
0.208 | 37664: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: ting '/etc/puppet/modules/ssh/manifests/server/service.pp' in environment production",
0.208 | 37664: "Debug: Automatically imported ssh::server::service from ssh/server/service into production",
0.208 | 37664: "Debug: hiera(): Looking up ssh::server::service::ensure in JSON backend",
0.208 | 37664: "Debug: hiera(): Looking up ssh::server::service::enable in JSON backend",
0.208 | 37664: "Debug: importing '/etc/puppet/modules/timezone/manifests/init.pp' in environment production",
0.208 | 37664: "Debug: Automatically imported timezone from timezone into production",
0.208 | 37664: "Debug: importing '/etc/puppet/modules/timezone/manifests/params.pp' in environment production",
0.208 | 37664: "Debug: Automatically imported timezone::params from timezone/params into production",
0.208 | 37664: "Debug: hiera(): Looking up timezone::ensure in JSON backend",
0.208 | 37664: "Debug: hiera(): Looking up timezone::timezone in JSON backend",
0.208 | 37664: "Debug: hiera(): Looking up timezone::hwutc in JSON backend",
0.208 | 37664: "Debug: hiera(): Looking up timezone::autoupgrade in JSON backend",
0.208 | 37664: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup/ceph.pp' in environment production",
0.208 | 37664: "Debug: Automatically imported tripleo::profile::base::cinder::backup::ceph from tripleo/profile/base/cinder/backup/ceph into production",
0.208 | 37664: "Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::ceph::step in JSON backend",
0.208 | 37664: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup.pp' in environment production",
0.208 | 37664: "Debug: Automatically imported tripleo::profile::base::cinder::backup from tripleo/profile/base/cinder/backup into production",
0.208 | 37664: "Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::step in JSON backend",
0.208 | 37664: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder.pp' in environment production",
0.208 | 37664: "Debug: Automatically imported tripleo::profile::base::cinder from tripleo/prof
0.061 | 37665: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: ile/base/cinder into production\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::bootstrap_node in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::cinder_enable_db_purge in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::step in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_proto in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_hosts in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_password in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_port in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_username in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_proto in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_hosts in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_password in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_port in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_username in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_use_ssl in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up bootstrap_nodeid in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up messaging_rpc_service_name in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up rabbitmq_node_names in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up cinder::rabbit_password in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up cinder::rabbit_port in JSON backend\",
0.061 | 37665: \"Debug: hiera(): Looking up cinder::rabbit_userid in JSON backend\",
0.061 | 37665: \

0.002 | 37838: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: and 'set' with params [\\\"/files/etc/sysconfig/docker/OPTIONS\\\", \\\"\\\\\\\"--log-driver=journald --signature-verification=false --iptables=false\\\\\\\"\\\"]\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-options](provider=augeas): Skipping because no files were changed\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-options](provider=augeas): Closed the augeas connection\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Augeas version 1.4.0 is installed\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Will attempt to save and only run if files changed\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): sending command 'set' with params [\\\"/files/etc/sysconfig/docker/INSECURE_REGISTRY\\\", \\\"\\\\\\\"--insecure-registry 192.168.24.1:8787\\\\\\\"\\\"]\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Skipping because no files were changed\",
0.002 | 37838: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Closed the augeas connection\",
0.002 | 37838: \"Debug: Augeas[docker-daemon.json](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.002 | 37838: \"Debug: Augeas[docker-daemon.json](provider=augeas): Augeas version 1.4.0 is installed\",
0.002 | 37838: \"Debug: Augeas[docker-daemon.json](provider=augeas): Will attempt to save and only run if files changed\",
0.002 | 37838: \"Debug: Augeas[docker-daemon.json](provider=augeas): sending command 'rm' with params [\\\"/files/etc/docker/daemon.json/dict/entry[. = \\\\\\\"registry-mirrors\\\\\\\"]\\\"]\",
0.002 | 37838: \"Debug: Augeas[docker-daemon.json](provider=augeas): sending command 'set' with params [\\\"/files/etc/docker/daemon.json/dict/entry[. = \\\\\\\"debug\\\\\\\"]\\\", \\\"debug\\\"]\",
0.002 | 37838: \"Debug: Augeas[docker-daemon.json](provider=augeas): sending command 'set' with params [\\\"/files/etc/docker/daemon.json/dict/entry[. = \\\
0.007 | 37839: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: \\\\"debug\\\\\\\"]/const\\\", \\\"false\\\"]\",
0.007 | 37839: \"Debug: Augeas[docker-daemon.json](provider=augeas): Skipping because no files were changed\",
0.007 | 37839: \"Debug: Augeas[docker-daemon.json](provider=augeas): Closed the augeas connection\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Augeas version 1.4.0 is installed\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Will attempt to save and only run if files changed\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): sending command 'set' with params [\\\"/files/etc/sysconfig/docker-storage/DOCKER_STORAGE_OPTIONS\\\", \\\"\\\\\\\" -s overlay2\\\\\\\"\\\"]\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Skipping because no files were changed\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Closed the augeas connection\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Augeas version 1.4.0 is installed\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Will attempt to save and only run if files changed\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): sending command 'rm' with params [\\\"/files/etc/sysconfig/docker-network/DOCKER_NETWORK_OPTIONS\\\"]\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Skipping because no files were changed\",
0.007 | 37839: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Closed the augeas connection\",
0.007 | 37839: \"Debug: Executing: '/usr/bin/systemctl is-active docker'\",
0.007 | 37839: \"Debug: Executing: '/usr/bin/systemctl is-enabled docker'\",
0.007 | 37839: \"Debug: Exec[directory-create-etc-my.cnf.d](provider=posix): Executing check 'test -d /etc/my.cnf.d'\",
0.007 | 37839: \"Debu
0.008 | 37840: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: g: Executing: 'test -d /etc/my.cnf.d'\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Augeas version 1.4.0 is installed\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Will attempt to save and only run if files changed\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'set' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/bind-address\\\", \\\"192.168.24.15\\\"]\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl\\\"]\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl-ca\\\"]\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Skipping because no files were changed\",
0.008 | 37840: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Closed the augeas connection\",
0.008 | 37840: \"Debug: Executing: '/usr/bin/systemctl is-active pcsd'\",
0.008 | 37840: \"Debug: Executing: '/usr/bin/systemctl is-enabled pcsd'\",
0.008 | 37840: \"Debug: Exec[Create Cluster tripleo_cluster](provider=posix): Executing check '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.008 | 37840: \"Debug: Executing: '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.008 | 37840: \"Debug: Exec[Start Cluster tripleo_cluster](provider=posix): Executing check '/sbin/pcs status >/dev/null 2>&1'\",
0.008 | 37840: \"Debug: Executing: '/sbin/pcs status >/dev/null 2>&1'\",
0.008 | 37840: \"Debug: Executing: '/usr/bin/systemctl is-enabled corosync'\",
0.008 | 37840: \"Debug: Executing: '/usr/bin/systemctl is-enabled pacemaker'\",
0.008 | 37840: \"Debug: Exec[wait-for-settle](provider=posix): Executing check '/sbin/pcs status | grep -q 'partition with quorum' > /dev/null 2>&1'\",
0.008 | 37840: \"Debug: Executing: '/sbin/pcs status | grep -q 'partition
0.273 | 37841: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: with quorum' > /dev/null 2>&1'\",
0.273 | 37841: \"Debug: defaults exists resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.273 | 37841: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1rl6m0z returned \",
0.273 | 37841: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1rl6m0z resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/systemctl is-active chronyd'\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/systemctl is-enabled chronyd'\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/systemctl is-active ntpd'\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/systemctl is-enabled ntpd'\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/rpm -q openstack-cinder --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.273 | 37841: '\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/rpm -q openstack-cinder --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.273 | 37841: --whatprovides'\",
0.273 | 37841: \"Debug: Package[cinder](provider=yum): Ensuring => present\",
0.273 | 37841: \"Debug: Executing: '/usr/bin/yum -d 0 -e 0 -y install openstack-cinder'\",
0.273 | 37841: \"Notice: /Stage[main]/Cinder/Package[cinder]/ensure: created\",
0.273 | 37841: \"Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[cinder::install::end]\",
0.273 | 37841: \"Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[cinder::service::end]\",
0.273 | 37841: \"Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[keystone::service::end]\",
0.273 | 37841: \"Debug: /Stage[main]/Cinder/Package[cinder]: The container Class[Cinder] will propagate my refresh event\",
0.273 | 37841: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::install::end]: Triggered 'refresh' from 1 events\",
0.273 | 37841: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::install::end]: Scheduling refresh of Anchor[cinder::service::begin]\",
0.273 | 37841: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::install:
0.254 | 37842: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: :end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.254 | 37842: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure: created\",
0.254 | 37842: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Anchor[cinder::config::end]\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: The container Class[Cinder] will propagate my refresh event\",
0.254 | 37842: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure: created\",
0.254 | 37842: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Anchor[cinder::config::end]\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: The container Class[Cinder] will propagate my refresh event\",
0.254 | 37842: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure: created\",
0.254 | 37842: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Anchor[cinder::config::end]\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: The container Class[Cinder] will propagate my refresh event\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]: Nothing to manage: no ensure and the resource doesn't exist\",
0.254 | 37842: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.254 | 37842: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/host]/ensure: created\",
0.254 | 37842: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/host]: Scheduling refresh of Anchor[cinder::config::end]\
0.252 | 37843: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: ",
0.252 | 37843: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/host]: The container Class[Cinder] will propagate my refresh event\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.252 | 37843: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure: created\",
0.252 | 37843: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Anchor[cinder::config::end]\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: The container Class[Cinder] will propagate my refresh event\",
0.252 | 37843: \"Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure: created\",
0.252 | 37843: \"Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Anchor[cinder::config::end]\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: The container Class[Cinder::Glance] will propagate my refresh event\",
0.252 | 37843: \"Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure: created\",
0.252 | 37843: \"Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Anchor[cinder::config::end]\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: The container Class[Cinder::Glance] will propagate my refresh event\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.252 | 37843: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_request_timeout]: Nothing to manage: no en
0.193 | 37844: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: sure and the resource doesn't exist\",
0.193 | 37844: \"Debug: Class[Cinder::Glance]: The container Stage[main] will propagate my refresh event\",
0.193 | 37844: \"Debug: Executing: '/usr/bin/rpm -q openstack-tacker --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.193 | 37844: '\",
0.193 | 37844: \"Debug: Executing: '/usr/bin/rpm -q openstack-tacker --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.193 | 37844: --whatprovides'\",
0.193 | 37844: \"Debug: Package[tacker-server](provider=yum): Ensuring => present\",
0.193 | 37844: \"Debug: Executing: '/usr/bin/yum -d 0 -e 0 -y install openstack-tacker'\",
0.193 | 37844: \"Notice: /Stage[main]/Tacker::Server/Package[tacker-server]/ensure: created\",
0.193 | 37844: \"Info: /Stage[main]/Tacker::Server/Package[tacker-server]: Scheduling refresh of Anchor[cinder::service::end]\",
0.193 | 37844: \"Info: /Stage[main]/Tacker::Server/Package[tacker-server]: Scheduling refresh of Anchor[tacker::install::end]\",
0.193 | 37844: \"Info: /Stage[main]/Tacker::Server/Package[tacker-server]: Scheduling refresh of Anchor[keystone::service::end]\",
0.193 | 37844: \"Debug: /Stage[main]/Tacker::Server/Package[tacker-server]: The container Class[Tacker::Server] will propagate my refresh event\",
0.193 | 37844: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: Triggered 'refresh' from 2 events\",
0.193 | 37844: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.193 | 37844: \"Notice: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: Triggered 'refresh' from 1 events\",
0.193 | 37844: \"Info: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: Scheduling refresh of Anchor[tacker::service::begin]\",
0.193 | 37844: \"Info: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: Scheduling refresh of Exec[tacker-db-sync]\",
0.193 | 37844: \"Debug: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: The container Class[Tacker::Deps] will propagate my refresh event\",
0.193 | 37844: \"Notice: /Stag
0.032 | 37845: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: e[main]/Tacker::Server/Tacker_config[DEFAULT/bind_host]/ensure: created\",
0.032 | 37845: \"Info: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_host]: Scheduling refresh of Anchor[tacker::config::end]\",
0.032 | 37845: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_host]: The container Class[Tacker::Server] will propagate my refresh event\",
0.032 | 37845: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_port]: Nothing to manage: no ensure and the resource doesn't exist\",
0.032 | 37845: \"Debug: Executing: '/usr/bin/systemctl is-active firewalld'\",
0.032 | 37845: \"Debug: Executing: '/usr/bin/systemctl is-enabled firewalld'\",
0.032 | 37845: \"Debug: Executing: '/usr/bin/systemctl is-active iptables'\",
0.032 | 37845: \"Debug: Executing: '/usr/bin/systemctl is-enabled iptables'\",
0.032 | 37845: \"Debug: Executing: '/usr/bin/systemctl is-active ip6tables'\",
0.032 | 37845: \"Debug: Executing: '/usr/bin/systemctl is-enabled ip6tables'\",
0.032 | 37845: \"Debug: Exec[modprobe nf_conntrack](provider=posix): Executing check 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.032 | 37845: \"Debug: Executing: 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.032 | 37845: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing check 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.032 | 37845: \"Debug: Executing: 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.032 | 37845: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing 'modprobe nf_conntrack_proto_sctp'\",
0.032 | 37845: \"Debug: Executing: 'modprobe nf_conntrack_proto_sctp'\",
0.032 | 37845: \"Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/returns: executed successfully\",
0.032 | 37845: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]: The container Kmod::Load[nf_conntrack_proto_sctp] will propagate my refresh event\",
0.032 | 37845: \"Debug: Kmod::Load[nf_conntrack_proto_sctp]: The container Class[Tr
0.351 | 37846: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: ipleo::Profile::Base::Kernel] will propagate my refresh event\",
0.351 | 37846: \"Debug: Prefetching parsed resources for sysctl\",
0.351 | 37846: \"Debug: Prefetching sysctl_runtime resources for sysctl_runtime\",
0.351 | 37846: \"Debug: Executing: '/usr/sbin/sysctl -a'\",
0.351 | 37846: \"Debug: Class[Tripleo::Profile::Base::Kernel]: The container Stage[main] will propagate my refresh event\",
0.351 | 37846: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1w9knj2 returned \",
0.351 | 37846: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1w9knj2 property show | grep stonith-enabled | grep false > /dev/null 2>&1\",
0.351 | 37846: \"Debug: property exists: property show | grep stonith-enabled | grep false > /dev/null 2>&1 -> \",
0.351 | 37846: \"Debug: Executing: '/usr/bin/systemctl is-active sshd'\",
0.351 | 37846: \"Debug: Executing: '/usr/bin/systemctl is-enabled sshd'\",
0.351 | 37846: \"Debug: Executing: '/usr/bin/rpm -q MySQL-python --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.351 | 37846: '\",
0.351 | 37846: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.351 | 37846: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.351 | 37846: \"Notice: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]/ensure: created\",
0.351 | 37846: \"Info: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]: Scheduling refresh of Anchor[cinder::config::end]\",
0.351 | 37846: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]: The container Oslo::Db[cinder_config] will propagate my refresh event\",
0.351 | 37846: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exis
0.244 | 37847: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: t\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Notice: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]/ensure: created\",
0.244 | 37847: \"Info: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]: Scheduling refresh of Anchor[cinder::config::end]\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]: The container Oslo::Db[cinder_config] will propagate my refresh event\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.244 | 37847: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_co
0.225 | 37848: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: nfig]/Cinder_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.225 | 37848: \"Notice: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]/ensure: created\",
0.225 | 37848: \"Info: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]: Scheduling refresh of Anchor[cinder::config::end]\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]: The container Oslo::Db[cinder_config] will propagate my refresh event\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.225 | 37848: \"Debug: Oslo::Db[cinder_config]: The container Class[Cinder::Db] will propagate my refresh event\",
0.225 | 37848: \"Debug: Class[Cinder::Db]: The container Stage[main] will propagate my refresh event\",
0.225 | 37848: \"Notice: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]/ensure: created\",
0.225 | 37848: \"Info: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]: Scheduling refresh of Anchor[cinder::config::end]\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]: The container Oslo::Log[cinder_config] will propagate my refresh event\",
0.225 | 37848: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]: N
0.241 | 37849: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: othing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Notice: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]/ensure: created\",
0.241 | 37849: \"Info: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[cinder::config::end]\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]: The container Oslo::Log[cinder_config] will propagate my refresh event\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.241 | 37849: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_default_format_stri
0.246 | 37850: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: ng]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: Oslo::Log[cinder_config]: The container Class[Cinder::Logging] will propagate my refresh event\",
0.246 | 37850: \"Debug: Class[Cinder::Logging]: The container Stage[main] will propagate my refresh event\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.246 | 37850: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_r
0.203 | 37851: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: abbit/heartbeat_rate]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created\",
0.203 | 37851: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Anchor[cinder::config::end]\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_failover_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_missing_consumer_retry_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_interval_max]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_login_method]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 37851: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_password]/ensure: created\",
0.203 | 37851:
0.135 | 37852: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Anchor[cinder::config::end]\",
0.135 | 37852: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_password]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.135 | 37852: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_backoff]: Nothing to manage: no ensure and the resource doesn't exist\",
0.135 | 37852: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.135 | 37852: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_transient_queues_ttl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.135 | 37852: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl]/ensure: created\",
0.135 | 37852: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl]: Scheduling refresh of Anchor[cinder::config::end]\",
0.135 | 37852: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.135 | 37852: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created\",
0.135 | 37852: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Anchor[cinder::config::end]\",
0.135 | 37852: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: The container Oslo::Messaging::Rabbit[cinder_config] will pro
0.197 | 37853: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: pagate my refresh event\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_port]/ensure: created\",
0.197 | 37853: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Anchor[cinder::config::end]\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_port]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_qos_prefetch_count]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 37853: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabb
0.222 | 37854: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: it/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: Oslo::Messaging::Rabbit[cinder_config]: The container Class[Cinder] will propagate my refresh event\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/server_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/rpc_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/notify_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/multicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/unicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.222 | 37854: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messagin
0.190 | 37855: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: g_amqp/anycast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notification_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_rpc_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/pre_settled]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/container_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 37855: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_co
0.198 | 37856: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: nfig]/Cinder_config[oslo_messaging_amqp/ssl_key_password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 37856: \
0.228 | 37857: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Notice: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]/ensure: created\",
0.228 | 37857: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]: Scheduling refresh of Anchor[cinder::config::end]\",
0.228 | 37857: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]: The container Oslo::Messaging::Default[cinder_config] will propagate my refresh event\",
0.228 | 37857: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]/ensure: created\",
0.228 | 37857: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Anchor[cinder::config::end]\",
0.228 | 37857: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]: The container Oslo::Messaging::Default[cinder_config] will propagate my refresh event\",
0.228 | 37857: \"Debug: Oslo::Messaging::Default[cinder_config]: The container Class[Cinder] will propagate my refresh event\",
0.228 | 37857: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]: Nothing to manage: no ensure and the resource doesn't exist\",
0.228 | 37857: \"Notice: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]/ensure: created\",
0.228 | 37857: \"Info: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Anchor[cinder::config::end]\",
0.228 | 37857: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]: The container Oslo::Concurrency[cinder_config] will propagate my refresh event\",
0.228 | 37857: \"Debug: Oslo::Concurrency[cinder_config]: The container Class[Cinder] will propagate my refresh event\",
0.228 | 37857: \"Debug: Class[Cinder]: The container Stage[main] will propagate my refresh event\",
0.228 | 37857:
0.163 | 37858: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Notice: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]/ensure: created\",
0.163 | 37858: \"Info: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]: Scheduling refresh of Anchor[cinder::config::end]\",
0.163 | 37858: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]: The container Oslo::Messaging::Notifications[cinder_config] will propagate my refresh event\",
0.163 | 37858: \"Notice: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]/ensure: created\",
0.163 | 37858: \"Info: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]: Scheduling refresh of Anchor[cinder::config::end]\",
0.163 | 37858: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]: The container Oslo::Messaging::Notifications[cinder_config] will propagate my refresh event\",
0.163 | 37858: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]: Nothing to manage: no ensure and the resource doesn't exist\",
0.163 | 37858: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Triggered 'refresh' from 22 events\",
0.163 | 37858: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Scheduling refresh of Anchor[cinder::service::begin]\",
0.163 | 37858: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.163 | 37858: \"Debug: Oslo::Messaging::Notifications[cinder_config]: The container Class[Cinder::Ceilometer] will propagate my refresh event\",
0.163 | 37858: \"Debug: Class[Cinder::Ceilometer]: The container Stage[m
0.211 | 37859: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: ain] will propagate my refresh event\",
0.211 | 37859: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-q2s22q returned \",
0.211 | 37859: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-q2s22q property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.211 | 37859: \"Debug: property exists: property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.211 | 37859: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1hfhb4s returned \",
0.211 | 37859: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1hfhb4s property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.211 | 37859: \"Debug: property exists: property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.211 | 37859: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 37859: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 37859: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 37859: \"Notice: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]/ensure: created\",
0.211 | 37859: \"Info: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]: Scheduling refresh of Anchor[tacker::config::end]\",
0.211 | 37859: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]: The container Oslo::Log[tacker_config] will propagate my refresh event\",
0.211 | 37859: \"Notice: /Stage[main]/Tacker::Logging/Oslo::
0.008 | 37860: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: Log[tacker_config]/Tacker_config[DEFAULT/log_dir]/ensure: created\",
0.008 | 37860: \"Info: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[tacker::config::end]\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_dir]: The container Oslo::Log[tacker_config] will propagate my refresh event\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 37860: \"Debug: /S

0.113 | 38510: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::amqp_username in JSON backend",
0.113 | 38511: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::amqp_password in JSON backend",
0.075 | 38512: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::package_ensure in JSON backend",
0.335 | 38513: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::api_paste_config in JSON backend",
0.217 | 38514: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::use_syslog in JSON backend",
0.217 | 38515: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::use_stderr in JSON backend",
0.376 | 38516: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::log_facility in JSON backend",
0.227 | 38517: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::log_dir in JSON backend",
0.207 | 38518: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::debug in JSON backend",
0.342 | 38519: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::storage_availability_zone in JSON backend",
0.342 | 38520: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::default_availability_zone in JSON backend",
0.348 | 38521: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::allow_availability_zone_fallback in JSON backend",
0.313 | 38522: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::enable_v3_api in JSON backend",
0.325 | 38523: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::lock_path in JSON backend",
0.347 | 38524: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::image_conversion_dir in JSON backend",
0.391 | 38525: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::purge_config in JSON backend",
0.324 | 38526: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backend_host in JSON backend",
0.000 | 38527: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::enable_v1_api in JSON backend",
0.000 | 38528: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::enable_v2_api in JSON backend",
0.386 | 38529: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::use_ssl in JSON backend",
0.378 | 38530: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::ca_file in JSON backend",
0.400 | 38531: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cert_file in JSON backend",
0.379 | 38532: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::key_file in JSON backend",
0.306 | 38533: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::rabbit_host in JSON backend",
0.306 | 38534: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::rabbit_hosts in JSON backend",
0.318 | 38535: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::rabbit_virtual_host in JSON backend",
0.100 | 38536: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::host in JSON backend",

0.000 | 38553: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::db::database_retry_interval in JSON backend",
0.000 | 38554: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::db::database_max_overflow in JSON backend",
0.100 | 38555: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/logging.pp' in environment production",
0.226 | 38556: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::logging from cinder/logging into production",
0.112 | 38557: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::logging::use_syslog in JSON backend",
0.112 | 38558: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::logging::use_stderr in JSON backend",
0.306 | 38559: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::logging::log_facility in JSON backend",
0.117 | 38560: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::logging::log_dir in JSON backend",

0.000 | 38578: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported oslo::messaging::amqp from oslo/messaging/amqp into production",
0.000 | 38579: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/oslo/manifests/messaging/default.pp' in environment production",
0.000 | 38580: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported oslo::messaging::default from oslo/messaging/default into production",
0.205 | 38581: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/oslo/manifests/concurrency.pp' in environment production",
0.311 | 38582: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported oslo::concurrency from oslo/concurrency into production",
0.176 | 38583: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/ceilometer.pp' in environment production",

0.000 | 38588: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported oslo::messaging::notifications from oslo/messaging/notifications into production",
0.146 | 38589: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/config.pp' in environment production",
0.133 | 38590: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::config from cinder/config into production",
0.246 | 38591: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::config::cinder_config in JSON backend",
0.391 | 38592: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::config::api_paste_ini_config in JSON backend",
0.186 | 38593: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/glance.pp' in environment production",
0.192 | 38594: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::glance from cinder/glance into production",
0.404 | 38595: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_servers in JSON backend",
0.404 | 38596: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_version in JSON backend",
0.407 | 38597: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_num_retries in JSON backend",
0.407 | 38598: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_insecure in JSON backend",
0.410 | 38599: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_ssl_compression in JSON backend",
0.410 | 38600: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_request_timeout in JSON backend",
0.000 | 38601: Nov 08 21:04:52 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production",

0.048 | 42953: Nov 08 21:06:09 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/rpm -q ceph-common --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}
0.048 | 42953: '
0.045 | 42954: Nov 08 21:06:09 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/rpm -q ceph-common --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}
0.045 | 42954: --whatprovides'
0.169 | 42955: Nov 08 21:06:09 centos-7-rax-iad-0000787869 puppet-user[106446]: (Package[ceph-common](provider=yum)) Ensuring => present
0.231 | 42956: Nov 08 21:06:09 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/yum -d 0 -e 0 -y install ceph-common'
0.000 | 42957: Nov 08 21:06:10 centos-7-rax-iad-0000787869 su[108684]: pam_unix(su:session): session closed for user rabbitmq

0.000 | 42970: Nov 08 21:06:13 centos-7-rax-iad-0000787869 yum[108905]: Installed: 1:python-cephfs-10.2.7-0.el7.x86_64
0.000 | 42971: Nov 08 21:06:13 centos-7-rax-iad-0000787869 yum[108905]: Installed: 1:python-rbd-10.2.7-0.el7.x86_64
0.000 | 42972: Nov 08 21:06:14 centos-7-rax-iad-0000787869 yum[108905]: Installed: 1:libradosstriper1-10.2.7-0.el7.x86_64
0.250 | 42973: Nov 08 21:06:14 centos-7-rax-iad-0000787869 yum[108905]: Installed: boost-program-options-1.53.0-27.el7.x86_64
0.000 | 42974: Nov 08 21:06:14 centos-7-rax-iad-0000787869 yum[108905]: Installed: fcgi-2.4.0-25.el7.x86_64

0.000 | 42993: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 42994: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 42995: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_request_timeout]) Nothing to manage: no ensure and the resource doesn't exist
0.425 | 42996: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_manager]) Nothing to manage: no ensure and the resource doesn't exist
0.424 | 42997: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_api_class]) Nothing to manage: no ensure and the resource doesn't exist
0.424 | 42998: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_name_template]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 42999: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_driver]/ensure) created

0.000 | 43017: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]/ensure) created
0.167 | 43018: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]) Scheduling refresh of Anchor[cinder::config::end]
0.155 | 43019: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]) The container Class[Cinder::Backup::Ceph] will propagate my refresh event
0.326 | 43020: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (Class[Cinder::Backup::Ceph]) The container Stage[main] will propagate my refresh event
0.370 | 43021: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]) Nothing to manage: no ensure and the resource doesn't exist
0.370 | 43022: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_size]) Nothing to manage: no ensure and the resource doesn't exist
0.370 | 43023: Nov 08 21:06:19 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_ionice]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 43024: Nov 08 21:06:20 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]/ensure) created
0.176 | 43025: Nov 08 21:06:20 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]) Scheduling refresh of Anchor[cinder::config::end]
0.186 | 43026: Nov 08 21:06:20 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]) The container Class[Cinder::Backends] will propagate my refresh event
0.269 | 43027: Nov 08 21:06:20 centos-7-rax-iad-0000787869 puppet-user[106446]: (Class[Cinder::Backends]) The container Stage[main] will propagate my refresh event
0.000 | 43028: Nov 08 21:06:20 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]) Nothing to manage: no ensure and the resource doesn't exist

0.000 | 43145: Nov 08 21:06:22 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]/ensure) created
0.075 | 43146: Nov 08 21:06:22 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]) Scheduling refresh of Anchor[cinder::service::begin]
0.082 | 43147: Nov 08 21:06:22 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]) The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event
0.309 | 43148: Nov 08 21:06:22 centos-7-rax-iad-0000787869 puppet-user[106446]: (Cinder::Backend::Rbd[tripleo_ceph]) The container Class[Tripleo::Profile::Base::Cinder::Volume::Rbd] will propagate my refresh event
0.191 | 43149: Nov 08 21:06:22 centos-7-rax-iad-0000787869 puppet-user[106446]: (Class[Tripleo::Profile::Base::Cinder::Volume::Rbd]) The container Stage[main] will propagate my refresh event

0.000 | 43282: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retries]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 43283: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_tpool]) Nothing to manage: no ensure and the resource doesn't exist
0.000 | 43284: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Triggered 'refresh' from 2 events
0.296 | 43285: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Scheduling refresh of Service[cinder-backup]
0.197 | 43286: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Scheduling refresh of Service[cinder-volume]
0.000 | 43287: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) The container Class[Cinder::Deps] will propagate my refresh event
0.190 | 43288: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-backup'
0.180 | 43289: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/systemctl is-active openstack-cinder-backup'
0.493 | 43290: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Service[cinder-backup]) Skipping restart; service is not running
0.363 | 43291: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Service[cinder-backup]) Triggered 'refresh' from 1 events
0.352 | 43292: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Service[cinder-backup]) Scheduling refresh of Anchor[cinder::service::end]
0.414 | 43293: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Service[cinder-backup]) The container Class[Cinder::Backup] will propagate my refresh event
0.252 | 43294: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (Class[Cinder::Backup]) The container Stage[main] will propagate my refresh event
0.215 | 43295: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-volume'
0.203 | 43296: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/systemctl is-active openstack-cinder-volume'
0.332 | 43297: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Service[cinder-volume]) Skipping restart; service is not running
0.359 | 43298: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Service[cinder-volume]) Triggered 'refresh' from 1 events
0.195 | 43299: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Service[cinder-volume]) Scheduling refresh of Anchor[cinder::service::end]
0.342 | 43300: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Service[cinder-volume]) The container Class[Cinder::Volume] will propagate my refresh event
0.000 | 43301: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::end]) Triggered 'refresh' from 2 events
0.000 | 43302: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::end]) The container Class[Cinder::Deps] will propagate my refresh event
0.000 | 43303: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (Class[Cinder::Deps]) The container Stage[main] will propagate my refresh event
0.228 | 43304: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (Class[Cinder::Volume]) The container Stage[main] will propagate my refresh event
0.000 | 43305: Nov 08 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: Executing: '/usr/bin/systemctl is-active openstack-tacker-server'

0.013 | 46959: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: der/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_password]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]/notify: subscribes to Anchor[cinder::config::end]\",
0.013 | 46959: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]
0.033 | 46960: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: /notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]/notify: subscribes to Anchor[cinder::config::end]\",
0.033 | 46960: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_ba
0.007 | 46961: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ckend_name]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_user]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_max_clone_depth]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_flatten_volume_from_snapshot]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_secret_uuid]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connect_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.007 | 46961: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend:
0.230 | 46962: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: :Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_retries]/notify: subscribes to Anchor[cinder::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_store_chunk_size]/notify: subscribes to Anchor[cinder::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]/notify: subscribes to Anchor[cinder::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Package[ceph-common]/before: subscribes to Anchor[cinder::install::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]/notify: subscribes to Anchor[cinder::service::begin]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]/notify: subscribes to Anchor[tacker::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]/notify: subscribes to Anchor[tacker::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]/notify: subscribes to Anchor[tacker::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]/notify: subscribes to Anchor[tacker::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_dir]/notify: subscribes to Anchor[tacker::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watch_log_file]/notify: subscribes to Anchor[tacker::config::end]\",
0.230 | 46962: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]/notify: subscribes to Anchor[tacker:
0.005 | 46963: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: :config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_user_identity_format]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/default_log_levels]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/publish_errors]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_format]/notify: subscribes to Anchor[tacker::config::end]\",
0.005 | 46963: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_uuid_format]/notify: subscribes to Anchor[tacker::con

0.031 | 47023: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: sh event\",
0.031 | 47023: \"Info: Computing checksum on file /etc/sysconfig/snmpd\",
0.031 | 47023: \"Info: /Stage[main]/Snmp/File[snmpd.sysconfig]: Filebucketed /etc/sysconfig/snmpd to puppet with sum e914149a715dc82812a989314c026305\",
0.031 | 47023: \"Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{md5}e914149a715dc82812a989314c026305' to '{md5}1483b6eecf3d4796dac2df692d603719'\",
0.031 | 47023: \"Info: /Stage[main]/Snmp/File[snmpd.sysconfig]: Scheduling refresh of Service[snmpd]\",
0.031 | 47023: \"Debug: /Stage[main]/Snmp/File[snmpd.sysconfig]: The container Class[Snmp] will propagate my refresh event\",
0.031 | 47023: \"Info: Computing checksum on file /etc/snmp/snmptrapd.conf\",
0.031 | 47023: \"Info: /Stage[main]/Snmp/File[snmptrapd.conf]: Filebucketed /etc/snmp/snmptrapd.conf to puppet with sum 913e2613413a45daa402d0fbdbaba676\",
0.031 | 47023: \"Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{md5}913e2613413a45daa402d0fbdbaba676' to '{md5}0f92e52f70b5c64864657201eb9581bb'\",
0.031 | 47023: \"Info: /Stage[main]/Snmp/File[snmptrapd.conf]: Scheduling refresh of Service[snmptrapd]\",
0.031 | 47023: \"Debug: /Stage[main]/Snmp/File[snmptrapd.conf]: The container Class[Snmp] will propagate my refresh event\",
0.031 | 47023: \"Info: Computing checksum on file /etc/sysconfig/snmptrapd\",
0.031 | 47023: \"Info: /Stage[main]/Snmp/File[snmptrapd.sysconfig]: Filebucketed /etc/sysconfig/snmptrapd to puppet with sum 4496fd5e0e88e764e7beb1ae8f0dda6a\",
0.031 | 47023: \"Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{md5}4496fd5e0e88e764e7beb1ae8f0dda6a' to '{md5}01f68b1480c1ec4e3cc125434dd612a0'\",
0.031 | 47023: \"Info: /Stage[main]/Snmp/File[snmptrapd.sysconfig]: Scheduling refresh of Service[snmptrapd]\",
0.031 | 47023: \"Debug: /Stage[main]/Snmp/File[snmptrapd.sysconfig]: The container Class[Snmp] will propagate my refresh event\",
0.031 | 47023: \"Debug: Executing: '/usr/bin/systemctl is-active snmptrapd'\",
0.031 | 47023: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmptrapd'\",
0.031 | 47023: \"Debug: /Stage[m
0.060 | 47024: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ain]/Snmp/Service[snmptrapd]: Skipping restart; service is not running\",
0.060 | 47024: \"Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events\",
0.060 | 47024: \"Debug: /Stage[main]/Snmp/Service[snmptrapd]: The container Class[Snmp] will propagate my refresh event\",
0.060 | 47024: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_port]: Nothing to manage: no ensure and the resource doesn't exist\",
0.060 | 47024: \"Debug: Executing: '/usr/bin/systemctl is-active firewalld'\",
0.060 | 47024: \"Debug: Executing: '/usr/bin/systemctl is-enabled firewalld'\",
0.060 | 47024: \"Debug: Executing: '/usr/bin/systemctl is-active iptables'\",
0.060 | 47024: \"Debug: Executing: '/usr/bin/systemctl is-enabled iptables'\",
0.060 | 47024: \"Debug: Executing: '/usr/bin/systemctl is-active ip6tables'\",
0.060 | 47024: \"Debug: Executing: '/usr/bin/systemctl is-enabled ip6tables'\",
0.060 | 47024: \"Debug: Exec[modprobe nf_conntrack](provider=posix): Executing check 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.060 | 47024: \"Debug: Executing: 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.060 | 47024: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing check 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.060 | 47024: \"Debug: Executing: 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.060 | 47024: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing 'modprobe nf_conntrack_proto_sctp'\",
0.060 | 47024: \"Debug: Executing: 'modprobe nf_conntrack_proto_sctp'\",
0.060 | 47024: \"Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/returns: executed successfully\",
0.060 | 47024: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]: The container Kmod::Load[nf_conntrack_proto_sctp] will propagate my refresh event\",
0.060 | 47024: \"Debug: Kmod::Load[nf_conntrack_proto_sctp]: The container Class[Tripleo::Profile::Base::Kernel] will propagate my refresh event\",
0.000 | 47025: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Debug: Prefetching parsed resources for sysctl\",
0.000 | 47025: \"Debug: Prefetching sysctl_runtime resources for sysctl_runtime\",
0.000 | 47025: \"Debug: Executing: '/usr/sbin/sysctl -a'\",
0.000 | 47025: \"Debug: Class[Tripleo::Profile::Base::Kernel]: The container Stage[main] will propagate my refresh event\",
0.000 | 47025: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1hlkod7 returned \",
0.000 | 47025: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1hlkod7 property show | grep stonith-enabled | grep false > /dev/null 2>&1\",
0.000 | 47025: \"Debug: property exists: property show | grep stonith-enabled | grep false > /dev/null 2>&1 -> \",
0.000 | 47025: \"Debug: Exec[create-snmpv3-user-ro_snmp_user](provider=posix): Executing 'service snmpd stop ; sleep 5 ; echo \\\"createUser ro_snmp_user MD5 \\\\\\\"314c09c8f3fb56e8a169c4375356bb75fbdbf3d3\\\\\\\"\\\" >>/var/lib/net-snmp/snmpd.conf && touch /var/lib/net-snmp/ro_snmp_user-snmpd'\",
0.000 | 47025: \"Debug: Executing with uid=root: 'service snmpd stop ; sleep 5 ; echo \\\"createUser ro_snmp_user MD5 \\\\\\\"314c09c8f3fb56e8a169c4375356bb75fbdbf3d3\\\\\\\"\\\" >>/var/lib/net-snmp/snmpd.conf && touch /var/lib/net-snmp/ro_snmp_user-snmpd'\",
0.000 | 47025: \"Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully\",
0.000 | 47025: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]: The container Snmp::Snmpv3_user[ro_snmp_user] will propagate my refresh event\",
0.000 | 47025: \"Debug: Snmp::Snmpv3_user[ro_snmp_user]: The container Class[Tripleo::Profile::Base::Snmp] will propagate my refresh event\",
0.000 | 47025: \"Debug: Class[Tripleo::Profile::Base::Snmp]: The container Stage[main] will propagate my refresh event\",
0.000 | 47025: \"Debug: Executing: '/usr/bin/systemctl is-active snmpd'\",
0.000 | 47025: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmpd'\",
0.298 | 47026: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Debug: Executing: '/usr/bin/systemctl unmask snmpd'\",
0.298 | 47026: \"Debug: Executing: '/usr/bin/systemctl start snmpd'\",
0.298 | 47026: \"Debug: Executing: '/usr/bin/systemctl enable snmpd'\",
0.298 | 47026: \"Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'\",
0.298 | 47026: \"Debug: /Stage[main]/Snmp/Service[snmpd]: The container Class[Snmp] will propagate my refresh event\",
0.298 | 47026: \"Info: /Stage[main]/Snmp/Service[snmpd]: Unscheduling refresh on Service[snmpd]\",
0.298 | 47026: \"Debug: Class[Snmp]: The container Stage[main] will propagate my refresh event\",
0.298 | 47026: \"Debug: Executing: '/usr/bin/systemctl is-active sshd'\",
0.298 | 47026: \"Debug: Executing: '/usr/bin/systemctl is-enabled sshd'\",
0.298 | 47026: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1p862cb returned \",
0.298 | 47026: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1p862cb property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.298 | 47026: \"Debug: property exists: property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.298 | 47026: \"Debug: Executing: '/usr/bin/rpm -q ceph-common --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.298 | 47026: '\",
0.298 | 47026: \"Debug: Executing: '/usr/bin/rpm -q ceph-common --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.298 | 47026: --whatprovides'\",
0.298 | 47026: \"Debug: Package[ceph-common](provider=yum): Ensuring => present\",
0.298 | 47026: \"Debug: Executing: '/usr/bin/yum -d 0 -e 0 -y install ceph-common'\",
0.298 | 47026: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Package[ceph-common]/ensure: created\",
0.298 | 47026: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Package[ceph-common]: The container Cinder::Backend::Rbd[tripleo_
0.329 | 47027: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ceph] will propagate my refresh event\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_request_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_manager]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_api_class]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_name_template]: Nothing to manage: no ensure and the resource doesn't exist\",
0.329 | 47027: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_driver]/ensure: created\",
0.329 | 47027: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_drive
0.201 | 47028: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: r]: Scheduling refresh of Anchor[cinder::config::end]\",
0.201 | 47028: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_driver]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.201 | 47028: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_conf]/ensure: created\",
0.201 | 47028: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_conf]: Scheduling refresh of Anchor[cinder::config::end]\",
0.201 | 47028: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_conf]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.201 | 47028: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_user]/ensure: created\",
0.201 | 47028: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_user]: Scheduling refresh of Anchor[cinder::config::end]\",
0.201 | 47028: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_user]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.201 | 47028: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_chunk_size]/ensure: created\",
0.201 | 47028: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_chunk_size]: Scheduling refresh of Anchor[cinder::config::end]\",
0.201 | 47028: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_chunk_size]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.201 | 47028: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_pool]/ensure: created\",
0.201 | 47028: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_pool]: Scheduling refresh of Anchor[cinder::config::end]\",
0.201 | 47028: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_pool]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.201 | 47028: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ce
0.266 | 47029: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ph_stripe_unit]/ensure: created\",
0.266 | 47029: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_unit]: Scheduling refresh of Anchor[cinder::config::end]\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_unit]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.266 | 47029: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]/ensure: created\",
0.266 | 47029: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]: Scheduling refresh of Anchor[cinder::config::end]\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.266 | 47029: \"Debug: Class[Cinder::Backup::Ceph]: The container Stage[main] will propagate my refresh event\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Nothing to manage: no ensure and the resource doesn't exist\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_ionice]: Nothing to manage: no ensure and the resource doesn't exist\",
0.266 | 47029: \"Notice: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]/ensure: created\",
0.266 | 47029: \"Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Anchor[cinder::config::end]\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: The container Class[Cinder::Backends] will propagate my refresh event\",
0.266 | 47029: \"Debug: Class[Cinder::Backends]: The container Stage[main] will propagate my refresh event\",
0.266 | 47029: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the
0.215 | 47030: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.215 | 47030: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn
0.259 | 47031: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: 't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.259 | 47031: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_stderr]: Nothing to mana
0.230 | 47032: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ge: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 47032: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Nothing to manage: no ensure and
0.198 | 47033: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_failover_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_missing_consumer_retry_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_interval_max]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_login_method]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_backoff]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47033: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_transient_queues_ttl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]:
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_qos_prefetch_count]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 47034: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_am
0.190 | 47035: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: qp/server_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/rpc_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/notify_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/multicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/unicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/anycast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notification_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_rpc_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/pre_settled]: Nothing to manage: no ensure and the resource doesn't exist\",
0.190 | 47035: \"Debu
0.198 | 47036: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: g: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/container_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]: Nothing to manage: no ensure and the resource doesn't exist\",
0.198 | 47036: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]: Nothing to manage: no ensure and the resource doesn't
0.269 | 47037: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]: Nothing to manage: no ensure and the resource doesn't exist\",
0.269 | 47037: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_backend_name]/ensure: created\",
0.269 | 47037: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_con
0.075 | 47038: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: fig[tripleo_ceph/volume_backend_name]: Scheduling refresh of Anchor[cinder::config::end]\",
0.075 | 47038: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_backend_name]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.075 | 47038: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]/ensure: created\",
0.075 | 47038: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]: Scheduling refresh of Anchor[cinder::config::end]\",
0.075 | 47038: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.075 | 47038: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]/ensure: created\",
0.075 | 47038: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]: Scheduling refresh of Anchor[cinder::config::end]\",
0.075 | 47038: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.075 | 47038: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_user]/ensure: created\",
0.075 | 47038: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_user]: Scheduling refresh of Anchor[cinder::config::end]\",
0.075 | 47038: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph
0.093 | 47039: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ]/Cinder_config[tripleo_ceph/rbd_user]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.093 | 47039: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]/ensure: created\",
0.093 | 47039: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]: Scheduling refresh of Anchor[cinder::config::end]\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_max_clone_depth]: Nothing to manage: no ensure and the resource doesn't exist\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_flatten_volume_from_snapshot]: Nothing to manage: no ensure and the resource doesn't exist\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_secret_uuid]: Nothing to manage: no ensure and the resource doesn't exist\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connect_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.093 | 47039: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_retries]: Nothing to manage: no ens
0.100 | 47040: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ure and the resource doesn't exist\",
0.100 | 47040: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_store_chunk_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.100 | 47040: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]/ensure: created\",
0.100 | 47040: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]: Scheduling refresh of Anchor[cinder::config::end]\",
0.100 | 47040: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.100 | 47040: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Triggered 'refresh' from 14 events\",
0.100 | 47040: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Scheduling refresh of Anchor[cinder::service::begin]\",
0.100 | 47040: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.100 | 47040: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File[/etc/sysconfig/openstack-cinder-volume]/ensure: created\",
0.100 | 47040: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File[/etc/sysconfig/openstack-cinder-volume]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.100 | 47040: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]/ensure: created\",
0.100 | 47040: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]: Scheduling refresh of Anchor[cinder::servi
0.216 | 47041: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: ce::begin]\",
0.216 | 47041: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.216 | 47041: \"Debug: Cinder::Backend::Rbd[tripleo_ceph]: The container Class[Tripleo::Profile::Base::Cinder::Volume::Rbd] will propagate my refresh event\",
0.216 | 47041: \"Debug: Class[Tripleo::Profile::Base::Cinder::Volume::Rbd]: The container Stage[main] will propagate my refresh event\",
0.216 | 47041: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-h8qtqt returned \",
0.216 | 47041: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-h8qtqt property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.216 | 47041: \"Debug: property exists: property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.216 | 47041: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.216 | 47041: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.216 | 47041: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.216 | 47041: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.216 | 47041: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.216 | 47041: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource d
0.008 | 47042: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: oesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 47042: \"Debug: /Stage[ma

0.002 | 47050: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: g]/Tacker_config[keystone_authtoken/keyfile]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_conn_get_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_dead_retry]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_maxsize]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_socket_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_unused_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_secret_key]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_security_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_use_advanced_pool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 47050: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/
0.026 | 47051: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: Tacker_config[keystone_authtoken/memcached_servers]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/region_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/token_cache_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.026 | 47051: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_retrie
0.004 | 47052: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: s]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 47052: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: Triggered 'refresh'
0.294 | 47053: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: from 2 events\",
0.294 | 47053: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: Scheduling refresh of Service[cinder-backup]\",
0.294 | 47053: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: Scheduling refresh of Service[cinder-volume]\",
0.294 | 47053: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.294 | 47053: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-backup'\",
0.294 | 47053: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-cinder-backup'\",
0.294 | 47053: \"Debug: /Stage[main]/Cinder::Backup/Service[cinder-backup]: Skipping restart; service is not running\",
0.294 | 47053: \"Notice: /Stage[main]/Cinder::Backup/Service[cinder-backup]: Triggered 'refresh' from 1 events\",
0.294 | 47053: \"Info: /Stage[main]/Cinder::Backup/Service[cinder-backup]: Scheduling refresh of Anchor[cinder::service::end]\",
0.294 | 47053: \"Debug: /Stage[main]/Cinder::Backup/Service[cinder-backup]: The container Class[Cinder::Backup] will propagate my refresh event\",
0.294 | 47053: \"Debug: Class[Cinder::Backup]: The container Stage[main] will propagate my refresh event\",
0.294 | 47053: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-volume'\",
0.294 | 47053: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-cinder-volume'\",
0.294 | 47053: \"Debug: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Skipping restart; service is not running\",
0.294 | 47053: \"Notice: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Triggered 'refresh' from 1 events\",
0.294 | 47053: \"Info: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Scheduling refresh of Anchor[cinder::service::end]\",
0.294 | 47053: \"Debug: /Stage[main]/Cinder::Volume/Service[cinder-volume]: The container Class[Cinder::Volume] will propagate my refresh event\",
0.294 | 47053: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: Triggered 'refresh' from 2 events\",
0.294 | 47053: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: The contain
0.281 | 47054: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: er Class[Cinder::Deps] will propagate my refresh event\",
0.281 | 47054: \"Debug: Class[Cinder::Deps]: The container Stage[main] will propagate my refresh event\",
0.281 | 47054: \"Debug: Class[Cinder::Volume]: The container Stage[main] will propagate my refresh event\",
0.281 | 47054: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-tacker-server'\",
0.281 | 47054: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-tacker-server'\",
0.281 | 47054: \"Debug: Prefetching iptables resources for firewall\",
0.281 | 47054: \"Debug: Puppet::Type::Firewall::ProviderIptables: [prefetch(resources)]\",
0.281 | 47054: \"Debug: Puppet::Type::Firewall::ProviderIptables: [instances]\",
0.281 | 47054: \"Debug: Executing: '/usr/sbin/iptables-save'\",
0.281 | 47054: \"Debug: Prefetching ip6tables resources for firewall\",
0.281 | 47054: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [prefetch(resources)]\",
0.281 | 47054: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [instances]\",
0.281 | 47054: \"Debug: Executing: '/usr/sbin/ip6tables-save'\",
0.281 | 47054: \"Debug: Finishing transaction 54723480\",
0.281 | 47054: \"Debug: Storing state\",
0.281 | 47054: \"Debug: Stored state in 0.07 seconds\",
0.281 | 47054: \"Notice: Applied catalog in 33.72 seconds\",
0.281 | 47054: \"Debug: Applying settings catalog for sections reporting, metrics\",
0.281 | 47054: \"Debug: Finishing transaction 105888360\",
0.281 | 47054: \"Debug: Received report to process from centos-7-rax-iad-0000787869.localdomain\",
0.281 | 47054: \"Debug: Processing report from centos-7-rax-iad-0000787869.localdomain with processor Puppet::Reports::Store\"
0.281 | 47054: ],
0.281 | 47054: \"failed_when_result\": false
0.281 | 47054: }
0.281 | 47054:
0.281 | 47054: TASK [Run docker-puppet tasks (generate config)] *******************************
0.281 | 47054: skipping: [localhost]
0.281 | 47054:
0.281 | 47054: TASK [debug] *******************************************************************
0.281 | 47054: ok: [localhost] => {
0.281 | 47054: \"(outputs.stderr|default('')).split('\
0.281 | 47054: ')|union(outputs.stdout_lines|default([]))\": [
0.281 | 47054: \"\"
0.281 | 47054: ],
0.281 | 47054: \"failed_when_result\": false
0.281 | 47054: }
0.281 | 47054:
0.281 | 47054: TASK [Check if /var/lib/hashed-tripleo-config/docker-container-star
0.000 | 47055: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: tup-config-step_4.json exists] ***
0.000 | 47055: ok: [localhost]
0.000 | 47055:
0.000 | 47055: TASK [Start containers for step 4] *********************************************
0.000 | 47055: ok: [localhost]
0.000 | 47055:
0.000 | 47055: TASK [debug] *******************************************************************
0.000 | 47055: ok: [localhost] => {
0.000 | 47055: \"(outputs.stderr|default('')).split('\
0.000 | 47055: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 47055: \"stdout: 6ae6fe49e2f13e65af25e6d92bf09940d5048c3d8257ac4cb68aad6a449b5155\",
0.000 | 47055: \"\",
0.000 | 47055: \"stderr: Unable to find image '192.168.24.1:8787/tripleomaster/centos-binary-aodh-evaluator:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' locally\",
0.000 | 47055: \"Trying to pull repository 192.168.24.1:8787/tripleomaster/centos-binary-aodh-evaluator ... \",
0.000 | 47055: \"3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d: Pulling from 192.168.24.1:8787/tripleomaster/centos-binary-aodh-evaluator\",
0.000 | 47055: \"d9aaf4d82f24: Already exists\",
0.000 | 47055: \"615fb2b6a1f1: Already exists\",
0.000 | 47055: \"3013007117c8: Already exists\",
0.000 | 47055: \"72133c850d33: Already exists\",
0.000 | 47055: \"c2baf92c99f8: Already exists\",
0.000 | 47055: \"c33a905d0cfb: Already exists\",
0.000 | 47055: \"0e2281a8f625: Already exists\",
0.000 | 47055: \"8db9532c7c2a: Already exists\",
0.000 | 47055: \"a2fdf405ce12: Already exists\",
0.000 | 47055: \"b4d23af701db: Already exists\",
0.000 | 47055: \"c0364d012ec6: Already exists\",
0.000 | 47055: \"5da3106f315c: Already exists\",
0.000 | 47055: \"7115c908a774: Already exists\",
0.000 | 47055: \"6bfb3cfd80b3: Already exists\",
0.000 | 47055: \"8d6928a9593d: Already exists\",
0.000 | 47055: \"26bc5dc8da6d: Already exists\",
0.000 | 47055: \"76a6f33737df: Already exists\",
0.000 | 47055: \"f200f3bea052: Already exists\",
0.000 | 47055: \"86a355b07979: Already exists\",
0.000 | 47055: \"f6c0fe59d156: Already exists\",
0.000 | 47055: \"2d2aa5dd2564: Already exists\",
0.000 | 47055: \"6478d58b62d6: Already exists\",
0.000 | 47055: \"a3747001f778: Already exists\",
0.000 | 47055: \"f50228f8bd7f: Already exists\",
0.000 | 47055: \"3f77d8c2dda3: Already exists\",
0.000 | 47055: \"984000356753: Already exists\",
0.000 | 47055: \"54bb31bcfca3: Already ex

0.000 | 47841: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_ssl_compression in JSON backend",
0.000 | 47842: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_request_timeout in JSON backend",
0.180 | 47843: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/backup.pp' in environment production",
0.213 | 47844: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::backup from cinder/backup into production",
0.246 | 47845: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::enabled in JSON backend",
0.193 | 47846: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::manage_service in JSON backend",
0.148 | 47847: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::package_ensure in JSON backend",
0.351 | 47848: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::backup_manager in JSON backend",
0.350 | 47849: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::backup_api_class in JSON backend",
0.349 | 47850: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::backup_name_template in JSON backend",
0.372 | 47851: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::backup_topic in JSON backend",
0.231 | 47852: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/backup/ceph.pp' in environment production",
0.316 | 47853: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::backup::ceph from cinder/backup/ceph into production",
0.445 | 47854: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_driver in JSON backend",
0.445 | 47855: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_conf in JSON backend",
0.445 | 47856: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_user in JSON backend",
0.445 | 47857: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_chunk_size in JSON backend",
0.445 | 47858: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_pool in JSON backend",
0.445 | 47859: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_stripe_unit in JSON backend",
0.445 | 47860: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_stripe_count in JSON backend",
0.000 | 47861: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production",

0.000 | 47889: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::step in JSON backend",
0.000 | 47890: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder_user_enabled_backends in JSON backend",
0.171 | 47891: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/volume.pp' in environment production",
0.240 | 47892: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::volume from cinder/volume into production",
0.176 | 47893: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::volume::package_ensure in JSON backend",
0.289 | 47894: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::volume::enabled in JSON backend",
0.218 | 47895: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::volume::manage_service in JSON backend",
0.361 | 47896: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::volume::volume_clear in JSON backend",
0.360 | 47897: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::volume::volume_clear_size in JSON backend",
0.361 | 47898: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::volume::volume_clear_ionice in JSON backend",
0.182 | 47899: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/volume/rbd.pp' in environment production",
0.086 | 47900: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported tripleo::profile::base::cinder::volume::rbd from tripleo/profile/base/cinder/volume/rbd into production",
0.248 | 47901: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::rbd::backend_name in JSON backend",
0.248 | 47902: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::rbd::cinder_rbd_backend_host in JSON backend",
0.036 | 47903: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::rbd::cinder_rbd_pool_name in JSON backend",

0.000 | 48309: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/glance_api_insecure] with 'before'",
0.000 | 48310: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/glance_api_ssl_compression] with 'before'",
0.000 | 48311: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/glance_request_timeout] with 'before'",
0.207 | 48312: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_manager] with 'before'",
0.205 | 48313: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_api_class] with 'before'",
0.204 | 48314: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_name_template] with 'before'",
0.219 | 48315: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_driver] with 'before'",
0.219 | 48316: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_conf] with 'before'",
0.219 | 48317: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_user] with 'before'",
0.219 | 48318: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_chunk_size] with 'before'",
0.219 | 48319: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_pool] with 'before'",
0.219 | 48320: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_stripe_unit] with 'before'",
0.219 | 48321: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_stripe_count] with 'before'",
0.036 | 48322: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/volume_clear] with 'before'",

0.000 | 48454: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/glance_api_insecure] to Anchor[cinder::config::end] with 'notify'",
0.000 | 48455: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/glance_api_ssl_compression] to Anchor[cinder::config::end] with 'notify'",
0.000 | 48456: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/glance_request_timeout] to Anchor[cinder::config::end] with 'notify'",
0.217 | 48457: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_manager] to Anchor[cinder::config::end] with 'notify'",
0.216 | 48458: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_api_class] to Anchor[cinder::config::end] with 'notify'",
0.215 | 48459: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_name_template] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48460: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_driver] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48461: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_conf] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48462: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_user] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48463: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_chunk_size] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48464: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_pool] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48465: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_stripe_unit] to Anchor[cinder::config::end] with 'notify'",
0.230 | 48466: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_stripe_count] to Anchor[cinder::config::end] with 'notify'",
0.038 | 48467: Nov 08 21:07:14 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Cinder_config[DEFAULT/volume_clear] to Anchor[cinder::config::end] with 'notify'",

0.000 | 48996: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysctl.conf]/selrole: Found selrole default 'object_r' for /etc/sysctl.conf",
0.000 | 48997: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysctl.conf]/seltype: Found seltype default 'system_conf_t' for /etc/sysctl.conf",
0.000 | 48998: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysctl.conf]/selrange: Found selrange default 's0' for /etc/sysctl.conf",
0.323 | 48999: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/seluser: Found seluser default 'system_u' for /etc/sysconfig/openstack-cinder-volume",
0.239 | 49000: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/selrole: Found selrole default 'object_r' for /etc/sysconfig/openstack-cinder-volume",
0.323 | 49001: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/seltype: Found seltype default 'etc_t' for /etc/sysconfig/openstack-cinder-volume",
0.257 | 49002: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/selrange: Found selrange default 's0' for /etc/sysconfig/openstack-cinder-volume",
0.000 | 49003: Nov 08 21:07:15 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Firewall[000 accept related established rules ipv4]: [validate]",

0.185 | 52598: Nov 08 21:08:40 centos-7-rax-iad-0000787869 puppet-user[119948]: Create: resource exists false location exists false
0.000 | 52599: Nov 08 21:08:41 centos-7-rax-iad-0000787869 sudo[123541]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.000 | 52600: Nov 08 21:08:41 centos-7-rax-iad-0000787869 puppet-user[119948]: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 returned
0.365 | 52601: Nov 08 21:08:41 centos-7-rax-iad-0000787869 puppet-user[119948]: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 resource create openstack-cinder-backup systemd:openstack-cinder-backup op start timeout=200s stop timeout=200s --disabled
0.000 | 52602: Nov 08 21:08:41 centos-7-rax-iad-0000787869 su[123593]: (to rabbitmq) root on none

0.070 | 52668: Nov 08 21:08:47 centos-7-rax-iad-0000787869 crmd[12542]: notice: Initiating start operation openstack-cinder-backup_start_0 locally on centos-7-rax-iad-0000787869
0.000 | 52669: Nov 08 21:08:47 centos-7-rax-iad-0000787869 systemd[1]: Reloading.
0.000 | 52670: Nov 08 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1.orig returned 0 -> CIB updated
0.410 | 52671: Nov 08 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]/ensure) created
0.364 | 52672: Nov 08 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]) The container Pacemaker::Resource::Systemd[openstack-cinder-backup] will propagate my refresh event
0.376 | 52673: Nov 08 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: (Pacemaker::Resource::Systemd[openstack-cinder-backup]) The container Pacemaker::Resource::Service[openstack-cinder-backup] will propagate my refresh event
0.327 | 52674: Nov 08 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: (Pacemaker::Resource::Service[openstack-cinder-backup]) The container Class[Tripleo::Profile::Pacemaker::Cinder::Backup] will propagate my refresh event
0.269 | 52675: Nov 08 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: (Class[Tripleo::Profile::Pacemaker::Cinder::Backup]) The container Stage[main] will propagate my refresh event
0.000 | 52676: Nov 08 21:08:47 centos-7-rax-iad-0000787869 systemd[1]: [/etc/systemd/system/ceph-mon@.service:8] Executable path is not absolute, ignoring: $(command -v mkdir) -p /etc/ceph /var/lib/ceph/mon
0.000 | 52677: Nov 08 21:08:47 centos-7-rax-iad-0000787869 systemd[1]: [/usr/lib/systemd/system/ip6tables.service:3] Failed to add dependency on syslog.target,iptables.service, ignoring: Invalid argument
0.000 | 52678: Nov 08 21:08:47 centos-7-rax-iad-0000787869 systemd[1]: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 52679: Nov 08 21:08:47 centos-7-rax-iad-0000787869 systemd[1]: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.504 | 52680: Nov 08 21:08:48 centos-7-rax-iad-0000787869 systemd[1]: Started Cluster Controlled openstack-cinder-backup.
0.513 | 52681: Nov 08 21:08:48 centos-7-rax-iad-0000787869 systemd[1]: Starting Cluster Controlled openstack-cinder-backup...
0.000 | 52682: Nov 08 21:08:48 centos-7-rax-iad-0000787869 sudo[124343]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00

0.000 | 52704: Nov 08 21:08:52 centos-7-rax-iad-0000787869 puppet-user[119948]: Create: resource exists false location exists false
0.000 | 52705: Nov 08 21:08:52 centos-7-rax-iad-0000787869 sudo[124425]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.000 | 52706: Nov 08 21:08:52 centos-7-rax-iad-0000787869 puppet-user[119948]: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb returned
0.312 | 52707: Nov 08 21:08:52 centos-7-rax-iad-0000787869 puppet-user[119948]: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb resource create openstack-cinder-volume systemd:openstack-cinder-volume op start timeout=200s stop timeout=200s --disabled
0.000 | 52708: Nov 08 21:08:53 centos-7-rax-iad-0000787869 sudo[124436]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00

0.070 | 52774: Nov 08 21:08:59 centos-7-rax-iad-0000787869 crmd[12542]: notice: Initiating start operation openstack-cinder-volume_start_0 locally on centos-7-rax-iad-0000787869
0.000 | 52775: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: Reloading.
0.000 | 52776: Nov 08 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut.orig returned 0 -> CIB updated
0.214 | 52777: Nov 08 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]/ensure) created
0.159 | 52778: Nov 08 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]) The container Pacemaker::Resource::Systemd[openstack-cinder-volume] will propagate my refresh event
0.185 | 52779: Nov 08 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: (Pacemaker::Resource::Systemd[openstack-cinder-volume]) The container Pacemaker::Resource::Service[openstack-cinder-volume] will propagate my refresh event
0.296 | 52780: Nov 08 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: (Pacemaker::Resource::Service[openstack-cinder-volume]) The container Class[Tripleo::Profile::Pacemaker::Cinder::Volume] will propagate my refresh event
0.251 | 52781: Nov 08 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: (Class[Tripleo::Profile::Pacemaker::Cinder::Volume]) The container Stage[main] will propagate my refresh event
0.000 | 52782: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: [/etc/systemd/system/ceph-mon@.service:8] Executable path is not absolute, ignoring: $(command -v mkdir) -p /etc/ceph /var/lib/ceph/mon
0.000 | 52783: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: [/usr/lib/systemd/system/ip6tables.service:3] Failed to add dependency on syslog.target,iptables.service, ignoring: Invalid argument
0.000 | 52784: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 52785: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.476 | 52786: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: Started Cluster Controlled openstack-cinder-volume.
0.471 | 52787: Nov 08 21:08:59 centos-7-rax-iad-0000787869 systemd[1]: Starting Cluster Controlled openstack-cinder-volume...
0.000 | 52788: Nov 08 21:08:59 centos-7-rax-iad-0000787869 sudo[125181]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
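The `try 1/10: /usr/sbin/pcs ... resource create ... --disabled` lines show puppet-tripleo's pattern: back up the CIB, retry the offline `pcs -f` edit up to 10 times, then `cib-push` the diff. A runnable sketch of that bounded-retry wrapper (the helper name and the `true` stand-in command are ours, not the real provider code):

```python
# Hedged sketch of the bounded-retry pattern visible in the log above:
# run a command until it succeeds, giving up after a fixed number of tries.
import subprocess


def with_retries(cmd, tries=10):
    """Run cmd until it exits 0; return the attempt number that succeeded."""
    for attempt in range(1, tries + 1):
        print(f"try {attempt}/{tries}: {' '.join(cmd)}")
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode == 0:
            return attempt
    raise RuntimeError(f"command failed after {tries} tries")


# Stand-in for a pcs invocation; /usr/bin/true always succeeds on try 1.
assert with_retries(["true"]) == 1
```

Editing a CIB backup with `pcs -f` and pushing only the diff keeps the live cluster untouched if any intermediate attempt fails, which is why the run above ends with "CIB updated" only after the push succeeds.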

0.038 | 53671: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: up snmp::trap_service_hasstatus in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::trap_service_hasrestart in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::template_snmpd_conf in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::template_snmpd_sysconfig in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::template_snmptrapd in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::template_snmptrapd_sysconfig in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::openmanage_enable in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::master in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::agentx_perms in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::agentx_ping_interval in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::agentx_socket in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::agentx_timeout in JSON backend\",
0.038 | 53671: \"Debug: hiera(): Looking up snmp::agentx_retries in JSON backend\",
0.038 | 53671: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmpd.conf.erb\",
0.038 | 53671: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.conf.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmpd.conf.erb in 0.01 seconds\",
0.038 | 53671: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.conf.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmpd.conf.erb in 0.00 seconds\",
0.038 | 53671: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmpd.sysconfig-RedHat.erb\",
0.038 | 53671: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb in 0.00 seconds\",
0.038 | 53671: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb in 0.00 seconds\",
0.038 | 53671: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmptrapd.conf.erb\",
0.054 | 53672: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]:
0.054 | 53672: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.conf.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmptrapd.conf.erb in 0.00 seconds\",
0.054 | 53672: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.conf.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmptrapd.conf.erb in 0.00 seconds\",
0.054 | 53672: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmptrapd.sysconfig-RedHat.erb\",
0.054 | 53672: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb in 0.00 seconds\",
0.054 | 53672: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb in 0.00 seconds\",
0.054 | 53672: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/sshd.pp' in environment production\",
0.054 | 53672: \"Debug: Automatically imported tripleo::profile::base::sshd from tripleo/profile/base/sshd into production\",
0.054 | 53672: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::bannertext in JSON backend\",
0.054 | 53672: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::motd in JSON backend\",
0.054 | 53672: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::options in JSON backend\",
0.054 | 53672: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::port in JSON backend\",
0.054 | 53672: \"Debug: hiera(): Looking up ssh:server::options in JSON backend\",
0.054 | 53672: \"Debug: importing '/etc/puppet/modules/ssh/manifests/init.pp' in environment production\",
0.054 | 53672: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server.pp' in environment production\",
0.054 | 53672: \"Debug: Automatically imported ssh::server from ssh/server into production\",
0.054 | 53672: \"Debug: importing '/etc/puppet/modules/ssh/manifests/params.pp' in environment production\",
0.054 | 53672: \"Debug: Automatically imported ssh::params from ssh/params into
0.018 | 53673: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: production\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::ensure in JSON backend\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::validate_sshd_file in JSON backend\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::use_augeas in JSON backend\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::options_absent in JSON backend\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::match_block in JSON backend\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::use_issue_net in JSON backend\",
0.018 | 53673: \"Debug: hiera(): Looking up ssh::server::options in JSON backend\",
0.018 | 53673: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/install.pp' in environment production\",
0.018 | 53673: \"Debug: Automatically imported ssh::server::install from ssh/server/install into production\",
0.018 | 53673: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/config.pp' in environment production\",
0.018 | 53673: \"Debug: Automatically imported ssh::server::config from ssh/server/config into production\",
0.018 | 53673: \"Debug: importing '/etc/puppet/modules/concat/manifests/init.pp' in environment production\",
0.018 | 53673: \"Debug: importing '/etc/puppet/modules/stdlib/manifests/init.pp' in environment production\",
0.018 | 53673: \"Debug: Automatically imported concat from concat into production\",
0.018 | 53673: \"Debug: Scope(Class[Ssh::Server::Config]): Retrieving template ssh/sshd_config.erb\",
0.018 | 53673: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Bound template variables for /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.018 | 53673: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Interpolated template /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.018 | 53673: \"Debug: importing '/etc/puppet/modules/concat/manifests/fragment.pp' in environment production\",
0.018 | 53673: \"Debug: Automatically imported concat::fragment from concat/fragment into production\",
0.018 | 53673: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/se
0.221 | 53674: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: rvice.pp' in environment production\",
0.221 | 53674: \"Debug: Automatically imported ssh::server::service from ssh/server/service into production\",
0.221 | 53674: \"Debug: hiera(): Looking up ssh::server::service::ensure in JSON backend\",
0.221 | 53674: \"Debug: hiera(): Looking up ssh::server::service::enable in JSON backend\",
0.221 | 53674: \"Debug: importing '/etc/puppet/modules/timezone/manifests/init.pp' in environment production\",
0.221 | 53674: \"Debug: Automatically imported timezone from timezone into production\",
0.221 | 53674: \"Debug: importing '/etc/puppet/modules/timezone/manifests/params.pp' in environment production\",
0.221 | 53674: \"Debug: Automatically imported timezone::params from timezone/params into production\",
0.221 | 53674: \"Debug: hiera(): Looking up timezone::ensure in JSON backend\",
0.221 | 53674: \"Debug: hiera(): Looking up timezone::timezone in JSON backend\",
0.221 | 53674: \"Debug: hiera(): Looking up timezone::hwutc in JSON backend\",
0.221 | 53674: \"Debug: hiera(): Looking up timezone::autoupgrade in JSON backend\",
0.221 | 53674: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup/ceph.pp' in environment production\",
0.221 | 53674: \"Debug: Automatically imported tripleo::profile::base::cinder::backup::ceph from tripleo/profile/base/cinder/backup/ceph into production\",
0.221 | 53674: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::ceph::step in JSON backend\",
0.221 | 53674: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup.pp' in environment production\",
0.221 | 53674: \"Debug: Automatically imported tripleo::profile::base::cinder::backup from tripleo/profile/base/cinder/backup into production\",
0.221 | 53674: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::step in JSON backend\",
0.221 | 53674: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder.pp' in environment production\",
0.221 | 53674: \"Debug: Automatically imported tripleo::profile::base::cinder from tripleo/profile/base/cinder into production\",
0.221 | 53674: \"De
0.066 | 53675: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: bug: hiera(): Looking up tripleo::profile::base::cinder::bootstrap_node in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::cinder_enable_db_purge in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::step in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_proto in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_hosts in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_password in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_port in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_username in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_proto in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_hosts in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_password in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_port in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_username in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_use_ssl in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up bootstrap_nodeid in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up messaging_rpc_service_name in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up rabbitmq_node_names in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up cinder::rabbit_password in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up cinder::rabbit_port in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up cinder::rabbit_userid in JSON backend\",
0.066 | 53675: \"Debug: hiera(): Looking up messaging_notify_serv

0.000 | 53694: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: ): Looking up firewall::ensure_v6 in JSON backend\",
0.000 | 53694: \"Debug: hiera(): Looking up firewall::pkg_ensure in JSON backend\",
0.000 | 53694: \"Debug: hiera(): Looking up firewall::service_name in JSON backend\",
0.000 | 53694: \"Debug: hiera(): Looking up firewall::service_name_v6 in JSON backend\",
0.000 | 53694: \"Debug: hiera(): Looking up firewall::package_name in JSON backend\",
0.000 | 53694: \"Debug: hiera(): Looking up firewall::ebtables_manage in JSON backend\",
0.000 | 53694: \"Debug: importing '/etc/puppet/modules/firewall/manifests/linux.pp' in environment production\",
0.000 | 53694: \"Debug: Automatically imported firewall::linux from firewall/linux into production\",
0.000 | 53694: \"Debug: importing '/etc/puppet/modules/firewall/manifests/linux/redhat.pp' in environment production\",
0.000 | 53694: \"Debug: Automatically imported firewall::linux::redhat from firewall/linux/redhat into production\",
0.000 | 53694: \"Debug: hiera(): Looking up firewall::linux::redhat::package_ensure in JSON backend\",
0.000 | 53694: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/firewall/rule.pp' in environment production\",
0.000 | 53694: \"Debug: Automatically imported tripleo::firewall::rule from tripleo/firewall/rule into production\",
0.000 | 53694: \"Debug: Resource class[tripleo::firewall::post] was not determined to be defined\",
0.000 | 53694: \"Debug: Create new resource class[tripleo::firewall::post] with params {\\\"firewall_settings\\\"=>{}}\",
0.000 | 53694: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/firewall/post.pp' in environment production\",
0.000 | 53694: \"Debug: Automatically imported tripleo::firewall::post from tripleo/firewall/post into production\",
0.000 | 53694: \"Debug: hiera(): Looking up tripleo::firewall::post::debug in JSON backend\",
0.000 | 53694: \"Notice: Scope(Class[Tripleo::Firewall::Post]): At this stage, all network traffic is blocked.\",
0.000 | 53694: \"Debug: hiera(): Looking up service_names in JSON backend\",
0.000 | 53694: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/firewall/service_rules.pp' in environment production
0.071 | 53695: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: \",
0.071 | 53695: \"Debug: Automatically imported tripleo::firewall::service_rules from tripleo/firewall/service_rules into production\",
0.071 | 53695: \"Debug: Scope(Kmod::Load[nf_conntrack]): Retrieving template kmod/redhat.modprobe.erb\",
0.071 | 53695: \"Debug: template[/etc/puppet/modules/kmod/templates/redhat.modprobe.erb]: Bound template variables for /etc/puppet/modules/kmod/templates/redhat.modprobe.erb in 0.00 seconds\",
0.071 | 53695: \"Debug: template[/etc/puppet/modules/kmod/templates/redhat.modprobe.erb]: Interpolated template /etc/puppet/modules/kmod/templates/redhat.modprobe.erb in 0.00 seconds\",
0.071 | 53695: \"Debug: Scope(Kmod::Load[nf_conntrack_proto_sctp]): Retrieving template kmod/redhat.modprobe.erb\",
0.071 | 53695: \"Debug: importing '/etc/puppet/modules/sysctl/manifests/base.pp' in environment production\",
0.071 | 53695: \"Debug: Automatically imported sysctl::base from sysctl/base into production\",
0.071 | 53695: \"Debug: template[inline]: Interpolated template inline template in 0.07 seconds\",
0.071 | 53695: \"Debug: importing '/etc/puppet/modules/oslo/manifests/params.pp' in environment production\",
0.071 | 53695: \"Debug: Automatically imported oslo::params from oslo/params into production\",
0.071 | 53695: \"Debug: importing '/etc/puppet/modules/mysql/manifests/bindings.pp' in environment production\",
0.071 | 53695: \"Debug: Automatically imported mysql::bindings from mysql/bindings into production\",
0.071 | 53695: \"Debug: importing '/etc/puppet/modules/mysql/manifests/params.pp' in environment production\",
0.071 | 53695: \"Debug: Automatically imported mysql::params from mysql/params into production\",
0.071 | 53695: \"Debug: hiera(): Looking up mysql::bindings::install_options in JSON backend\",
0.071 | 53695: \"Debug: hiera(): Looking up mysql::bindings::java_enable in JSON backend\",
0.071 | 53695: \"Debug: hiera(): Looking up mysql::bindings::perl_enable in JSON backend\",
0.071 | 53695: \"Debug: hiera(): Looking up mysql::bindings::php_enable in JSON backend\",
0.071 | 53695: \"Debug: hiera(): Looking up mysql::bindings::python_enable in JSON backend\",
0.071 | 53695:
0.001 | 53696: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Debug: hiera(): Looking up mysql::bindings::ruby_enable in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::client_dev in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::daemon_dev in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::java_package_ensure in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::java_package_name in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::java_package_provider in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::perl_package_ensure in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::perl_package_name in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::perl_package_provider in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::php_package_ensure in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::php_package_name in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::php_package_provider in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::python_package_ensure in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::python_package_name in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::python_package_provider in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::ruby_package_ensure in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::ruby_package_name in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::ruby_package_provider in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::client_dev_package_ensure in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::client_dev_package_name in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::client_dev_package_provider in JSON backend\",
0.001 | 53696: \"Debug: hiera(): Looking up mysql::bindings::daemon
0.215 | 53697: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: _dev_package_ensure in JSON backend\",
0.215 | 53697: \"Debug: hiera(): Looking up mysql::bindings::daemon_dev_package_name in JSON backend\",
0.215 | 53697: \"Debug: hiera(): Looking up mysql::bindings::daemon_dev_package_provider in JSON backend\",
0.215 | 53697: \"Debug: importing '/etc/puppet/modules/mysql/manifests/bindings/python.pp' in environment production\",
0.215 | 53697: \"Debug: Automatically imported mysql::bindings::python from mysql/bindings/python into production\",
0.215 | 53697: \"Debug: importing '/etc/puppet/modules/pacemaker/manifests/resource/systemd.pp' in environment production\",
0.215 | 53697: \"Debug: Automatically imported pacemaker::resource::systemd from pacemaker/resource/systemd into production\",
0.215 | 53697: \"Debug: Resource package[ceph-common] was not determined to be defined\",
0.215 | 53697: \"Debug: Create new resource package[ceph-common] with params {\\\"ensure\\\"=>\\\"present\\\", \\\"name\\\"=>\\\"ceph-common\\\", \\\"tag\\\"=>\\\"cinder-support-package\\\"}\",
0.215 | 53697: \"Debug: Resource file[/etc/sysconfig/openstack-cinder-volume] was not determined to be defined\",
0.215 | 53697: \"Debug: Create new resource file[/etc/sysconfig/openstack-cinder-volume] with params {\\\"ensure\\\"=>\\\"present\\\"}\",
0.215 | 53697: \"Debug: importing '/etc/puppet/modules/keystone/manifests/deps.pp' in environment production\",
0.215 | 53697: \"Debug: Automatically imported keystone::deps from keystone/deps into production\",
0.215 | 53697: \"Debug: importing '/etc/puppet/modules/oslo/manifests/cache.pp' in environment production\",
0.215 | 53697: \"Debug: Automatically imported oslo::cache from oslo/cache into production\",
0.215 | 53697: \"Debug: hiera(): Looking up tripleo.clustercheck.firewall_rules in JSON backend\",
0.215 | 53697: \"Debug: hiera(): Looking up tripleo.docker.firewall_rules in JSON backend\",
0.215 | 53697: \"Debug: hiera(): Looking up tripleo.kernel.firewall_rules in JSON backend\",
0.215 | 53697: \"Debug: hiera(): Looking up tripleo.keystone.firewall_rules in JSON backend\",
0.215 | 53697: \"Debug: hiera(): Looking up tripleo.glance_api.firewall_
0.001 | 53698: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.heat_api.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.heat_api_cfn.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.heat_engine.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.mysql.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.mysql_client.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.neutron_dhcp.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.neutron_l3.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.neutron_metadata.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.neutron_api.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.neutron_plugin_ml2.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.neutron_ovs_agent.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.rabbitmq.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.haproxy.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.memcached.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.pacemaker.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.nova_conductor.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.nova_api.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.nova_placement.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.nova_metadata.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.nova_scheduler.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.ntp.firewall_rules in JSON backend\",
0.001 | 53698: \"Debug: hiera(): Looking up tripleo.snmp.firewall_rules in

0.007 | 53795: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: g: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/before: subscribes to Sysctl[net.nf_conntrack_max]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[fs.inotify.max_user_instances]/Sysctl[fs.inotify.max_user_instances]/before: subscribes to Sysctl_runtime[fs.inotify.max_user_instances]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[fs.suid_dumpable]/Sysctl[fs.suid_dumpable]/before: subscribes to Sysctl_runtime[fs.suid_dumpable]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[kernel.dmesg_restrict]/Sysctl[kernel.dmesg_restrict]/before: subscribes to Sysctl_runtime[kernel.dmesg_restrict]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[kernel.pid_max]/Sysctl[kernel.pid_max]/before: subscribes to Sysctl_runtime[kernel.pid_max]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.core.netdev_max_backlog]/Sysctl[net.core.netdev_max_backlog]/before: subscribes to Sysctl_runtime[net.core.netdev_max_backlog]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.arp_accept]/Sysctl[net.ipv4.conf.all.arp_accept]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.arp_accept]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.log_martians]/Sysctl[net.ipv4.conf.all.log_martians]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.log_martians]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.secure_redirects]/Sysctl[net.ipv4.conf.all.secure_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.secure_redirects]\",
0.007 | 53795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.send_redirects]/Sysctl[net.ipv4.conf.all.send_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.send_redirects]\",
0.007 | 53795:
0.008 | 53796: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.accept_redirects]/Sysctl[net.ipv4.conf.default.accept_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.accept_redirects]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.log_martians]/Sysctl[net.ipv4.conf.default.log_martians]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.log_martians]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.secure_redirects]/Sysctl[net.ipv4.conf.default.secure_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.secure_redirects]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.send_redirects]/Sysctl[net.ipv4.conf.default.send_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.send_redirects]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.neigh.default.gc_thresh1]/Sysctl[net.ipv4.neigh.default.gc_thresh1]/before: subscribes to Sysctl_runtime[net.ipv4.neigh.default.gc_thresh1]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.neigh.default.gc_thresh2]/Sysctl[net.ipv4.neigh.default.gc_thresh2]/before: subscribes to Sysctl_runtime[net.ipv4.neigh.default.gc_thresh2]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.neigh.default.gc_thresh3]/Sysctl[net.ipv4.neigh.default.gc_thresh3]/before: subscribes to Sysctl_runtime[net.ipv4.neigh.default.gc_thresh3]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_intvl]/Sysctl[net.ipv4.tcp_keepalive_intvl]/before: subscribes to Sysctl_runtime[net.ipv4.tcp_keepalive_intvl]\",
0.008 | 53796: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_probes]/Sysctl[net.ipv4.tcp_keepalive_probes]/before: subscribes to Sysctl_runtime[net.ipv4.tcp_ke
0.007 | 53797: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: epalive_probes]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_time]/Sysctl[net.ipv4.tcp_keepalive_time]/before: subscribes to Sysctl_runtime[net.ipv4.tcp_keepalive_time]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.accept_ra]/Sysctl[net.ipv6.conf.all.accept_ra]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.accept_ra]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.accept_redirects]/Sysctl[net.ipv6.conf.all.accept_redirects]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.accept_redirects]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.autoconf]/Sysctl[net.ipv6.conf.all.autoconf]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.autoconf]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.disable_ipv6]/Sysctl[net.ipv6.conf.all.disable_ipv6]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.disable_ipv6]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.accept_ra]/Sysctl[net.ipv6.conf.default.accept_ra]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.accept_ra]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.accept_redirects]/Sysctl[net.ipv6.conf.default.accept_redirects]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.accept_redirects]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.autoconf]/Sysctl[net.ipv6.conf.default.autoconf]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.autoconf]\",
0.007 | 53797: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.disable_ipv6]/Sysctl[net.ipv6.conf.default.disable_ipv6]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.disable_ipv6]\",
0.007 | 53797: \"Debug: /Stage[ma
0.249 | 53798: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: in]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.netfilter.nf_conntrack_max]/Sysctl[net.netfilter.nf_conntrack_max]/before: subscribes to Sysctl_runtime[net.netfilter.nf_conntrack_max]\",
0.249 | 53798: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.nf_conntrack_max]/Sysctl[net.nf_conntrack_max]/before: subscribes to Sysctl_runtime[net.nf_conntrack_max]\",
0.249 | 53798: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/require: subscribes to Package[snmpd]\",
0.249 | 53798: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/require: subscribes to File[var-net-snmp]\",
0.249 | 53798: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/before: subscribes to Service[snmpd]\",
0.249 | 53798: \"Debug: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/Concat_file[/etc/ssh/sshd_config]/before: subscribes to File[/etc/ssh/sshd_config]\",
0.249 | 53798: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]/notify: subscribes to Anchor[cinder::config::end]\",
0.249 | 53798: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]/notify: subscribes to Anchor[cinder::config::end]\",
0.249 | 53798: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]/notify: subscribes to Anchor[cinder::config::end]\",
0.249 | 53798: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]/notify: subscribes to Anchor[cinder::config::end]\",
0.249 | 53798: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]/notify: subscribes to Anchor[cinder::config::end]\",
0.249 | 53798: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.249 | 53798: \"Debug: /Stage[main]/C
0.016 | 53799: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: inder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_db_reconnect]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.016 | 53799: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]/notify: subscribes to Anchor[cinder::config::end]\",

0.034 | 53867: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: Augeas[docker-sysconfig-network](provider=augeas): Will attempt to save and only run if files changed\",
0.034 | 53867: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): sending command 'rm' with params [\\\"/files/etc/sysconfig/docker-network/DOCKER_NETWORK_OPTIONS\\\"]\",
0.034 | 53867: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Skipping because no files were changed\",
0.034 | 53867: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Closed the augeas connection\",
0.034 | 53867: \"Debug: Executing: '/usr/bin/systemctl is-active docker'\",
0.034 | 53867: \"Debug: Executing: '/usr/bin/systemctl is-enabled docker'\",
0.034 | 53867: \"Debug: Exec[directory-create-etc-my.cnf.d](provider=posix): Executing check 'test -d /etc/my.cnf.d'\",
0.034 | 53867: \"Debug: Executing: 'test -d /etc/my.cnf.d'\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Augeas version 1.4.0 is installed\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Will attempt to save and only run if files changed\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'set' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/bind-address\\\", \\\"192.168.24.15\\\"]\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl\\\"]\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl-ca\\\"]\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Skipping because no files were changed\",
0.034 | 53867: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Closed the augeas connection\",
0.034 | 53867: \"Debug: Executing: '/usr/bin/systemctl is-active pcsd'\",
0.034 | 53867: \"Debug: Executing: '/usr/bin/systemctl is-enabled pcsd'\",
0.034 | 53867:
0.032 | 53868: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: \"Debug: Exec[Create Cluster tripleo_cluster](provider=posix): Executing check '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.032 | 53868: \"Debug: Exec[Start Cluster tripleo_cluster](provider=posix): Executing check '/sbin/pcs status >/dev/null 2>&1'\",
0.032 | 53868: \"Debug: Executing: '/sbin/pcs status >/dev/null 2>&1'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-enabled corosync'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-enabled pacemaker'\",
0.032 | 53868: \"Debug: Exec[wait-for-settle](provider=posix): Executing check '/sbin/pcs status | grep -q 'partition with quorum' > /dev/null 2>&1'\",
0.032 | 53868: \"Debug: Executing: '/sbin/pcs status | grep -q 'partition with quorum' > /dev/null 2>&1'\",
0.032 | 53868: \"Debug: defaults exists resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.032 | 53868: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-144lovy returned \",
0.032 | 53868: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-144lovy resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-active chronyd'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-enabled chronyd'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-active ntpd'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-enabled ntpd'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-active snmptrapd'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmptrapd'\",
0.032 | 53868: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_port]: Nothing to manage: no ensure and the resource doesn't exist\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-active firewalld'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-enabled firewalld'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-active iptables'\",
0.032 | 53868: \"Debug: Executing: '/usr/bin/systemctl is-
0.009 | 53869: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: enabled iptables'\",
0.009 | 53869: \"Debug: Executing: '/usr/bin/systemctl is-active ip6tables'\",
0.009 | 53869: \"Debug: Executing: '/usr/bin/systemctl is-enabled ip6tables'\",
0.009 | 53869: \"Debug: Exec[modprobe nf_conntrack](provider=posix): Executing check 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.009 | 53869: \"Debug: Executing: 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.009 | 53869: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing check 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.009 | 53869: \"Debug: Executing: 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.009 | 53869: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing 'modprobe nf_conntrack_proto_sctp'\",
0.009 | 53869: \"Debug: Executing: 'modprobe nf_conntrack_proto_sctp'\",
0.009 | 53869: \"Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/returns: executed successfully\",
0.009 | 53869: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]: The container Kmod::Load[nf_conntrack_proto_sctp] will propagate my refresh event\",
0.009 | 53869: \"Debug: Kmod::Load[nf_conntrack_proto_sctp]: The container Class[Tripleo::Profile::Base::Kernel] will propagate my refresh event\",
0.009 | 53869: \"Debug: Prefetching parsed resources for sysctl\",
0.009 | 53869: \"Debug: Prefetching sysctl_runtime resources for sysctl_runtime\",
0.009 | 53869: \"Debug: Executing: '/usr/sbin/sysctl -a'\",
0.009 | 53869: \"Debug: Class[Tripleo::Profile::Base::Kernel]: The container Stage[main] will propagate my refresh event\",
0.009 | 53869: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1jb77z7 returned \",
0.009 | 53869: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1jb77z7 property show | grep stonith-enabled | grep false > /dev/null 2>&1\",
0.009 | 53869: \"Debug: property exists: property show | grep stonith-enabled | grep false > /dev/nul
0.371 | 53870: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: l 2>&1 -> \",
0.371 | 53870: \"Debug: Executing: '/usr/bin/systemctl is-active snmpd'\",
0.371 | 53870: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmpd'\",
0.371 | 53870: \"Debug: Executing: '/usr/bin/systemctl is-active sshd'\",
0.371 | 53870: \"Debug: Executing: '/usr/bin/systemctl is-enabled sshd'\",
0.371 | 53870: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xq7458 returned \",
0.371 | 53870: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xq7458 property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.371 | 53870: \"Debug: property exists: property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.371 | 53870: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.371 | 53870: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAUL
0.365 | 53871: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: T/glance_request_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: Prefetching crontab resources for cron\",
0.365 | 53871: \"Debug: looking for crontabs in /var/spool/cron\",
0.365 | 53871: \"Notice: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]/ensure: created\",
0.365 | 53871: \"Debug: Flushing cron provider target cinder\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]: The container Class[Cinder::Cron::Db_purge] will propagate my refresh event\",
0.365 | 53871: \"Debug: Class[Cinder::Cron::Db_purge]: The container Stage[main] will propagate my refresh event\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_manager]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_api_class]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_name_template]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_ionice]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.365 | 53871: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]: Nothing to manage: no ensure and the resour
0.221 | 53872: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: ce doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.221 | 53872: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resour
0.238 | 53873: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: ce doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.238 | 53873: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_d
0.233 | 53874: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: efault_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Nothing to manage: no ensure and the resource doesn't exist\",
0.233 | 53874: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kom
0.202 | 53875: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: bu_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_failover_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_missing_consumer_retry_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_interval_max]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_login_method]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_backoff]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_transient_queues_ttl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 53875: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Nothing to man
0.213 | 53876: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: age: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_qos_prefetch_count]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/server_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 53876: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp
0.205 | 53877: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: [cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/rpc_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/notify_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/multicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/unicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/anycast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notification_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_rpc_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/pre_settled]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/container_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.205 | 53877: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/idle_timeout]: Nothing to manage: no ensure and the resour
0.201 | 53878: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: ce doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.201 | 53878: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]: Nothing t
0.286 | 53879: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: o manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_max_clone_depth]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_flatten_volume_from_snapshot]: Nothing to manage: no ensure and the resource doesn't exist\",
0.286 | 53879: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_secret_uuid]: Nothing to manage: no ensu
0.309 | 53880: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: re and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connect_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_store_chunk_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-h0z0aj returned \",
0.309 | 53880: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-h0z0aj property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.309 | 53880: \"Debug: property exists: property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.309 | 53880: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.309 | 53880: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watc
0.006 | 53881: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: h_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 53881: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource

0.004 | 53889: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: _retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/include_service_catalog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/keyfile]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_conn_get_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_dead_retry]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_maxsize]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_socket_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_unused_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_secret_key]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 53889: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_security_strate
0.013 | 53890: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: gy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_use_advanced_pool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcached_servers]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/region_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/token_cache_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 53890: \"Debug: /Stage[main]/Tac
0.002 | 53891: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: ker::Db/Oslo::Db[tacker_config]/Tacker_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 53891: \"Debug: /Stage[mai
0.332 | 53892: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: n]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.332 | 53892: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.332 | 53892: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-backup'\",
0.332 | 53892: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-volume'\",
0.332 | 53892: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-tacker-server'\",
0.332 | 53892: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-tacker-server'\",
0.332 | 53892: \"Debug: Prefetching iptables resources for firewall\",
0.332 | 53892: \"Debug: Puppet::Type::Firewall::ProviderIptables: [prefetch(resources)]\",
0.332 | 53892: \"Debug: Puppet::Type::Firewall::ProviderIptables: [instances]\",
0.332 | 53892: \"Debug: Executing: '/usr/sbin/iptables-save'\",
0.332 | 53892: \"Debug: Prefetching ip6tables resources for firewall\",
0.332 | 53892: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [prefetch(resources)]\",
0.332 | 53892: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [instances]\",
0.332 | 53892: \"Debug: Executing: '/usr/sbin/ip6tables-save'\",
0.332 | 53892: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1gzm9jb returned \",
0.332 | 53892: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1gzm9jb constraint list | grep location-openstack-cinder-backup > /dev/null 2>&1\",
0.332 | 53892: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xbogxx returned \",
0.332 | 53892: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xbogxx resource show openstack-cinder-backup > /dev/null 2>&1\",
0.332 | 53892: \"Debug: Exists: resource exists false location exists false\",
0.332 | 53892: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-13qk7pu returned \",
0.105 | 53893: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]:
0.105 | 53893: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-13qk7pu constraint list | grep location-openstack-cinder-backup > /dev/null 2>&1\",
0.105 | 53893: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1g5ohgt returned \",
0.105 | 53893: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1g5ohgt resource show openstack-cinder-backup > /dev/null 2>&1\",
0.105 | 53893: \"Debug: Create: resource exists false location exists false\",
0.105 | 53893: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 returned \",
0.105 | 53893: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 resource create openstack-cinder-backup systemd:openstack-cinder-backup op start timeout=200s stop timeout=200s --disabled\",
0.105 | 53893: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2.orig returned 0 -> CIB updated\",
0.105 | 53893: \"Debug: location_rule_create: constraint location openstack-cinder-backup rule resource-discovery=exclusive score=0 cinder-backup-role eq true\",
0.105 | 53893: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e returned \",
0.105 | 53893: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e constraint location openstack-cinder-backup rule resource-discovery=exclusive score=0 cinder-backup-role eq true\",
0.105 | 53893: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e.orig returned 0 -> CIB updated\",
0.105 | 53893: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 returned \",
0.105 | 53893: \"Debug:
0.239 | 53894: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 resource enable openstack-cinder-backup\",
0.239 | 53894: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1.orig returned 0 -> CIB updated\",
0.239 | 53894: \"Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]/ensure: created\",
0.239 | 53894: \"Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]: The container Pacemaker::Resource::Systemd[openstack-cinder-backup] will propagate my refresh event\",
0.239 | 53894: \"Debug: Pacemaker::Resource::Systemd[openstack-cinder-backup]: The container Pacemaker::Resource::Service[openstack-cinder-backup] will propagate my refresh event\",
0.239 | 53894: \"Debug: Pacemaker::Resource::Service[openstack-cinder-backup]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Backup] will propagate my refresh event\",
0.239 | 53894: \"Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Backup]: The container Stage[main] will propagate my refresh event\",
0.239 | 53894: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-870vjb returned \",
0.239 | 53894: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-870vjb constraint list | grep location-openstack-cinder-volume > /dev/null 2>&1\",
0.239 | 53894: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-31an3f returned \",
0.239 | 53894: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-31an3f resource show openstack-cinder-volume > /dev/null 2>&1\",
0.239 | 53894: \"Debug: backup_cib:
0.085 | 53895: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1to0ytg returned \",
0.085 | 53895: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1to0ytg constraint list | grep location-openstack-cinder-volume > /dev/null 2>&1\",
0.085 | 53895: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-7oghsb returned \",
0.085 | 53895: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-7oghsb resource show openstack-cinder-volume > /dev/null 2>&1\",
0.085 | 53895: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb returned \",
0.085 | 53895: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb resource create openstack-cinder-volume systemd:openstack-cinder-volume op start timeout=200s stop timeout=200s --disabled\",
0.085 | 53895: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb.orig returned 0 -> CIB updated\",
0.085 | 53895: \"Debug: location_rule_create: constraint location openstack-cinder-volume rule resource-discovery=exclusive score=0 cinder-volume-role eq true\",
0.085 | 53895: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm returned \",
0.085 | 53895: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm constraint location openstack-cinder-volume rule resource-discovery=exclusive score=0 cinder-volume-role eq true\",
0.085 | 53895: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm.orig returned 0 -> CIB updated\",
0.085 | 53895: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut returned
0.240 | 53896: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: \",
0.240 | 53896: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut resource enable openstack-cinder-volume\",
0.240 | 53896: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut.orig returned 0 -> CIB updated\",
0.240 | 53896: \"Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]/ensure: created\",
0.240 | 53896: \"Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]: The container Pacemaker::Resource::Systemd[openstack-cinder-volume] will propagate my refresh event\",
0.240 | 53896: \"Debug: Pacemaker::Resource::Systemd[openstack-cinder-volume]: The container Pacemaker::Resource::Service[openstack-cinder-volume] will propagate my refresh event\",
0.240 | 53896: \"Debug: Pacemaker::Resource::Service[openstack-cinder-volume]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Volume] will propagate my refresh event\",
0.240 | 53896: \"Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Volume]: The container Stage[main] will propagate my refresh event\",
0.240 | 53896: \"Debug: Finishing transaction 59572100\",
0.240 | 53896: \"Debug: Storing state\",
0.240 | 53896: \"Debug: Stored state in 0.07 seconds\",
0.240 | 53896: \"Notice: Applied catalog in 39.99 seconds\",
0.240 | 53896: \"Debug: Applying settings catalog for sections reporting, metrics\",
0.240 | 53896: \"Debug: Finishing transaction 107867560\",
0.240 | 53896: \"Debug: Received report to process from centos-7-rax-iad-0000787869.localdomain\",
0.240 | 53896: \"Debug: Processing report from centos-7-rax-iad-0000787869.localdomain with processor Puppet::Reports::Store\"
0.240 | 53896: ],
0.240 | 53896: \"failed_when_result\": false
0.240 | 53896: }
0.240 | 53896:
0.240 | 53896: TASK [R
0.000 | 53897: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: un docker-puppet tasks (generate config)] *******************************
0.000 | 53897: skipping: [localhost]
0.000 | 53897:
0.000 | 53897: TASK [debug] *******************************************************************
0.000 | 53897: ok: [localhost] => {
0.000 | 53897: \"(outputs.stderr|default('')).split('\
0.000 | 53897: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 53897: \"\"
0.000 | 53897: ],
0.000 | 53897: \"failed_when_result\": false
0.000 | 53897: }
0.000 | 53897:
0.000 | 53897: TASK [Check if /var/lib/hashed-tripleo-config/docker-container-startup-config-step_5.json exists] ***
0.000 | 53897: ok: [localhost]
0.000 | 53897:
0.000 | 53897: TASK [Start containers for step 5] *********************************************
0.000 | 53897: ok: [localhost]
0.000 | 53897:
0.000 | 53897: TASK [debug] *******************************************************************
0.000 | 53897: ok: [localhost] => {
0.000 | 53897: \"(outputs.stderr|default('')).split('\
0.000 | 53897: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 53897: \"stdout: 0e6617e3d7c758431a965971ff733027296d33bb028a062c0addfe76f3545538\",
0.000 | 53897: \"\",
0.000 | 53897: \"stderr: Unable to find image '192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-metricd:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' locally\",
0.000 | 53897: \"Trying to pull repository 192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-metricd ... \",
0.000 | 53897: \"3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d: Pulling from 192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-metricd\",
0.000 | 53897: \"d9aaf4d82f24: Already exists\",
0.000 | 53897: \"615fb2b6a1f1: Already exists\",
0.000 | 53897: \"3013007117c8: Already exists\",
0.000 | 53897: \"72133c850d33: Already exists\",
0.000 | 53897: \"c2baf92c99f8: Already exists\",
0.000 | 53897: \"c33a905d0cfb: Already exists\",
0.000 | 53897: \"0e2281a8f625: Already exists\",
0.000 | 53897: \"8db9532c7c2a: Already exists\",
0.000 | 53897: \"a2fdf405ce12: Already exists\",
0.000 | 53897: \"b4d23af701db: Already exists\",
0.000 | 53897: \"c0364d012ec6: Already exists\",
0.000 | 53897: \"5da3106f315c: Already exists\",
0.000 | 53897: \"7115c908a774: Already exists\",
0.000 | 53897: \"6bfb3cfd80b3: Already exists\",
0.000 | 53897: \"8d6928a9593d: Already exists\",
0.000 | 53897: \"26bc5dc8da6d: Already exists\",
0.000 | 53897: \"76a6f33737df: Already exists\", \

0.000 | 54683: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_insecure in JSON backend",
0.000 | 54684: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_api_ssl_compression in JSON backend",
0.000 | 54685: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::glance::glance_request_timeout in JSON backend",
0.305 | 54686: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/cinder/manifests/cron/db_purge.pp' in environment production",
0.201 | 54687: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported cinder::cron::db_purge from cinder/cron/db_purge into production",
0.230 | 54688: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cron::db_purge::minute in JSON backend",
0.208 | 54689: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cron::db_purge::hour in JSON backend",
0.230 | 54690: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cron::db_purge::monthday in JSON backend",
0.230 | 54691: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cron::db_purge::month in JSON backend",
0.230 | 54692: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cron::db_purge::weekday in JSON backend",
0.108 | 54693: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: hiera(): Looking up cinder::cron::db_purge::user in JSON backend",

0.000 | 54994: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/mysql/manifests/bindings/python.pp' in environment production",
0.000 | 54995: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported mysql::bindings::python from mysql/bindings/python into production",
0.141 | 54996: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: importing '/etc/puppet/modules/pacemaker/manifests/resource/systemd.pp' in environment production",
0.208 | 54997: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Automatically imported pacemaker::resource::systemd from pacemaker/resource/systemd into production",
0.000 | 54998: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Resource package[ceph-common] was not determined to be defined",

0.000 | 55135: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Service[pacemaker] to Exec[wait-for-settle] with 'before'",
0.000 | 55136: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from File[etc-pacemaker] to File[etc-pacemaker-authkey] with 'before'",
0.000 | 55137: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from File[etc-pacemaker-authkey] to Exec[Create Cluster tripleo_cluster] with 'before'",
0.310 | 55138: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Exec[wait-for-settle] to Pcmk_resource[openstack-cinder-backup] with 'before'",
0.244 | 55139: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Exec[wait-for-settle] to Pcmk_resource[openstack-cinder-volume] with 'before'",
0.000 | 55140: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: Adding relationship from Exec[wait-for-settle] to Pcmk_property[property--stonith-enabled] with 'before'",

0.000 | 56014: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[Start Cluster tripleo_cluster]/before: subscribes to Service[pacemaker]",
0.000 | 56015: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Pacemaker::Corosync/File[etc-pacemaker]/before: subscribes to File[etc-pacemaker-authkey]",
0.000 | 56016: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Pacemaker::Corosync/File[etc-pacemaker-authkey]/before: subscribes to Exec[Create Cluster tripleo_cluster]",
0.294 | 56017: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/before: subscribes to Pcmk_resource[openstack-cinder-backup]",
0.208 | 56018: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/before: subscribes to Pcmk_resource[openstack-cinder-volume]",
0.000 | 56019: Nov 08 21:09:39 centos-7-rax-iad-0000787869 os-collect-config[5169]: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/before: subscribes to Pcmk_property[property--stonith-enabled]",

0.000 | 57816: Nov 08 21:13:36 centos-7-rax-iad-0000787869 dnsmasq-dhcp[146664]: read /var/lib/neutron/dhcp/66144654-fd8c-4d37-9d13-bcc5f153c16d/host
0.000 | 57817: Nov 08 21:13:36 centos-7-rax-iad-0000787869 dnsmasq-dhcp[146664]: read /var/lib/neutron/dhcp/66144654-fd8c-4d37-9d13-bcc5f153c16d/opts
0.000 | 57818: Nov 08 21:13:36 centos-7-rax-iad-0000787869 sudo[146808]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.568 | 57819: Nov 08 21:13:36 centos-7-rax-iad-0000787869 pacemaker_remoted[46554]: warning: new_event_notification (13-32500-17): Broken pipe (32)
0.697 | 57820: Nov 08 21:13:36 centos-7-rax-iad-0000787869 pacemaker_remoted[46554]: warning: Notification of client proxy-cib_rw-32500-23c89608/23c89608-4ae0-4282-889e-ee895be4abc5 failed
0.000 | 57821: Nov 08 21:13:36 centos-7-rax-iad-0000787869 rabbitmq-cluster(rabbitmq)[146864]: DEBUG: rabbitmq monitor : 0

0.000 | 58325: Nov 08 21:13:49 centos-7-rax-iad-0000787869 dockerd-current[11546]: pdict['user'] = self.user
0.000 | 58326: Nov 08 21:13:49 centos-7-rax-iad-0000787869 dockerd-current[11546]: /usr/lib/python2.7/site-packages/glance/context.py:49: DeprecationWarning: Property 'tenant' has moved to 'project_id' in version '2.6' and will be removed in version '3.0'
0.000 | 58327: Nov 08 21:13:49 centos-7-rax-iad-0000787869 dockerd-current[11546]: pdict['tenant'] = self.tenant
0.318 | 58328: Nov 08 21:13:49 centos-7-rax-iad-0000787869 sudo[148202]: cinder : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/bin/cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmpP_BqpGhostgroup@tripleo_ceph
0.000 | 58329: Nov 08 21:13:49 centos-7-rax-iad-0000787869 sudo[148207]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00

0.000 | 58404: Nov 08 21:13:52 centos-7-rax-iad-0000787869 haproxy[61795]: Connect from 192.168.24.14:49938 to 192.168.24.14:5000 (keystone_public/HTTP)
0.000 | 58405: Nov 08 21:13:52 centos-7-rax-iad-0000787869 rabbitmq-cluster(rabbitmq)[148467]: DEBUG: rabbitmq monitor : 0
0.000 | 58406: Nov 08 21:13:52 centos-7-rax-iad-0000787869 sudo[148465]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.265 | 58407: Nov 08 21:13:53 centos-7-rax-iad-0000787869 sudo[148473]: cinder : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/bin/cinder-rootwrap /etc/cinder/rootwrap.conf qemu-img convert -O raw -f qcow2 /var/lib/cinder/conversion/tmpP_BqpGhostgroup@tripleo_ceph /var/lib/cinder/conversion/tmpbDkVuU
0.000 | 58408: Nov 08 21:13:53 centos-7-rax-iad-0000787869 haproxy[61795]: Connect from 192.168.24.14:54920 to 192.168.24.14:35357 (keystone_admin/HTTP)

logs/subnode-2/var/log/messages.txt.gz
0.000 | 5281: Nov 8 20:37:13 localhost os-collect-config: changed: [localhost] => (item={'key': u'step_1', 'value': {u'mysql_image_tag': {u'start_order': 2, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-mariadb:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'mysql_data_ownership': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'chown', u'-R', u'mysql:', u'/var/lib/mysql'], u'user': u'root', u'volumes': [u'/var/lib/mysql:/var/lib/mysql'], u'net': u'host', u'detach': False}, u'memcached_init_logs': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-memcached:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'source /etc/sysconfig/memcached; touch /var/log/memcached.log && chown ${USER} /var/log/memcached.log'], u'user': u'root', u'volumes': [u'/var/lib/config-data/memcached/etc/sysconfig/memcached:/etc/sysconfig/memcached:ro', u'/var/log/containers/memcached:/var/log/'], u'detach': False, u'privileged': False}, u'redis_image_tag': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-redis:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-redis:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' 
'192.168.24.1:8787/tripleomaster/centos-binary-redis:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro',
0.000 | 5282: Nov 8 20:37:13 localhost os-collect-config: u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'mysql_bootstrap': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS', u'KOLLA_BOOTSTRAP=True', u'KOLLA_KUBERNETES=True', u'DB_MAX_TIMEOUT=60', u'DB_CLUSTERCHECK_PASSWORD=AfXTmCmAbpj8qFgpMZFdGhZym', u'DB_ROOT_PASSWORD=HDARRY0WrF'], u'command': [u'bash', u'-ecx', u'if [ -e /var/lib/mysql/mysql ]; then exit 0; fi
0.000 | 5282: echo -e "\
0.000 | 5282: [mysqld]\
0.000 | 5282: wsrep_provider=none" >> /etc/my.cnf
0.000 | 5282: sudo -u mysql -E kolla_start
0.000 | 5282: mysqld_safe --skip-networking --wsrep-on=OFF &
0.000 | 5282: timeout ${DB_MAX_TIMEOUT} /bin/bash -c \'until mysqladmin -uroot -p"${DB_ROOT_PASSWORD}" ping 2>/dev/null; do sleep 1; done\'
0.000 | 5282: mysql -uroot -p"${DB_ROOT_PASSWORD}" -e "CREATE USER \'clustercheck\'@\'localhost\' IDENTIFIED BY \'${DB_CLUSTERCHECK_PASSWORD}\';"
0.000 | 5282: mysql -uroot -p"${DB_ROOT_PASSWORD}" -e "GRANT PROCESS ON *.* TO \'clustercheck\'@\'localhost\' WITH GRANT OPTION;"
0.000 | 5282: timeout ${DB_MAX_TIMEOUT} mysqladmin -uroot -p"${DB_ROOT_PASSWORD}" shutdown'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/lib/kolla/config_files/mysql.json:/var/lib/kolla/config_files/config.json', u'/var/lib/config-data/puppet-generated/mysql/:/var/lib/kolla/config_files/src:ro', u'/var/lib/mysql:/var/lib/mysql'], u'net': u'host', u'detach': False}, u'haproxy_image_tag': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos
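The `mysql_bootstrap` container command captured in the log lines above is a single escaped `bash -ecx` string. Decoded for readability, it is roughly the following script — a reconstruction from this log, not the verified TripleO template, with the literal passwords left as the `DB_*` environment variables the container config injects:

```shell
#!/bin/bash -ecx
# Reconstructed from the os-collect-config log above (readability sketch).

# Skip bootstrap if the mysql system database already exists.
if [ -e /var/lib/mysql/mysql ]; then exit 0; fi

# Disable Galera replication for the single-node bootstrap.
echo -e "\n[mysqld]\nwsrep_provider=none" >> /etc/my.cnf

# Let Kolla initialize the datadir, then start mysqld without networking.
sudo -u mysql -E kolla_start
mysqld_safe --skip-networking --wsrep-on=OFF &

# Wait (up to DB_MAX_TIMEOUT seconds) for the server to answer pings.
timeout "${DB_MAX_TIMEOUT}" /bin/bash -c \
  'until mysqladmin -uroot -p"${DB_ROOT_PASSWORD}" ping 2>/dev/null; do sleep 1; done'

# Create the clustercheck user used by the HAProxy health check.
mysql -uroot -p"${DB_ROOT_PASSWORD}" -e \
  "CREATE USER 'clustercheck'@'localhost' IDENTIFIED BY '${DB_CLUSTERCHECK_PASSWORD}';"
mysql -uroot -p"${DB_ROOT_PASSWORD}" -e \
  "GRANT PROCESS ON *.* TO 'clustercheck'@'localhost' WITH GRANT OPTION;"

# Shut the bootstrap instance back down; Pacemaker starts the real cluster later.
timeout "${DB_MAX_TIMEOUT}" mysqladmin -uroot -p"${DB_ROOT_PASSWORD}" shutdown
```

The early `exit 0` makes the step idempotent across redeploys, which is why it only ever runs once per node.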
0.000 | 5283: Nov 8 20:37:13 localhost os-collect-config: -binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'rabbitmq_image_tag': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u"/usr/bin/docker tag '192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' '192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:pcmklatest'"], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/dev/shm:/dev/shm:rw', u'/etc/sysconfig/docker:/etc/sysconfig/docker:ro', u'/usr/bin:/usr/bin:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw'], u'net': u'host', u'detach': False}, u'rabbitmq_bootstrap': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS', u'KOLLA_BOOTSTRAP=True', u'RABBITMQ_CLUSTER_COOKIE=6X4AV5uqrpsuRbEp23FL'], u'volumes': [u'/var/lib/kolla/config_files/rabbitmq.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/rabbitmq/:/var/lib/kolla/config_files/src:ro', u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/var/lib/rabbitmq:/var/lib/rabbitmq'], u'net': u'host', u'privileged': False}, u'memcached': {u'start_order': 1, u'image': 
u'192.168.24.1:8787/tripleomaster/centos-binary-memcached:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'
0.356 | 5284: Nov 8 20:37:13 localhost os-collect-config: command': [u'/bin/bash', u'-c', u'source /etc/sysconfig/memcached; /usr/bin/memcached -p ${PORT} -u ${USER} -m ${CACHESIZE} -c ${MAXCONN} $OPTIONS >> /var/log/memcached.log 2>&1'], u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/lib/config-data/memcached/etc/sysconfig/memcached:/etc/sysconfig/memcached:ro', u'/var/log/containers/memcached:/var/log/'], u'net': u'host', u'privileged': False, u'restart': u'always'}}})
0.000 | 5285: Nov 8 20:37:13 localhost os-collect-config: changed: [localhost] => (item={'key': u'step_3', 'value': {u'nova_placement': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-placement-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-placement:/var/log/httpd', u'/var/lib/kolla/config_files/nova_placement.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/nova_placement/:/var/lib/kolla/config_files/src:ro', u'', u''], u'net': u'host', u'restart': u'always'}, u'nova_db_sync': {u'start_order': 3, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': u"/usr/bin/bootstrap_host_exec nova_api su nova -s /bin/bash -c '/usr/bin/nova-manage db sync'", u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', 
u'/var/log/containers/httpd/nova-api:/var/log/httpd', u'/var/lib/config-data/nova/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-

0.000 | 5293: Nov 8 20:37:13 localhost os-collect-config: /containers/cinder:/var/log/cinder', u'/var/log/containers/httpd/cinder-api:/var/log/httpd'], u'net': u'host', u'detach': False, u'privileged': False}, u'nova_api_map_cell0': {u'start_order': 1, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': u"/usr/bin/bootstrap_host_exec nova_api su nova -s /bin/bash -c '/usr/bin/nova-manage cell_v2 map_cell0'", u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd', u'/var/lib/config-data/nova/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-data/nova/etc/nova/:/etc/nova/:ro'], u'net': u'host', u'detach': False}, u'glance_api_db_sync': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-glance-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_BOOTSTRAP=True', u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'command': u"/usr/bin/bootstrap_host_exec glance_api su glance -s /bin/bash -c '/usr/local/bin/kolla_start'", u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/glance:/var/log/glance', u
0.000 | 5294: Nov 8 20:37:13 localhost os-collect-config: '/var/log/containers/httpd/glance-api:/var/log/httpd', u'/var/lib/kolla/config_files/glance_api.json:/var/lib/kolla/config_files/config.json', u'/var/lib/config-data/puppet-generated/glance_api/:/var/lib/kolla/config_files/src:ro', u'/etc/ceph:/var/lib/kolla/config_files/src-ceph:ro', u''], u'net': u'host', u'detach': False, u'privileged': False}, u'nova_api_create_default_cell': {u'start_order': 2, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': u"/usr/bin/bootstrap_host_exec nova_api su nova -s /bin/bash -c '/usr/bin/nova-manage cell_v2 create_cell --name=default'", u'exit_codes': [0, 2], u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd', u'/var/lib/config-data/nova/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-data/nova/etc/nova/:/etc/nova/:ro'], u'net': u'host', u'detach': False, u'user': u'root'}, u'neutron_db_sync': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/usr/bin/bootstrap_host_exec', u'neutron_api', u'neutron-db-manage', u'upgrade', u'heads'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls
0.000 | 5295: Nov 8 20:37:13 localhost os-collect-config: /cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/log/containers/neutron:/var/log/neutron', u'/var/log/containers/httpd/neutron-api:/var/log/httpd', u'/var/lib/config-data/neutron/etc/my.cnf.d/tripleo.cnf:/etc/my.cnf.d/tripleo.cnf:ro', u'/var/lib/config-data/neutron/etc/neutron:/etc/neutron:ro', u'/var/lib/config-data/neutron/usr/share/neutron:/usr/share/neutron:ro'], u'net': u'host', u'detach': False, u'privileged': False}, u'nova_virtlogd': {u'start_order': 0, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-libvirt:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'pid': u'host', u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'volumes': [u'/etc/hosts:/etc/hosts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/nova_libvirt/:/var/lib/kolla/config_files/src:ro', u'/lib/modules:/lib/modules:ro', u'/dev:/dev', u'/run:/run', u'/sys/fs/cgroup:/sys/fs/cgroup', u'/var/lib/nova:/var/lib/nova:shared', u'/var/run/libvirt:/var/run/libvirt', u'/var/lib/libvirt:/var/lib/libvirt', u'/etc/libvirt/qemu:/etc/libvirt/qemu:ro', u'/var/log/libvirt/qemu:/var/log/libvirt/qemu'], u'net': u'host', u'privileged': True, u'restart': u'always'}, u'sensu_client': {u'healthcheck': {u'test': u'/openstack/healthcheck'}, u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-sensu-client:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', 
u'environment': [u'KOLLA_CONFIG_STRATEGY=COPY_ALWAYS'], u'user': u'root', u'volumes': [u'/etc/hosts:/etc/ho
0.265 | 5296: Nov 8 20:37:13 localhost os-collect-config: sts:ro', u'/etc/localtime:/etc/localtime:ro', u'/etc/puppet:/etc/puppet:ro', u'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', u'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', u'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', u'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', u'/dev/log:/dev/log', u'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', u'/var/run/docker.sock:/var/run/docker.sock:rw', u'/var/lib/kolla/config_files/sensu-client.json:/var/lib/kolla/config_files/config.json:ro', u'/var/lib/config-data/puppet-generated/sensu/:/var/lib/kolla/config_files/src:ro', u'/var/log/containers/sensu:/var/log/sensu:rw'], u'net': u'host', u'privileged': True, u'restart': u'always'}, u'keystone_bootstrap': {u'action': u'exec', u'start_order': 3, u'command': [u'keystone', u'/usr/bin/bootstrap_host_exec', u'keystone', u'keystone-manage', u'bootstrap', u'--bootstrap-password', u'fkUCscvWTfR2ydGbkvyzg8Tas'], u'user': u'root'}}})
0.000 | 5297: Nov 8 20:37:13 localhost os-collect-config: changed: [localhost] => (item={'key': u'step_2', 'value': {u'gnocchi_init_log': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R gnocchi:gnocchi /var/log/gnocchi'], u'user': u'root', u'volumes': [u'/var/log/containers/gnocchi:/var/log/gnocchi', u'/var/log/containers/httpd/gnocchi-api:/var/log/httpd']}, u'cinder_scheduler_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-cinder-scheduler:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R cinder:cinder /var/log/cinder'], u'privileged': False, u'volumes': [u'/var/log/containers/cinder:/var/log/cinder'], u'user': u'root'}, u'neutron_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R neutron:neutron /var/log/neutron'], u'privileged': False, u'volumes': [u'/var/log/containers/neutron:/var/log/neutron', u'/var/log/containers/httpd/neutron-api:/var/log/httpd'], u'user': u'root'}, u'nova_api_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-nova-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R nova:nova /var/log/nova'], u'privileged': False, u'volumes': [u'/var/log/containers/nova:/var/log/nova', u'/var/log/containers/httpd/nova-api:/var/log/httpd'], u'user': u'root'}, u'congress_init_logs': {u'image': u'192.168.24.1:8787/tripleomaster/centos-binary-congress-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'command': [u'/bin/bash', u'-c', u'chown -R congress:congress /var/log/congress'], u'privileged': False, u'volumes': [u'/var/log/containers/congress:/var/log/congress'], u'user': u'root'}, u'clustercheck': {u'start_order': 1, u'image': 
u'192.168.24.1:8787/tripleomaster/centos-binary-mariadb:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d', u'environment': [u'KOLLA_CONFIG_STRAT

0.002 | 12566: Nov 8 20:44:28 localhost os-collect-config: g: hiera(): Looking up ntp::service_name in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::service_provider in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::stepout in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tinker in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tos in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tos_minclock in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tos_minsane in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tos_floor in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tos_ceiling in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::tos_cohort in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::udlc in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::udlc_stratum in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::ntpsigndsocket in JSON backend\",
0.002 | 12566: \"Debug: hiera(): Looking up ntp::authprov in JSON backend\",
0.002 | 12566: \"Debug: importing '/etc/puppet/modules/ntp/manifests/install.pp' in environment production\",
0.002 | 12566: \"Debug: Automatically imported ntp::install from ntp/install into production\",
0.002 | 12566: \"Debug: importing '/etc/puppet/modules/ntp/manifests/config.pp' in environment production\",
0.002 | 12566: \"Debug: Automatically imported ntp::config from ntp/config into production\",
0.002 | 12566: \"Debug: Scope(Class[Ntp::Config]): Retrieving template ntp/ntp.conf.erb\",
0.002 | 12566: \"Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.erb]: Bound template variables for /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds\",
0.002 | 12566: \"Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.erb]: Interpolated template /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds\",
0.002 | 12566: \"Debug: importing '/etc/puppet/modules/ntp/manifests/service.pp' in environment production\",
0.002 | 12566: \"Debug: Automatically imported ntp::service from ntp/service into production\",
0.002 | 12566: \"Debug: import
0.001 | 12567: Nov 8 20:44:28 localhost os-collect-config: ing '/etc/puppet/modules/tripleo/manifests/profile/base/snmp.pp' in environment production\",
0.001 | 12567: \"Debug: Automatically imported tripleo::profile::base::snmp from tripleo/profile/base/snmp into production\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_config in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_password in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_user in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::step in JSON backend\",
0.001 | 12567: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/sshd.pp' in environment production\",
0.001 | 12567: \"Debug: Automatically imported tripleo::profile::base::sshd from tripleo/profile/base/sshd into production\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::bannertext in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::motd in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::options in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::port in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up ssh:server::options in JSON backend\",
0.001 | 12567: \"Debug: importing '/etc/puppet/modules/ssh/manifests/init.pp' in environment production\",
0.001 | 12567: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server.pp' in environment production\",
0.001 | 12567: \"Debug: Automatically imported ssh::server from ssh/server into production\",
0.001 | 12567: \"Debug: importing '/etc/puppet/modules/ssh/manifests/params.pp' in environment production\",
0.001 | 12567: \"Debug: Automatically imported ssh::params from ssh/params into production\",
0.001 | 12567: \"Debug: hiera(): Looking up ssh::server::ensure in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up ssh::server::validate_sshd_file in JSON backend\",
0.001 | 12567: \"Debug: hiera(): Looking up ssh::server::use_augeas in JSON backend\",
0.005 | 12568: Nov 8 20:44:28 localhost os-collect-config:
0.005 | 12568: \"Debug: hiera(): Looking up ssh::server::options_absent in JSON backend\",
0.005 | 12568: \"Debug: hiera(): Looking up ssh::server::match_block in JSON backend\",
0.005 | 12568: \"Debug: hiera(): Looking up ssh::server::use_issue_net in JSON backend\",
0.005 | 12568: \"Debug: hiera(): Looking up ssh::server::options in JSON backend\",
0.005 | 12568: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/install.pp' in environment production\",
0.005 | 12568: \"Debug: Automatically imported ssh::server::install from ssh/server/install into production\",
0.005 | 12568: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/config.pp' in environment production\",
0.005 | 12568: \"Debug: Automatically imported ssh::server::config from ssh/server/config into production\",
0.005 | 12568: \"Debug: importing '/etc/puppet/modules/concat/manifests/init.pp' in environment production\",
0.005 | 12568: \"Debug: importing '/etc/puppet/modules/stdlib/manifests/init.pp' in environment production\",
0.005 | 12568: \"Debug: Automatically imported concat from concat into production\",
0.005 | 12568: \"Debug: Scope(Class[Ssh::Server::Config]): Retrieving template ssh/sshd_config.erb\",
0.005 | 12568: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Bound template variables for /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.005 | 12568: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Interpolated template /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.005 | 12568: \"Debug: importing '/etc/puppet/modules/concat/manifests/fragment.pp' in environment production\",
0.005 | 12568: \"Debug: Automatically imported concat::fragment from concat/fragment into production\",
0.005 | 12568: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/service.pp' in environment production\",
0.005 | 12568: \"Debug: Automatically imported ssh::server::service from ssh/server/service into production\",
0.005 | 12568: \"Debug: hiera(): Looking up ssh::server::service::ensure in JSON backend\",
0.005 | 12568: \"Debug: hiera(): Looki
0.220 | 12569: Nov 8 20:44:28 localhost os-collect-config: ng up ssh::server::service::enable in JSON backend\",
0.220 | 12569: \"Debug: importing '/etc/puppet/modules/timezone/manifests/init.pp' in environment production\",
0.220 | 12569: \"Debug: Automatically imported timezone from timezone into production\",
0.220 | 12569: \"Debug: importing '/etc/puppet/modules/timezone/manifests/params.pp' in environment production\",
0.220 | 12569: \"Debug: Automatically imported timezone::params from timezone/params into production\",
0.220 | 12569: \"Debug: hiera(): Looking up timezone::ensure in JSON backend\",
0.220 | 12569: \"Debug: hiera(): Looking up timezone::timezone in JSON backend\",
0.220 | 12569: \"Debug: hiera(): Looking up timezone::hwutc in JSON backend\",
0.220 | 12569: \"Debug: hiera(): Looking up timezone::autoupgrade in JSON backend\",
0.220 | 12569: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup/ceph.pp' in environment production\",
0.220 | 12569: \"Debug: Automatically imported tripleo::profile::base::cinder::backup::ceph from tripleo/profile/base/cinder/backup/ceph into production\",
0.220 | 12569: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::ceph::step in JSON backend\",
0.220 | 12569: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup.pp' in environment production\",
0.220 | 12569: \"Debug: Automatically imported tripleo::profile::base::cinder::backup from tripleo/profile/base/cinder/backup into production\",
0.220 | 12569: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::step in JSON backend\",
0.220 | 12569: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder.pp' in environment production\",
0.220 | 12569: \"Debug: Automatically imported tripleo::profile::base::cinder from tripleo/profile/base/cinder into production\",
0.220 | 12569: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::bootstrap_node in JSON backend\",
0.220 | 12569: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::cinder_enable_db_purge in JSON backend\",
0.220 | 12569: \"Debug: hiera(): Looking up tripleo::profile::base
0.074 | 12570: Nov 8 20:44:28 localhost os-collect-config: ::cinder::step in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_proto in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_hosts in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_password in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_port in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_username in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_proto in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_hosts in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_password in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_port in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_username in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_use_ssl in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up bootstrap_nodeid in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up messaging_rpc_service_name in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up rabbitmq_node_names in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up cinder::rabbit_password in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up cinder::rabbit_port in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up cinder::rabbit_userid in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up messaging_notify_service_name in JSON backend\",
0.074 | 12570: \"Debug: hiera(): Looking up cinder::rabbit_use_ssl in JSON backend\",
0.074 | 12570: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production\",
0.074 | 12570: \"Debug: Auto

0.000 | 12715: Nov 8 20:44:28 localhost os-collect-config: ss\",
0.000 | 12715: \"2017-11-08 20:42:53,913 INFO: 15363 -- Finished processing puppet configs for congress\",
0.000 | 12715: \"2017-11-08 20:42:53,914 INFO: 15363 -- Starting configuration of ceilometer using image 192.168.24.1:8787/tripleomaster/centos-binary-ceilometer-central:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12715: \"2017-11-08 20:42:53,914 INFO: 15363 -- Removing container: docker-puppet-ceilometer\",
0.000 | 12715: \"2017-11-08 20:42:53,944 INFO: 15363 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-ceilometer-central:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12715: \"2017-11-08 20:42:56,462 INFO: 15361 -- Removing container: docker-puppet-heat_api_cfn\",
0.000 | 12715: \"2017-11-08 20:42:56,520 INFO: 15361 -- Finished processing puppet configs for heat_api_cfn\",
0.000 | 12715: \"2017-11-08 20:42:56,521 INFO: 15361 -- Starting configuration of haproxy using image 192.168.24.1:8787/tripleomaster/centos-binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12715: \"2017-11-08 20:42:56,521 INFO: 15361 -- Removing container: docker-puppet-haproxy\",
0.000 | 12715: \"2017-11-08 20:42:56,554 INFO: 15361 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-haproxy:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12715: \"2017-11-08 20:42:57,738 INFO: 15362 -- Removing container: docker-puppet-heat_api\",
0.000 | 12715: \"2017-11-08 20:42:57,793 INFO: 15362 -- Finished processing puppet configs for heat_api\",
0.000 | 12715: \"2017-11-08 20:42:57,793 INFO: 15362 -- Starting configuration of neutron using image 192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12715: \"2017-11-08 20:42:57,794 INFO: 15362 -- Removing container: docker-puppet-neutron\",
0.000 | 12715: \"2017-11-08 20:42:57,821 INFO: 15362 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-neutron-server:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12715: \"2017-11-08 20:43:15,436 INFO: 15363 -- Removing container: do
0.000 | 12716: Nov 8 20:44:28 localhost os-collect-config: cker-puppet-ceilometer\",
0.000 | 12716: \"2017-11-08 20:43:15,477 INFO: 15363 -- Finished processing puppet configs for ceilometer\",
0.000 | 12716: \"2017-11-08 20:43:15,477 INFO: 15363 -- Starting configuration of rabbitmq using image 192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12716: \"2017-11-08 20:43:15,478 INFO: 15363 -- Removing container: docker-puppet-rabbitmq\",
0.000 | 12716: \"2017-11-08 20:43:15,507 INFO: 15363 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-rabbitmq:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12716: \"2017-11-08 20:43:20,586 INFO: 15361 -- Removing container: docker-puppet-haproxy\",
0.000 | 12716: \"2017-11-08 20:43:20,629 INFO: 15361 -- Finished processing puppet configs for haproxy\",
0.000 | 12716: \"2017-11-08 20:43:25,073 INFO: 15362 -- Removing container: docker-puppet-neutron\",
0.000 | 12716: \"2017-11-08 20:43:25,111 INFO: 15362 -- Finished processing puppet configs for neutron\",
0.000 | 12716: \"2017-11-08 20:43:25,111 INFO: 15362 -- Starting configuration of cinder using image 192.168.24.1:8787/tripleomaster/centos-binary-cinder-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12716: \"2017-11-08 20:43:25,112 INFO: 15362 -- Removing container: docker-puppet-cinder\",
0.000 | 12716: \"2017-11-08 20:43:25,138 INFO: 15362 -- Pulling image: 192.168.24.1:8787/tripleomaster/centos-binary-cinder-api:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d\",
0.000 | 12716: \"2017-11-08 20:43:44,298 INFO: 15363 -- Removing container: docker-puppet-rabbitmq\",
0.000 | 12716: \"2017-11-08 20:43:44,338 INFO: 15363 -- Finished processing puppet configs for rabbitmq\",
0.000 | 12716: \"2017-11-08 20:44:01,362 INFO: 15362 -- Removing container: docker-puppet-cinder\",
0.000 | 12716: \"2017-11-08 20:44:01,414 INFO: 15362 -- Finished processing puppet configs for cinder\"
0.000 | 12716: ],
0.000 | 12716: \"failed_when_result\": false
0.000 | 12716: }
0.000 | 12716:
0.000 | 12716: TASK [Check if /var/lib/hashed-tripleo-config/docker-container-startup-config-step_1.json exists] ***
0.000 | 12716: ok: [localhost]
0.000 | 12716:
0.000 | 12716: TASK
0.000 | 12717: Nov 8 20:44:28 localhost os-collect-config: [Start containers for step 1] *********************************************
0.000 | 12717: ok: [localhost]
0.000 | 12717:
0.000 | 12717: TASK [debug] *******************************************************************
0.000 | 12717: ok: [localhost] => {
0.000 | 12717: \"(outputs.stderr|default('')).split('\
0.000 | 12717: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 12717: \"stdout: \",
0.000 | 12717: \"stderr: \",
0.000 | 12717: \"stdout: 4f06c7de1a596d6b7a8450aa433baa20188c9b81a7813320fbb4c2225052e14d\",
0.000 | 12717: \"\",
0.000 | 12717: \"stdout: Installing MariaDB/MySQL system tables in '/var/lib/mysql' ...\",
0.000 | 12717: \"OK\",
0.000 | 12717: \"Filling help tables...\",
0.000 | 12717: \"Creating OpenGIS required SP-s...\",
0.000 | 12717: \"To start mysqld at boot time you have to copy\",
0.000 | 12717: \"support-files/mysql.server to the right place for your system\",
0.000 | 12717: \"PLEASE REMEMBER TO SET A PASSWORD FOR THE MariaDB root USER !\",
0.000 | 12717: \"To do so, start the server, then issue the following commands:\",
0.000 | 12717: \"'/usr/bin/mysqladmin' -u root password 'new-password'\",
0.000 | 12717: \"'/usr/bin/mysqladmin' -u root -h centos-7-rax-iad-0000787869 password 'new-password'\",
0.000 | 12717: \"Alternatively you can run:\",
0.000 | 12717: \"'/usr/bin/mysql_secure_installation'\",
0.000 | 12717: \"which will also give you the option of removing the test\",
0.000 | 12717: \"databases and anonymous user created by default. This is\",
0.000 | 12717: \"strongly recommended for production servers.\",
0.000 | 12717: \"See the MariaDB Knowledgebase at http://mariadb.com/kb or the\",
0.000 | 12717: \"MySQL manual for more instructions.\",
0.000 | 12717: \"You can start the MariaDB daemon with:\",
0.000 | 12717: \"cd '/usr' ; /usr/bin/mysqld_safe --datadir='/var/lib/mysql'\",
0.000 | 12717: \"You can test the MariaDB daemon with mysql-test-run.pl\",
0.000 | 12717: \"cd '/usr/mysql-test' ; perl mysql-test-run.pl\",
0.000 | 12717: \"Please report any problems at http://mariadb.org/jira\",
0.000 | 12717: \"The latest information about MariaDB is available at http://mariadb.org/.\",
0.000 | 12717: \"You can find additional information about the MySQL part at:\",
0.000 | 12717: \"http://dev.mysql.c
0.427 | 12718: Nov 8 20:44:28 localhost os-collect-config: om\",
0.427 | 12718: \"Consider joining MariaDB's strong and vibrant community:\",
0.427 | 12718: \"https://mariadb.org/get-involved/\",
0.427 | 12718: \"171108 20:44:18 mysqld_safe Logging to '/var/log/mariadb/mariadb.log'.\",
0.427 | 12718: \"171108 20:44:18 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql\",
0.427 | 12718: \"spawn mysql_secure_installation\\r\",
0.427 | 12718: \"\\r\",
0.427 | 12718: \"NOTE: RUNNING ALL PARTS OF THIS SCRIPT IS RECOMMENDED FOR ALL MariaDB\\r\",
0.427 | 12718: \" SERVERS IN PRODUCTION USE! PLEASE READ EACH STEP CAREFULLY!\\r\",
0.427 | 12718: \"In order to log into MariaDB to secure it, we'll need the current\\r\",
0.427 | 12718: \"password for the root user. If you've just installed MariaDB, and\\r\",
0.427 | 12718: \"you haven't set the root password yet, the password will be blank,\\r\",
0.427 | 12718: \"so you should just press enter here.\\r\",
0.427 | 12718: \"Enter current password for root (enter for none): \\r\",
0.427 | 12718: \"OK, successfully used password, moving on...\\r\",
0.427 | 12718: \"Setting the root password ensures that nobody can log into the MariaDB\\r\",
0.427 | 12718: \"root user without the proper authorisation.\\r\",
0.427 | 12718: \"Set root password? [Y/n] y\\r\",
0.427 | 12718: \"New password: \\r\",
0.427 | 12718: \"Re-enter new password: \\r\",
0.427 | 12718: \"Password updated successfully!\\r\",
0.427 | 12718: \"Reloading privilege tables..\\r\",
0.427 | 12718: \" ... Success!\\r\",
0.427 | 12718: \"By default, a MariaDB installation has an anonymous user, allowing anyone\\r\",
0.427 | 12718: \"to log into MariaDB without having to have a user account created for\\r\",
0.427 | 12718: \"them. This is intended only for testing, and to make the installation\\r\",
0.427 | 12718: \"go a bit smoother. You should remove them before moving into a\\r\",
0.427 | 12718: \"production environment.\\r\",
0.427 | 12718: \"Remove anonymous users? [Y/n] y\\r\",
0.427 | 12718: \"Normally, root should only be allowed to connect from 'localhost'. This\\r\",
0.427 | 12718: \"ensures that someone cannot guess at the root password from the network.\\r\",
0.427 | 12718: \"Disallow root l
0.471 | 12719: Nov 8 20:44:28 localhost os-collect-config: ogin remotely? [Y/n] n\\r\",
0.471 | 12719: \" ... skipping.\\r\",
0.471 | 12719: \"By default, MariaDB comes with a database named 'test' that anyone can\\r\",
0.471 | 12719: \"access. This is also intended only for testing, and should be removed\\r\",
0.471 | 12719: \"before moving into a production environment.\\r\",
0.471 | 12719: \"Remove test database and access to it? [Y/n] y\\r\",
0.471 | 12719: \" - Dropping test database...\\r\",
0.471 | 12719: \" - Removing privileges on test database...\\r\",
0.471 | 12719: \"Reloading the privilege tables will ensure that all changes made so far\\r\",
0.471 | 12719: \"will take effect immediately.\\r\",
0.471 | 12719: \"Reload privilege tables now? [Y/n] y\\r\",
0.471 | 12719: \"Cleaning up...\\r\",
0.471 | 12719: \"All done! If you've completed all of the above steps, your MariaDB\\r\",
0.471 | 12719: \"installation should now be secure.\\r\",
0.471 | 12719: \"Thanks for using MariaDB!\\r\",
0.471 | 12719: \"171108 20:44:21 mysqld_safe mysqld from pid file /var/lib/mysql/mariadb.pid ended\",
0.471 | 12719: \"171108 20:44:22 mysqld_safe Logging to '/var/log/mariadb/mariadb.log'.\",
0.471 | 12719: \"171108 20:44:22 mysqld_safe Starting mysqld daemon with databases from /var/lib/mysql\",
0.471 | 12719: \"mysqld is alive\",
0.471 | 12719: \"171108 20:44:25 mysqld_safe mysqld from pid file /var/lib/mysql/mariadb.pid ended\",
0.471 | 12719: \"stderr: + '[' -e /var/lib/mysql/mysql ']'\",
0.471 | 12719: \"+ echo -e '\\\
0.471 | 12719: [mysqld]\\\
0.471 | 12719: wsrep_provider=none'\",
0.471 | 12719: \"+ sudo -u mysql -E kolla_start\",
0.471 | 12719: \"+ sudo -E kolla_set_configs\",
0.471 | 12719: \"INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json\",
0.471 | 12719: \"INFO:__main__:Validating config file\",
0.471 | 12719: \"INFO:__main__:Kolla config strategy set to: COPY_ALWAYS\",
0.471 | 12719: \"INFO:__main__:Copying service configuration files\",
0.471 | 12719: \"INFO:__main__:Copying /dev/null to /etc/libqb/force-filesystem-sockets\",
0.471 | 12719: \"INFO:__main__:Setting permission for /etc/libqb/force-filesystem-sockets\",
0.471 | 12719: \"INFO:__main__:Deleting /etc/my.cnf.d/galera.cnf\",
0.471 | 12719: \"INFO:__main
0.183 | 12720: Nov 8 20:44:28 localhost os-collect-config: __:Copying /var/lib/kolla/config_files/src/etc/my.cnf.d/galera.cnf to /etc/my.cnf.d/galera.cnf\",
0.183 | 12720: \"INFO:__main__:Copying /var/lib/kolla/config_files/src/etc/sysconfig/clustercheck to /etc/sysconfig/clustercheck\",
0.183 | 12720: \"INFO:__main__:Copying /var/lib/kolla/config_files/src/root/.my.cnf to /root/.my.cnf\",
0.183 | 12720: \"INFO:__main__:Writing out command to execute\",
0.183 | 12720: \"++ cat /run_command\",
0.183 | 12720: \"+ CMD=/usr/sbin/pacemaker_remoted\",
0.183 | 12720: \"+ ARGS=\",
0.183 | 12720: \"+ [[ ! -n '' ]]\",
0.183 | 12720: \"+ . kolla_extend_start\",
0.183 | 12720: \"++ [[ ! -d /var/log/kolla/mariadb ]]\",
0.183 | 12720: \"++ mkdir -p /var/log/kolla/mariadb\",
0.183 | 12720: \"+++ stat -c %a /var/log/kolla/mariadb\",
0.183 | 12720: \"++ [[ 2755 != \\\\7\\\\5\\\\5 ]]\",
0.183 | 12720: \"++ chmod 755 /var/log/kolla/mariadb\",
0.183 | 12720: \"++ [[ -n 0 ]]\",
0.183 | 12720: \"++ mysql_install_db\",
0.183 | 12720: \"2017-11-08 20:44:05 140530918357184 [Warning] option 'open_files_limit': unsigned value 18446744073709551615 adjusted to 4294967295\",
0.183 | 12720: \"2017-11-08 20:44:05 140530918357184 [Note] /usr/libexec/mysqld (mysqld 10.1.20-MariaDB) starting as process 46 ...\",
0.183 | 12720: \"2017-11-08 20:44:10 140217886963904 [Warning] option 'open_files_limit': unsigned value 18446744073709551615 adjusted to 4294967295\",
0.183 | 12720: \"2017-11-08 20:44:10 140217886963904 [Note] /usr/libexec/mysqld (mysqld 10.1.20-MariaDB) starting as process 75 ...\",
0.183 | 12720: \"2017-11-08 20:44:14 140639425128640 [Warning] option 'open_files_limit': unsigned value 18446744073709551615 adjusted to 4294967295\",
0.183 | 12720: \"2017-11-08 20:44:14 140639425128640 [Note] /usr/libexec/mysqld (mysqld 10.1.20-MariaDB) starting as process 105 ...\",
0.183 | 12720: \"++ bootstrap_db\",
0.183 | 12720: \"++ TIMEOUT=60\",
0.183 | 12720: \"++ mysqld_safe --wsrep-new-cluster --skip-networking --wsrep-on=OFF --pid-file=/var/lib/mysql/mariadb.pid\",
0.183 | 12720: \"++ [[ ! -S /var/lib/mysql/mysql.sock ]]\",
0.183 | 12720: \"++ [[ ! -S /var/run/mysqld/mysqld.sock ]]\",
0.183 | 12720: \"++ [[ 60 -gt 0 ]]\",
0.183 | 12720: \"

0.062 | 13280: Nov 8 20:44:28 localhost os-collect-config: "Debug: hiera(): Looking up cinder_backup_short_bootstrap_node_name in JSON backend",
0.109 | 13281: Nov 8 20:44:28 localhost os-collect-config: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/volume.pp' in environment production",
0.132 | 13282: Nov 8 20:44:28 localhost os-collect-config: "Debug: Automatically imported tripleo::profile::pacemaker::cinder::volume from tripleo/profile/pacemaker/cinder/volume into production",
0.220 | 13283: Nov 8 20:44:28 localhost os-collect-config: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::volume::bootstrap_node in JSON backend",
0.125 | 13284: Nov 8 20:44:28 localhost os-collect-config: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::volume::step in JSON backend",

0.000 | 16127: Nov 8 20:48:30 localhost pengine[12541]: notice: Calculated transition 3, saving inputs in /var/lib/pacemaker/pengine/pe-input-3.bz2
0.000 | 16128: Nov 8 20:48:30 localhost crmd[12542]: notice: Transition 3 (Complete=0, Pending=0, Fired=0, Skipped=0, Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-input-3.bz2): Complete
0.000 | 16129: Nov 8 20:48:30 localhost crmd[12542]: notice: State transition S_TRANSITION_ENGINE -> S_IDLE
0.200 | 16130: Nov 8 20:48:30 localhost puppet-user[43039]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Property[cinder-backup-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-backup-role]/ensure) created
0.000 | 16131: Nov 8 20:48:33 localhost crmd[12542]: notice: State transition S_IDLE -> S_POLICY_ENGINE

0.027 | 20270: Nov 8 20:53:19 localhost os-collect-config: "Debug: try 1/20: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-43039-1p7igq3 property set --node centos-7-rax-iad-0000787869 cinder-backup-role=true",
0.000 | 20271: Nov 8 20:53:19 localhost os-collect-config: "Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-43039-1p7igq3 diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-43039-1p7igq3.orig returned 0 -> CIB updated",
0.029 | 20272: Nov 8 20:53:19 localhost os-collect-config: "Debug: property create: property set --node centos-7-rax-iad-0000787869 cinder-backup-role=true -> ",
0.206 | 20273: Nov 8 20:53:19 localhost os-collect-config: "Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Property[cinder-backup-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-backup-role]/ensure: created",
0.169 | 20274: Nov 8 20:53:19 localhost os-collect-config: "Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Property[cinder-backup-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-backup-role]: The container Pacemaker::Property[cinder-backup-role-node-property] will propagate my refresh event",
0.238 | 20275: Nov 8 20:53:19 localhost os-collect-config: "Debug: Pacemaker::Property[cinder-backup-role-node-property]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Backup] will propagate my refresh event",
0.248 | 20276: Nov 8 20:53:19 localhost os-collect-config: "Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Backup]: The container Stage[main] will propagate my refresh event",
0.000 | 20277: Nov 8 20:53:19 localhost os-collect-config: "Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-43039-13u66pr returned ",

0.028 | 20283: Nov 8 20:53:19 localhost os-collect-config: "Debug: property create: property set --node centos-7-rax-iad-0000787869 cinder-volume-role=true -> ",
0.189 | 20284: Nov 8 20:53:19 localhost os-collect-config: "Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Property[cinder-volume-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-volume-role]/ensure: created",
0.157 | 20285: Nov 8 20:53:19 localhost os-collect-config: "Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Property[cinder-volume-role-node-property]/Pcmk_property[property-centos-7-rax-iad-0000787869-cinder-volume-role]: The container Pacemaker::Property[cinder-volume-role-node-property] will propagate my refresh event",
0.221 | 20286: Nov 8 20:53:19 localhost os-collect-config: "Debug: Pacemaker::Property[cinder-volume-role-node-property]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Volume] will propagate my refresh event",
0.230 | 20287: Nov 8 20:53:19 localhost os-collect-config: "Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Volume]: The container Stage[main] will propagate my refresh event",
0.000 | 20288: Nov 8 20:53:19 localhost os-collect-config: "Debug: Prefetching iptables resources for firewall",

0.000 | 21062: Nov 8 20:54:36 centos-7-rax-iad-0000787869 pacemaker_remoted[13]: notice: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)
0.000 | 21063: Nov 8 20:54:36 centos-7-rax-iad-0000787869 yum[66409]: Installed: python2-pyasn1-modules-0.1.9-7.el7.noarch
0.000 | 21064: Nov 8 20:54:36 centos-7-rax-iad-0000787869 pacemaker_remoted[13]: notice: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)
0.202 | 21065: Nov 8 20:54:36 centos-7-rax-iad-0000787869 yum[66409]: Installed: python-kmod-0.9-4.el7.x86_64
0.000 | 21066: Nov 8 20:54:36 centos-7-rax-iad-0000787869 pacemaker_remoted[12]: notice: Watchdog may be enabled but stonith-watchdog-timeout is disabled: (null)

0.000 | 21133: Nov 8 20:54:42 centos-7-rax-iad-0000787869 systemd: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 21134: Nov 8 20:54:42 centos-7-rax-iad-0000787869 systemd: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 21135: Nov 8 20:54:42 centos-7-rax-iad-0000787869 yum[66409]: Installed: 1:openstack-cinder-12.0.0-0.20171107135501.fb27334.el7.centos.noarch
0.334 | 21136: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]/ensure) created
0.247 | 21137: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) Scheduling refresh of Anchor[cinder::install::end]
0.271 | 21138: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) Scheduling refresh of Anchor[cinder::service::end]
0.264 | 21139: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Package[cinder]) Scheduling refresh of Anchor[keystone::service::end]
0.008 | 21140: Nov 8 20:54:43 centos-7-rax-iad-0000787869 journal: Suppressed 12713 messages from /system.slice/os-collect-config.service
0.158 | 21141: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::install::end]) Triggered 'refresh' from 1 events
0.185 | 21142: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::install::end]) Scheduling refresh of Anchor[cinder::service::begin]
0.011 | 21143: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure) created
0.233 | 21144: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]) Scheduling refresh of Anchor[cinder::config::end]
0.011 | 21145: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure) created
0.231 | 21146: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]) Scheduling refresh of Anchor[cinder::config::end]
0.011 | 21147: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure) created
0.230 | 21148: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]) Scheduling refresh of Anchor[cinder::config::end]
0.014 | 21149: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/host]/ensure) created
0.286 | 21150: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/host]) Scheduling refresh of Anchor[cinder::config::end]
0.008 | 21151: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure) created
0.188 | 21152: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]) Scheduling refresh of Anchor[cinder::config::end]
0.009 | 21153: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure) created
0.203 | 21154: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]) Scheduling refresh of Anchor[cinder::config::end]
0.009 | 21155: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure) created
0.203 | 21156: Nov 8 20:54:43 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]) Scheduling refresh of Anchor[cinder::config::end]
0.000 | 21157: Nov 8 20:54:46 centos-7-rax-iad-0000787869 su: (to rabbitmq) root on none

0.008 | 21196: Nov 8 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]/ensure) created
0.183 | 21197: Nov 8 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]) Scheduling refresh of Anchor[cinder::config::end]
0.009 | 21198: Nov 8 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]/ensure) created
0.204 | 21199: Nov 8 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]) Scheduling refresh of Anchor[cinder::config::end]
0.008 | 21200: Nov 8 20:54:51 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]/ensure) created

0.006 | 21220: Nov 8 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]/ensure) created
0.147 | 21221: Nov 8 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]) Scheduling refresh of Anchor[cinder::config::end]
0.178 | 21222: Nov 8 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::config::end]) Triggered 'refresh' from 22 events
0.204 | 21223: Nov 8 20:54:52 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Cinder::Deps/Anchor[cinder::config::end]) Scheduling refresh of Anchor[cinder::service::begin]
0.009 | 21224: Nov 8 20:54:54 centos-7-rax-iad-0000787869 puppet-user[64795]: (/Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]/ensure) created

0.000 | 21580: Nov 8 20:55:35 centos-7-rax-iad-0000787869 dockerd-current: time="2017-11-08T20:55:35.561572212Z" level=info msg="{Action=create, LoginUID=4294967295, PID=71351}"
0.035 | 21581: Nov 8 20:55:35 centos-7-rax-iad-0000787869 journal: + CMD='/usr/sbin/collectd -f'
0.043 | 21582: Nov 8 20:55:35 centos-7-rax-iad-0000787869 journal: + ARGS=
0.353 | 21583: Nov 8 20:55:35 centos-7-rax-iad-0000787869 journal: + [[ ! -n '' ]]
0.043 | 21584: Nov 8 20:55:35 centos-7-rax-iad-0000787869 journal: + . kolla_extend_start

0.024 | 24517: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: \"Debug: hiera(): Looking up ntp::servers in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::service_enable in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::service_ensure in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::service_manage in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::service_name in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::service_provider in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::stepout in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tinker in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tos in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tos_minclock in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tos_minsane in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tos_floor in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tos_ceiling in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::tos_cohort in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::udlc in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::udlc_stratum in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::ntpsigndsocket in JSON backend\",
0.024 | 24517: \"Debug: hiera(): Looking up ntp::authprov in JSON backend\",
0.024 | 24517: \"Debug: importing '/etc/puppet/modules/ntp/manifests/install.pp' in environment production\",
0.024 | 24517: \"Debug: Automatically imported ntp::install from ntp/install into production\",
0.024 | 24517: \"Debug: importing '/etc/puppet/modules/ntp/manifests/config.pp' in environment production\",
0.024 | 24517: \"Debug: Automatically imported ntp::config from ntp/config into production\",
0.024 | 24517: \"Debug: Scope(Class[Ntp::Config]): Retrieving template ntp/ntp.conf.erb\",
0.024 | 24517: \"Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.erb]: Bound template variables for /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds\",
0.024 | 24517: \"Debug: template[/etc/puppet/modules/ntp/templates/ntp.conf.e
0.025 | 24518: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: rb]: Interpolated template /etc/puppet/modules/ntp/templates/ntp.conf.erb in 0.00 seconds\",
0.025 | 24518: \"Debug: importing '/etc/puppet/modules/ntp/manifests/service.pp' in environment production\",
0.025 | 24518: \"Debug: Automatically imported ntp::service from ntp/service into production\",
0.025 | 24518: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/snmp.pp' in environment production\",
0.025 | 24518: \"Debug: Automatically imported tripleo::profile::base::snmp from tripleo/profile/base/snmp into production\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_config in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_password in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::snmpd_user in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::snmp::step in JSON backend\",
0.025 | 24518: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/sshd.pp' in environment production\",
0.025 | 24518: \"Debug: Automatically imported tripleo::profile::base::sshd from tripleo/profile/base/sshd into production\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::bannertext in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::motd in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::options in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::port in JSON backend\",
0.025 | 24518: \"Debug: hiera(): Looking up ssh:server::options in JSON backend\",
0.025 | 24518: \"Debug: importing '/etc/puppet/modules/ssh/manifests/init.pp' in environment production\",
0.025 | 24518: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server.pp' in environment production\",
0.025 | 24518: \"Debug: Automatically imported ssh::server from ssh/server into production\",
0.025 | 24518: \"Debug: importing '/etc/puppet/modules/ssh/manifests/params.pp' in environment production\",
0.025 | 24518: \"Debug: Automa
0.019 | 24519: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: tically imported ssh::params from ssh/params into production\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::ensure in JSON backend\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::validate_sshd_file in JSON backend\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::use_augeas in JSON backend\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::options_absent in JSON backend\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::match_block in JSON backend\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::use_issue_net in JSON backend\",
0.019 | 24519: \"Debug: hiera(): Looking up ssh::server::options in JSON backend\",
0.019 | 24519: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/install.pp' in environment production\",
0.019 | 24519: \"Debug: Automatically imported ssh::server::install from ssh/server/install into production\",
0.019 | 24519: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/config.pp' in environment production\",
0.019 | 24519: \"Debug: Automatically imported ssh::server::config from ssh/server/config into production\",
0.019 | 24519: \"Debug: importing '/etc/puppet/modules/concat/manifests/init.pp' in environment production\",
0.019 | 24519: \"Debug: importing '/etc/puppet/modules/stdlib/manifests/init.pp' in environment production\",
0.019 | 24519: \"Debug: Automatically imported concat from concat into production\",
0.019 | 24519: \"Debug: Scope(Class[Ssh::Server::Config]): Retrieving template ssh/sshd_config.erb\",
0.019 | 24519: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Bound template variables for /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.019 | 24519: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Interpolated template /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.019 | 24519: \"Debug: importing '/etc/puppet/modules/concat/manifests/fragment.pp' in environment production\",
0.019 | 24519: \"Debug: Automatically imported concat::fragment from concat/fragment into production\",
0.019 | 24519: \"Debug: impor
0.202 | 24520: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: ting '/etc/puppet/modules/ssh/manifests/server/service.pp' in environment production\",
0.202 | 24520: \"Debug: Automatically imported ssh::server::service from ssh/server/service into production\",
0.202 | 24520: \"Debug: hiera(): Looking up ssh::server::service::ensure in JSON backend\",
0.202 | 24520: \"Debug: hiera(): Looking up ssh::server::service::enable in JSON backend\",
0.202 | 24520: \"Debug: importing '/etc/puppet/modules/timezone/manifests/init.pp' in environment production\",
0.202 | 24520: \"Debug: Automatically imported timezone from timezone into production\",
0.202 | 24520: \"Debug: importing '/etc/puppet/modules/timezone/manifests/params.pp' in environment production\",
0.202 | 24520: \"Debug: Automatically imported timezone::params from timezone/params into production\",
0.202 | 24520: \"Debug: hiera(): Looking up timezone::ensure in JSON backend\",
0.202 | 24520: \"Debug: hiera(): Looking up timezone::timezone in JSON backend\",
0.202 | 24520: \"Debug: hiera(): Looking up timezone::hwutc in JSON backend\",
0.202 | 24520: \"Debug: hiera(): Looking up timezone::autoupgrade in JSON backend\",
0.202 | 24520: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup/ceph.pp' in environment production\",
0.202 | 24520: \"Debug: Automatically imported tripleo::profile::base::cinder::backup::ceph from tripleo/profile/base/cinder/backup/ceph into production\",
0.202 | 24520: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::ceph::step in JSON backend\",
0.202 | 24520: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup.pp' in environment production\",
0.202 | 24520: \"Debug: Automatically imported tripleo::profile::base::cinder::backup from tripleo/profile/base/cinder/backup into production\",
0.202 | 24520: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::step in JSON backend\",
0.202 | 24520: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder.pp' in environment production\",
0.202 | 24520: \"Debug: Automatically imported tripleo::profile::base::cinder from tripleo/prof
0.062 | 24521: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: ile/base/cinder into production\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::bootstrap_node in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::cinder_enable_db_purge in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::step in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_proto in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_hosts in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_password in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_port in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_username in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_proto in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_hosts in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_password in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_port in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_username in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_use_ssl in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up bootstrap_nodeid in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up messaging_rpc_service_name in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up rabbitmq_node_names in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up cinder::rabbit_password in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up cinder::rabbit_port in JSON backend\",
0.062 | 24521: \"Debug: hiera(): Looking up cinder::rabbit_userid in JSON backend\",
0.062 | 24521: \

0.002 | 24694: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: and 'set' with params [\\\"/files/etc/sysconfig/docker/OPTIONS\\\", \\\"\\\\\\\"--log-driver=journald --signature-verification=false --iptables=false\\\\\\\"\\\"]\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-options](provider=augeas): Skipping because no files were changed\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-options](provider=augeas): Closed the augeas connection\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Augeas version 1.4.0 is installed\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Will attempt to save and only run if files changed\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): sending command 'set' with params [\\\"/files/etc/sysconfig/docker/INSECURE_REGISTRY\\\", \\\"\\\\\\\"--insecure-registry 192.168.24.1:8787\\\\\\\"\\\"]\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Skipping because no files were changed\",
0.002 | 24694: \"Debug: Augeas[docker-sysconfig-registry](provider=augeas): Closed the augeas connection\",
0.002 | 24694: \"Debug: Augeas[docker-daemon.json](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.002 | 24694: \"Debug: Augeas[docker-daemon.json](provider=augeas): Augeas version 1.4.0 is installed\",
0.002 | 24694: \"Debug: Augeas[docker-daemon.json](provider=augeas): Will attempt to save and only run if files changed\",
0.002 | 24694: \"Debug: Augeas[docker-daemon.json](provider=augeas): sending command 'rm' with params [\\\"/files/etc/docker/daemon.json/dict/entry[. = \\\\\\\"registry-mirrors\\\\\\\"]\\\"]\",
0.002 | 24694: \"Debug: Augeas[docker-daemon.json](provider=augeas): sending command 'set' with params [\\\"/files/etc/docker/daemon.json/dict/entry[. = \\\\\\\"debug\\\\\\\"]\\\", \\\"debug\\\"]\",
0.002 | 24694: \"Debug: Augeas[docker-daemon.json](provider=augeas): sending command 'set' with params [\\\"/files/etc/docker/daemon.json/dict/entry[. = \\\
0.007 | 24695: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: \\\\"debug\\\\\\\"]/const\\\", \\\"false\\\"]\",
0.007 | 24695: \"Debug: Augeas[docker-daemon.json](provider=augeas): Skipping because no files were changed\",
0.007 | 24695: \"Debug: Augeas[docker-daemon.json](provider=augeas): Closed the augeas connection\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Augeas version 1.4.0 is installed\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Will attempt to save and only run if files changed\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): sending command 'set' with params [\\\"/files/etc/sysconfig/docker-storage/DOCKER_STORAGE_OPTIONS\\\", \\\"\\\\\\\" -s overlay2\\\\\\\"\\\"]\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Skipping because no files were changed\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-storage](provider=augeas): Closed the augeas connection\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Augeas version 1.4.0 is installed\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Will attempt to save and only run if files changed\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): sending command 'rm' with params [\\\"/files/etc/sysconfig/docker-network/DOCKER_NETWORK_OPTIONS\\\"]\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Skipping because no files were changed\",
0.007 | 24695: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Closed the augeas connection\",
0.007 | 24695: \"Debug: Executing: '/usr/bin/systemctl is-active docker'\",
0.007 | 24695: \"Debug: Executing: '/usr/bin/systemctl is-enabled docker'\",
0.007 | 24695: \"Debug: Exec[directory-create-etc-my.cnf.d](provider=posix): Executing check 'test -d /etc/my.cnf.d'\",
0.007 | 24695: \"Debu
0.008 | 24696: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: g: Executing: 'test -d /etc/my.cnf.d'\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Augeas version 1.4.0 is installed\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Will attempt to save and only run if files changed\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'set' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/bind-address\\\", \\\"192.168.24.15\\\"]\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl\\\"]\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl-ca\\\"]\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Skipping because no files were changed\",
0.008 | 24696: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Closed the augeas connection\",
0.008 | 24696: \"Debug: Executing: '/usr/bin/systemctl is-active pcsd'\",
0.008 | 24696: \"Debug: Executing: '/usr/bin/systemctl is-enabled pcsd'\",
0.008 | 24696: \"Debug: Exec[Create Cluster tripleo_cluster](provider=posix): Executing check '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.008 | 24696: \"Debug: Executing: '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.008 | 24696: \"Debug: Exec[Start Cluster tripleo_cluster](provider=posix): Executing check '/sbin/pcs status >/dev/null 2>&1'\",
0.008 | 24696: \"Debug: Executing: '/sbin/pcs status >/dev/null 2>&1'\",
0.008 | 24696: \"Debug: Executing: '/usr/bin/systemctl is-enabled corosync'\",
0.008 | 24696: \"Debug: Executing: '/usr/bin/systemctl is-enabled pacemaker'\",
0.008 | 24696: \"Debug: Exec[wait-for-settle](provider=posix): Executing check '/sbin/pcs status | grep -q 'partition with quorum' > /dev/null 2>&1'\",
0.008 | 24696: \"Debug: Executing: '/sbin/pcs status | grep -q 'partition
0.250 | 24697: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: with quorum' > /dev/null 2>&1'\",
0.250 | 24697: \"Debug: defaults exists resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.250 | 24697: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1rl6m0z returned \",
0.250 | 24697: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1rl6m0z resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/systemctl is-active chronyd'\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/systemctl is-enabled chronyd'\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/systemctl is-active ntpd'\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/systemctl is-enabled ntpd'\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/rpm -q openstack-cinder --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.250 | 24697: '\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/rpm -q openstack-cinder --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.250 | 24697: --whatprovides'\",
0.250 | 24697: \"Debug: Package[cinder](provider=yum): Ensuring => present\",
0.250 | 24697: \"Debug: Executing: '/usr/bin/yum -d 0 -e 0 -y install openstack-cinder'\",
0.250 | 24697: \"Notice: /Stage[main]/Cinder/Package[cinder]/ensure: created\",
0.250 | 24697: \"Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[cinder::install::end]\",
0.250 | 24697: \"Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[cinder::service::end]\",
0.250 | 24697: \"Info: /Stage[main]/Cinder/Package[cinder]: Scheduling refresh of Anchor[keystone::service::end]\",
0.250 | 24697: \"Debug: /Stage[main]/Cinder/Package[cinder]: The container Class[Cinder] will propagate my refresh event\",
0.250 | 24697: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::install::end]: Triggered 'refresh' from 1 events\",
0.250 | 24697: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::install::end]: Scheduling refresh of Anchor[cinder::service::begin]\",
0.250 | 24697: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::install:
0.272 | 24698: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: :end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.272 | 24698: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]/ensure: created\",
0.272 | 24698: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: Scheduling refresh of Anchor[cinder::config::end]\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/api_paste_config]: The container Class[Cinder] will propagate my refresh event\",
0.272 | 24698: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]/ensure: created\",
0.272 | 24698: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: Scheduling refresh of Anchor[cinder::config::end]\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/storage_availability_zone]: The container Class[Cinder] will propagate my refresh event\",
0.272 | 24698: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]/ensure: created\",
0.272 | 24698: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: Scheduling refresh of Anchor[cinder::config::end]\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/default_availability_zone]: The container Class[Cinder] will propagate my refresh event\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]: Nothing to manage: no ensure and the resource doesn't exist\",
0.272 | 24698: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.272 | 24698: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/host]/ensure: created\",
0.272 | 24698: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/host]: Scheduling refresh of Anchor[cinder::config::end]\
0.271 | 24699: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: ",
0.271 | 24699: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/host]: The container Class[Cinder] will propagate my refresh event\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.271 | 24699: \"Notice: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]/ensure: created\",
0.271 | 24699: \"Info: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: Scheduling refresh of Anchor[cinder::config::end]\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/enable_v3_api]: The container Class[Cinder] will propagate my refresh event\",
0.271 | 24699: \"Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]/ensure: created\",
0.271 | 24699: \"Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: Scheduling refresh of Anchor[cinder::config::end]\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_servers]: The container Class[Cinder::Glance] will propagate my refresh event\",
0.271 | 24699: \"Notice: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]/ensure: created\",
0.271 | 24699: \"Info: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: Scheduling refresh of Anchor[cinder::config::end]\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_version]: The container Class[Cinder::Glance] will propagate my refresh event\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.271 | 24699: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_request_timeout]: Nothing to manage: no en
0.199 | 24700: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: sure and the resource doesn't exist\",
0.199 | 24700: \"Debug: Class[Cinder::Glance]: The container Stage[main] will propagate my refresh event\",
0.199 | 24700: \"Debug: Executing: '/usr/bin/rpm -q openstack-tacker --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.199 | 24700: '\",
0.199 | 24700: \"Debug: Executing: '/usr/bin/rpm -q openstack-tacker --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.199 | 24700: --whatprovides'\",
0.199 | 24700: \"Debug: Package[tacker-server](provider=yum): Ensuring => present\",
0.199 | 24700: \"Debug: Executing: '/usr/bin/yum -d 0 -e 0 -y install openstack-tacker'\",
0.199 | 24700: \"Notice: /Stage[main]/Tacker::Server/Package[tacker-server]/ensure: created\",
0.199 | 24700: \"Info: /Stage[main]/Tacker::Server/Package[tacker-server]: Scheduling refresh of Anchor[cinder::service::end]\",
0.199 | 24700: \"Info: /Stage[main]/Tacker::Server/Package[tacker-server]: Scheduling refresh of Anchor[tacker::install::end]\",
0.199 | 24700: \"Info: /Stage[main]/Tacker::Server/Package[tacker-server]: Scheduling refresh of Anchor[keystone::service::end]\",
0.199 | 24700: \"Debug: /Stage[main]/Tacker::Server/Package[tacker-server]: The container Class[Tacker::Server] will propagate my refresh event\",
0.199 | 24700: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: Triggered 'refresh' from 2 events\",
0.199 | 24700: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.199 | 24700: \"Notice: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: Triggered 'refresh' from 1 events\",
0.199 | 24700: \"Info: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: Scheduling refresh of Anchor[tacker::service::begin]\",
0.199 | 24700: \"Info: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: Scheduling refresh of Exec[tacker-db-sync]\",
0.199 | 24700: \"Debug: /Stage[main]/Tacker::Deps/Anchor[tacker::install::end]: The container Class[Tacker::Deps] will propagate my refresh event\",
0.199 | 24700: \"Notice: /Stag
0.029 | 24701: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: e[main]/Tacker::Server/Tacker_config[DEFAULT/bind_host]/ensure: created\",
0.029 | 24701: \"Info: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_host]: Scheduling refresh of Anchor[tacker::config::end]\",
0.029 | 24701: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_host]: The container Class[Tacker::Server] will propagate my refresh event\",
0.029 | 24701: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_port]: Nothing to manage: no ensure and the resource doesn't exist\",
0.029 | 24701: \"Debug: Executing: '/usr/bin/systemctl is-active firewalld'\",
0.029 | 24701: \"Debug: Executing: '/usr/bin/systemctl is-enabled firewalld'\",
0.029 | 24701: \"Debug: Executing: '/usr/bin/systemctl is-active iptables'\",
0.029 | 24701: \"Debug: Executing: '/usr/bin/systemctl is-enabled iptables'\",
0.029 | 24701: \"Debug: Executing: '/usr/bin/systemctl is-active ip6tables'\",
0.029 | 24701: \"Debug: Executing: '/usr/bin/systemctl is-enabled ip6tables'\",
0.029 | 24701: \"Debug: Exec[modprobe nf_conntrack](provider=posix): Executing check 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.029 | 24701: \"Debug: Executing: 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.029 | 24701: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing check 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.029 | 24701: \"Debug: Executing: 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.029 | 24701: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing 'modprobe nf_conntrack_proto_sctp'\",
0.029 | 24701: \"Debug: Executing: 'modprobe nf_conntrack_proto_sctp'\",
0.029 | 24701: \"Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/returns: executed successfully\",
0.029 | 24701: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]: The container Kmod::Load[nf_conntrack_proto_sctp] will propagate my refresh event\",
0.029 | 24701: \"Debug: Kmod::Load[nf_conntrack_proto_sctp]: The container Class[Tr
0.358 | 24702: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: ipleo::Profile::Base::Kernel] will propagate my refresh event\",
0.358 | 24702: \"Debug: Prefetching parsed resources for sysctl\",
0.358 | 24702: \"Debug: Prefetching sysctl_runtime resources for sysctl_runtime\",
0.358 | 24702: \"Debug: Executing: '/usr/sbin/sysctl -a'\",
0.358 | 24702: \"Debug: Class[Tripleo::Profile::Base::Kernel]: The container Stage[main] will propagate my refresh event\",
0.358 | 24702: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1w9knj2 returned \",
0.358 | 24702: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1w9knj2 property show | grep stonith-enabled | grep false > /dev/null 2>&1\",
0.358 | 24702: \"Debug: property exists: property show | grep stonith-enabled | grep false > /dev/null 2>&1 -> \",
0.358 | 24702: \"Debug: Executing: '/usr/bin/systemctl is-active sshd'\",
0.358 | 24702: \"Debug: Executing: '/usr/bin/systemctl is-enabled sshd'\",
0.358 | 24702: \"Debug: Executing: '/usr/bin/rpm -q MySQL-python --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.358 | 24702: '\",
0.358 | 24702: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.358 | 24702: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.358 | 24702: \"Notice: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]/ensure: created\",
0.358 | 24702: \"Info: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]: Scheduling refresh of Anchor[cinder::config::end]\",
0.358 | 24702: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]: The container Oslo::Db[cinder_config] will propagate my refresh event\",
0.358 | 24702: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exis
0.255 | 24703: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: t\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Notice: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]/ensure: created\",
0.255 | 24703: \"Info: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]: Scheduling refresh of Anchor[cinder::config::end]\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]: The container Oslo::Db[cinder_config] will propagate my refresh event\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.255 | 24703: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_co
0.247 | 24704: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: nfig]/Cinder_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 24704: \"Notice: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]/ensure: created\",
0.247 | 24704: \"Info: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]: Scheduling refresh of Anchor[cinder::config::end]\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]: The container Oslo::Db[cinder_config] will propagate my refresh event\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 24704: \"Debug: Oslo::Db[cinder_config]: The container Class[Cinder::Db] will propagate my refresh event\",
0.247 | 24704: \"Debug: Class[Cinder::Db]: The container Stage[main] will propagate my refresh event\",
0.247 | 24704: \"Notice: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]/ensure: created\",
0.247 | 24704: \"Info: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]: Scheduling refresh of Anchor[cinder::config::end]\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/debug]: The container Oslo::Log[cinder_config] will propagate my refresh event\",
0.247 | 24704: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]: N
0.261 | 24705: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: othing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Notice: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]/ensure: created\",
0.261 | 24705: \"Info: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[cinder::config::end]\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_dir]: The container Oslo::Log[cinder_config] will propagate my refresh event\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 24705: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_default_format_stri
0.226 | 24706: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: ng]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: Oslo::Log[cinder_config]: The container Class[Cinder::Logging] will propagate my refresh event\",
0.226 | 24706: \"Debug: Class[Cinder::Logging]: The container Stage[main] will propagate my refresh event\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.226 | 24706: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_r
0.230 | 24707: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: abbit/heartbeat_rate]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created\",
0.230 | 24707: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: Scheduling refresh of Anchor[cinder::config::end]\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_failover_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_missing_consumer_retry_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_interval_max]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_login_method]: Nothing to manage: no ensure and the resource doesn't exist\",
0.230 | 24707: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_password]/ensure: created\",
0.230 | 24707:
0.147 | 24708: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_password]: Scheduling refresh of Anchor[cinder::config::end]\",
0.147 | 24708: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_password]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.147 | 24708: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_backoff]: Nothing to manage: no ensure and the resource doesn't exist\",
0.147 | 24708: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.147 | 24708: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_transient_queues_ttl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.147 | 24708: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl]/ensure: created\",
0.147 | 24708: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl]: Scheduling refresh of Anchor[cinder::config::end]\",
0.147 | 24708: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.147 | 24708: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_userid]/ensure: created\",
0.147 | 24708: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: Scheduling refresh of Anchor[cinder::config::end]\",
0.147 | 24708: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_userid]: The container Oslo::Messaging::Rabbit[cinder_config] will pro
0.227 | 24709: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: pagate my refresh event\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_port]/ensure: created\",
0.227 | 24709: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_port]: Scheduling refresh of Anchor[cinder::config::end]\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_port]: The container Oslo::Messaging::Rabbit[cinder_config] will propagate my refresh event\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_qos_prefetch_count]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.227 | 24709: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabb
0.204 | 24710: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: it/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: Oslo::Messaging::Rabbit[cinder_config]: The container Class[Cinder] will propagate my refresh event\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/server_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/rpc_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/notify_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/multicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/unicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.204 | 24710: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messagin
0.173 | 24711: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: g_amqp/anycast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notification_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_rpc_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/pre_settled]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/container_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 24711: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_co
0.182 | 24712: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: nfig]/Cinder_config[oslo_messaging_amqp/ssl_key_password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 24712: \
0.239 | 24713: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Notice: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]/ensure: created\",
0.239 | 24713: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]: Scheduling refresh of Anchor[cinder::config::end]\",
0.239 | 24713: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]: The container Oslo::Messaging::Default[cinder_config] will propagate my refresh event\",
0.239 | 24713: \"Notice: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]/ensure: created\",
0.239 | 24713: \"Info: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]: Scheduling refresh of Anchor[cinder::config::end]\",
0.239 | 24713: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]: The container Oslo::Messaging::Default[cinder_config] will propagate my refresh event\",
0.239 | 24713: \"Debug: Oslo::Messaging::Default[cinder_config]: The container Class[Cinder] will propagate my refresh event\",
0.239 | 24713: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]: Nothing to manage: no ensure and the resource doesn't exist\",
0.239 | 24713: \"Notice: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]/ensure: created\",
0.239 | 24713: \"Info: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]: Scheduling refresh of Anchor[cinder::config::end]\",
0.239 | 24713: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]: The container Oslo::Concurrency[cinder_config] will propagate my refresh event\",
0.239 | 24713: \"Debug: Oslo::Concurrency[cinder_config]: The container Class[Cinder] will propagate my refresh event\",
0.239 | 24713: \"Debug: Class[Cinder]: The container Stage[main] will propagate my refresh event\",
0.239 | 24713:
0.170 | 24714: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: \"Notice: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]/ensure: created\",
0.170 | 24714: \"Info: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]: Scheduling refresh of Anchor[cinder::config::end]\",
0.170 | 24714: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]: The container Oslo::Messaging::Notifications[cinder_config] will propagate my refresh event\",
0.170 | 24714: \"Notice: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]/ensure: created\",
0.170 | 24714: \"Info: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]: Scheduling refresh of Anchor[cinder::config::end]\",
0.170 | 24714: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]: The container Oslo::Messaging::Notifications[cinder_config] will propagate my refresh event\",
0.170 | 24714: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]: Nothing to manage: no ensure and the resource doesn't exist\",
0.170 | 24714: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Triggered 'refresh' from 22 events\",
0.170 | 24714: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Scheduling refresh of Anchor[cinder::service::begin]\",
0.170 | 24714: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.170 | 24714: \"Debug: Oslo::Messaging::Notifications[cinder_config]: The container Class[Cinder::Ceilometer] will propagate my refresh event\",
0.170 | 24714: \"Debug: Class[Cinder::Ceilometer]: The container Stage[m
0.213 | 24715: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: ain] will propagate my refresh event\",
0.213 | 24715: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-q2s22q returned \",
0.213 | 24715: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-q2s22q property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.213 | 24715: \"Debug: property exists: property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.213 | 24715: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1hfhb4s returned \",
0.213 | 24715: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-64795-1hfhb4s property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.213 | 24715: \"Debug: property exists: property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.213 | 24715: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 24715: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 24715: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 24715: \"Notice: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]/ensure: created\",
0.213 | 24715: \"Info: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]: Scheduling refresh of Anchor[tacker::config::end]\",
0.213 | 24715: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]: The container Oslo::Log[tacker_config] will propagate my refresh event\",
0.213 | 24715: \"Notice: /Stage[main]/Tacker::Logging/Oslo::
0.008 | 24716: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: Log[tacker_config]/Tacker_config[DEFAULT/log_dir]/ensure: created\",
0.008 | 24716: \"Info: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_dir]: Scheduling refresh of Anchor[tacker::config::end]\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_dir]: The container Oslo::Log[tacker_config] will propagate my refresh event\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 24716: \"Debug: /S

0.116 | 25366: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::amqp_username in JSON backend",
0.116 | 25367: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::amqp_password in JSON backend",
0.084 | 25368: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::package_ensure in JSON backend",
0.321 | 25369: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::api_paste_config in JSON backend",
0.212 | 25370: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::use_syslog in JSON backend",
0.212 | 25371: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::use_stderr in JSON backend",
0.364 | 25372: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::log_facility in JSON backend",
0.222 | 25373: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::log_dir in JSON backend",
0.198 | 25374: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::debug in JSON backend",
0.329 | 25375: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::storage_availability_zone in JSON backend",
0.329 | 25376: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::default_availability_zone in JSON backend",
0.335 | 25377: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::allow_availability_zone_fallback in JSON backend",
0.306 | 25378: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::enable_v3_api in JSON backend",
0.310 | 25379: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::lock_path in JSON backend",
0.335 | 25380: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::image_conversion_dir in JSON backend",
0.378 | 25381: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::purge_config in JSON backend",
0.311 | 25382: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backend_host in JSON backend",
0.000 | 25383: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::enable_v1_api in JSON backend",
0.000 | 25384: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::enable_v2_api in JSON backend",
0.371 | 25385: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::use_ssl in JSON backend",
0.363 | 25386: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::ca_file in JSON backend",
0.386 | 25387: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cert_file in JSON backend",
0.363 | 25388: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::key_file in JSON backend",
0.291 | 25389: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::rabbit_host in JSON backend",
0.290 | 25390: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::rabbit_hosts in JSON backend",
0.302 | 25391: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::rabbit_virtual_host in JSON backend",
0.121 | 25392: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::host in JSON backend",

0.000 | 25409: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::db::database_retry_interval in JSON backend",
0.000 | 25410: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::db::database_max_overflow in JSON backend",
0.105 | 25411: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/logging.pp' in environment production",
0.225 | 25412: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::logging from cinder/logging into production",
0.116 | 25413: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::logging::use_syslog in JSON backend",
0.116 | 25414: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::logging::use_stderr in JSON backend",
0.301 | 25415: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::logging::log_facility in JSON backend",
0.121 | 25416: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::logging::log_dir in JSON backend",

0.003 | 25434: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported oslo::messaging::amqp from oslo/messaging/amqp into production",
0.009 | 25435: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/oslo/manifests/messaging/default.pp' in environment production",
0.004 | 25436: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported oslo::messaging::default from oslo/messaging/default into production",
0.210 | 25437: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/oslo/manifests/concurrency.pp' in environment production",
0.314 | 25438: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported oslo::concurrency from oslo/concurrency into production",
0.193 | 25439: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/ceilometer.pp' in environment production",
0.213 | 25440: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::ceilometer from cinder/ceilometer into production",
0.179 | 25441: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::ceilometer::notification_driver in JSON backend",

0.003 | 25444: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported oslo::messaging::notifications from oslo/messaging/notifications into production",
0.145 | 25445: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/config.pp' in environment production",
0.134 | 25446: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::config from cinder/config into production",
0.246 | 25447: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::config::cinder_config in JSON backend",
0.391 | 25448: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::config::api_paste_ini_config in JSON backend",
0.188 | 25449: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/glance.pp' in environment production",
0.201 | 25450: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::glance from cinder/glance into production",
0.391 | 25451: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_servers in JSON backend",
0.391 | 25452: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_version in JSON backend",
0.397 | 25453: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_num_retries in JSON backend",
0.396 | 25454: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_insecure in JSON backend",
0.402 | 25455: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_ssl_compression in JSON backend",
0.402 | 25456: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_request_timeout in JSON backend",
0.123 | 25457: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production",
0.147 | 25458: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported tripleo::profile::pacemaker::cinder::backup from tripleo/profile/pacemaker/cinder/backup into production",
0.210 | 25459: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::backup::bootstrap_node in JSON backend",
0.143 | 25460: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::backup::step in JSON backend",

0.033 | 25463: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up stack_action in JSON backend",
0.118 | 25464: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/volume.pp' in environment production",
0.136 | 25465: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported tripleo::profile::pacemaker::cinder::volume from tripleo/profile/pacemaker/cinder/volume into production",
0.231 | 25466: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::volume::bootstrap_node in JSON backend",
0.136 | 25467: Nov 8 21:04:52 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::pacemaker::cinder::volume::step in JSON backend",

0.000 | 27228: Nov 8 21:06:13 centos-7-rax-iad-0000787869 yum[108905]: Installed: 1:python-cephfs-10.2.7-0.el7.x86_64
0.000 | 27229: Nov 8 21:06:13 centos-7-rax-iad-0000787869 yum[108905]: Installed: 1:python-rbd-10.2.7-0.el7.x86_64
0.000 | 27230: Nov 8 21:06:14 centos-7-rax-iad-0000787869 yum[108905]: Installed: 1:libradosstriper1-10.2.7-0.el7.x86_64
0.268 | 27231: Nov 8 21:06:14 centos-7-rax-iad-0000787869 yum[108905]: Installed: boost-program-options-1.53.0-27.el7.x86_64
0.000 | 27232: Nov 8 21:06:14 centos-7-rax-iad-0000787869 yum[108905]: Installed: fcgi-2.4.0-25.el7.x86_64

0.000 | 27273: Nov 8 21:06:23 centos-7-rax-iad-0000787869 su: (to rabbitmq) root on none
0.000 | 27274: Nov 8 21:06:24 centos-7-rax-iad-0000787869 su: (to rabbitmq) root on none
0.000 | 27275: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Triggered 'refresh' from 2 events
0.304 | 27276: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Scheduling refresh of Service[cinder-backup]
0.205 | 27277: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]) Scheduling refresh of Service[cinder-volume]
0.357 | 27278: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Service[cinder-backup]) Triggered 'refresh' from 1 events
0.366 | 27279: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Backup/Service[cinder-backup]) Scheduling refresh of Anchor[cinder::service::end]
0.353 | 27280: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Service[cinder-volume]) Triggered 'refresh' from 1 events
0.202 | 27281: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Volume/Service[cinder-volume]) Scheduling refresh of Anchor[cinder::service::end]
0.000 | 27282: Nov 8 21:06:25 centos-7-rax-iad-0000787869 puppet-user[106446]: (/Stage[main]/Cinder::Deps/Anchor[cinder::service::end]) Triggered 'refresh' from 2 events

0.000 | 29238: Nov 8 21:06:56 centos-7-rax-iad-0000787869 journal: INFO:__main__:Copying /var/lib/kolla/config_files/src-iscsid/etc/iscsi/initiatorname.iscsi to /etc/iscsi/initiatorname.iscsi
0.000 | 29239: Nov 8 21:06:56 centos-7-rax-iad-0000787869 journal: INFO:__main__:Copying /var/lib/kolla/config_files/src-ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
0.000 | 29240: Nov 8 21:06:56 centos-7-rax-iad-0000787869 journal: INFO:__main__:Copying /var/lib/kolla/config_files/src-ceph/ceph.client.admin.keyring to /etc/ceph/ceph.client.admin.keyring
0.220 | 29241: Nov 8 21:06:56 centos-7-rax-iad-0000787869 journal: INFO:__main__:Deleting /etc/ceph/rbdmap
0.100 | 29242: Nov 8 21:06:56 centos-7-rax-iad-0000787869 journal: INFO:__main__:Copying /var/lib/kolla/config_files/src-ceph/rbdmap to /etc/ceph/rbdmap

0.014 | 30792: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: der/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_password]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]/notify: subscribes to Anchor[cinder::config::end]\",
0.014 | 30792: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]
0.036 | 30793: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: /notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/transport_url]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/control_exchange]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/lock_path]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/driver]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/transport_url]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]/notify: subscribes to Anchor[cinder::config::end]\",
0.036 | 30793: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_ba
0.008 | 30794: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ckend_name]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_user]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_max_clone_depth]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_flatten_volume_from_snapshot]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_secret_uuid]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connect_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.008 | 30794: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend:
0.232 | 30795: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: :Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_retries]/notify: subscribes to Anchor[cinder::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_store_chunk_size]/notify: subscribes to Anchor[cinder::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]/notify: subscribes to Anchor[cinder::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Package[ceph-common]/before: subscribes to Anchor[cinder::install::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]/notify: subscribes to Anchor[cinder::service::begin]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]/notify: subscribes to Anchor[tacker::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]/notify: subscribes to Anchor[tacker::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]/notify: subscribes to Anchor[tacker::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_file]/notify: subscribes to Anchor[tacker::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_dir]/notify: subscribes to Anchor[tacker::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watch_log_file]/notify: subscribes to Anchor[tacker::config::end]\",
0.232 | 30795: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]/notify: subscribes to Anchor[tacker:
0.006 | 30796: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: :config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_user_identity_format]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/default_log_levels]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/publish_errors]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_format]/notify: subscribes to Anchor[tacker::config::end]\",
0.006 | 30796: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_uuid_format]/notify: subscribes to Anchor[tacker::con

0.032 | 30856: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: sh event\",
0.032 | 30856: \"Info: Computing checksum on file /etc/sysconfig/snmpd\",
0.032 | 30856: \"Info: /Stage[main]/Snmp/File[snmpd.sysconfig]: Filebucketed /etc/sysconfig/snmpd to puppet with sum e914149a715dc82812a989314c026305\",
0.032 | 30856: \"Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{md5}e914149a715dc82812a989314c026305' to '{md5}1483b6eecf3d4796dac2df692d603719'\",
0.032 | 30856: \"Info: /Stage[main]/Snmp/File[snmpd.sysconfig]: Scheduling refresh of Service[snmpd]\",
0.032 | 30856: \"Debug: /Stage[main]/Snmp/File[snmpd.sysconfig]: The container Class[Snmp] will propagate my refresh event\",
0.032 | 30856: \"Info: Computing checksum on file /etc/snmp/snmptrapd.conf\",
0.032 | 30856: \"Info: /Stage[main]/Snmp/File[snmptrapd.conf]: Filebucketed /etc/snmp/snmptrapd.conf to puppet with sum 913e2613413a45daa402d0fbdbaba676\",
0.032 | 30856: \"Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{md5}913e2613413a45daa402d0fbdbaba676' to '{md5}0f92e52f70b5c64864657201eb9581bb'\",
0.032 | 30856: \"Info: /Stage[main]/Snmp/File[snmptrapd.conf]: Scheduling refresh of Service[snmptrapd]\",
0.032 | 30856: \"Debug: /Stage[main]/Snmp/File[snmptrapd.conf]: The container Class[Snmp] will propagate my refresh event\",
0.032 | 30856: \"Info: Computing checksum on file /etc/sysconfig/snmptrapd\",
0.032 | 30856: \"Info: /Stage[main]/Snmp/File[snmptrapd.sysconfig]: Filebucketed /etc/sysconfig/snmptrapd to puppet with sum 4496fd5e0e88e764e7beb1ae8f0dda6a\",
0.032 | 30856: \"Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{md5}4496fd5e0e88e764e7beb1ae8f0dda6a' to '{md5}01f68b1480c1ec4e3cc125434dd612a0'\",
0.032 | 30856: \"Info: /Stage[main]/Snmp/File[snmptrapd.sysconfig]: Scheduling refresh of Service[snmptrapd]\",
0.032 | 30856: \"Debug: /Stage[main]/Snmp/File[snmptrapd.sysconfig]: The container Class[Snmp] will propagate my refresh event\",
0.032 | 30856: \"Debug: Executing: '/usr/bin/systemctl is-active snmptrapd'\",
0.032 | 30856: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmptrapd'\",
0.032 | 30856: \"Debug: /Stage[m
0.056 | 30857: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ain]/Snmp/Service[snmptrapd]: Skipping restart; service is not running\",
0.056 | 30857: \"Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events\",
0.056 | 30857: \"Debug: /Stage[main]/Snmp/Service[snmptrapd]: The container Class[Snmp] will propagate my refresh event\",
0.056 | 30857: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_port]: Nothing to manage: no ensure and the resource doesn't exist\",
0.056 | 30857: \"Debug: Executing: '/usr/bin/systemctl is-active firewalld'\",
0.056 | 30857: \"Debug: Executing: '/usr/bin/systemctl is-enabled firewalld'\",
0.056 | 30857: \"Debug: Executing: '/usr/bin/systemctl is-active iptables'\",
0.056 | 30857: \"Debug: Executing: '/usr/bin/systemctl is-enabled iptables'\",
0.056 | 30857: \"Debug: Executing: '/usr/bin/systemctl is-active ip6tables'\",
0.056 | 30857: \"Debug: Executing: '/usr/bin/systemctl is-enabled ip6tables'\",
0.056 | 30857: \"Debug: Exec[modprobe nf_conntrack](provider=posix): Executing check 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.056 | 30857: \"Debug: Executing: 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.056 | 30857: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing check 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.056 | 30857: \"Debug: Executing: 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.056 | 30857: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing 'modprobe nf_conntrack_proto_sctp'\",
0.056 | 30857: \"Debug: Executing: 'modprobe nf_conntrack_proto_sctp'\",
0.056 | 30857: \"Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/returns: executed successfully\",
0.056 | 30857: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]: The container Kmod::Load[nf_conntrack_proto_sctp] will propagate my refresh event\",
0.056 | 30857: \"Debug: Kmod::Load[nf_conntrack_proto_sctp]: The container Class[Tripleo::Profile::Base::Kernel] will propagate my refresh event\",
0.000 | 30858: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: \"Debug: Prefetching parsed resources for sysctl\",
0.000 | 30858: \"Debug: Prefetching sysctl_runtime resources for sysctl_runtime\",
0.000 | 30858: \"Debug: Executing: '/usr/sbin/sysctl -a'\",
0.000 | 30858: \"Debug: Class[Tripleo::Profile::Base::Kernel]: The container Stage[main] will propagate my refresh event\",
0.000 | 30858: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1hlkod7 returned \",
0.000 | 30858: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1hlkod7 property show | grep stonith-enabled | grep false > /dev/null 2>&1\",
0.000 | 30858: \"Debug: property exists: property show | grep stonith-enabled | grep false > /dev/null 2>&1 -> \",
0.000 | 30858: \"Debug: Exec[create-snmpv3-user-ro_snmp_user](provider=posix): Executing 'service snmpd stop ; sleep 5 ; echo \\\"createUser ro_snmp_user MD5 \\\\\\\"314c09c8f3fb56e8a169c4375356bb75fbdbf3d3\\\\\\\"\\\" >>/var/lib/net-snmp/snmpd.conf && touch /var/lib/net-snmp/ro_snmp_user-snmpd'\",
0.000 | 30858: \"Debug: Executing with uid=root: 'service snmpd stop ; sleep 5 ; echo \\\"createUser ro_snmp_user MD5 \\\\\\\"314c09c8f3fb56e8a169c4375356bb75fbdbf3d3\\\\\\\"\\\" >>/var/lib/net-snmp/snmpd.conf && touch /var/lib/net-snmp/ro_snmp_user-snmpd'\",
0.000 | 30858: \"Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully\",
0.000 | 30858: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]: The container Snmp::Snmpv3_user[ro_snmp_user] will propagate my refresh event\",
0.000 | 30858: \"Debug: Snmp::Snmpv3_user[ro_snmp_user]: The container Class[Tripleo::Profile::Base::Snmp] will propagate my refresh event\",
0.000 | 30858: \"Debug: Class[Tripleo::Profile::Base::Snmp]: The container Stage[main] will propagate my refresh event\",
0.000 | 30858: \"Debug: Executing: '/usr/bin/systemctl is-active snmpd'\",
0.000 | 30858: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmpd'\",
0.287 | 30859: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: \"Debug: Executing: '/usr/bin/systemctl unmask snmpd'\",
0.287 | 30859: \"Debug: Executing: '/usr/bin/systemctl start snmpd'\",
0.287 | 30859: \"Debug: Executing: '/usr/bin/systemctl enable snmpd'\",
0.287 | 30859: \"Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'\",
0.287 | 30859: \"Debug: /Stage[main]/Snmp/Service[snmpd]: The container Class[Snmp] will propagate my refresh event\",
0.287 | 30859: \"Info: /Stage[main]/Snmp/Service[snmpd]: Unscheduling refresh on Service[snmpd]\",
0.287 | 30859: \"Debug: Class[Snmp]: The container Stage[main] will propagate my refresh event\",
0.287 | 30859: \"Debug: Executing: '/usr/bin/systemctl is-active sshd'\",
0.287 | 30859: \"Debug: Executing: '/usr/bin/systemctl is-enabled sshd'\",
0.287 | 30859: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1p862cb returned \",
0.287 | 30859: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-1p862cb property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.287 | 30859: \"Debug: property exists: property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.287 | 30859: \"Debug: Executing: '/usr/bin/rpm -q ceph-common --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.287 | 30859: '\",
0.287 | 30859: \"Debug: Executing: '/usr/bin/rpm -q ceph-common --nosignature --nodigest --qf %{NAME} %|EPOCH?{%{EPOCH}}:{0}| %{VERSION} %{RELEASE} %{ARCH}\\\
0.287 | 30859: --whatprovides'\",
0.287 | 30859: \"Debug: Package[ceph-common](provider=yum): Ensuring => present\",
0.287 | 30859: \"Debug: Executing: '/usr/bin/yum -d 0 -e 0 -y install ceph-common'\",
0.287 | 30859: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Package[ceph-common]/ensure: created\",
0.287 | 30859: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Package[ceph-common]: The container Cinder::Backend::Rbd[tripleo_
0.301 | 30860: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ceph] will propagate my refresh event\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_request_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_manager]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_api_class]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_name_template]: Nothing to manage: no ensure and the resource doesn't exist\",
0.301 | 30860: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_driver]/ensure: created\",
0.301 | 30860: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_drive
0.205 | 30861: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: r]: Scheduling refresh of Anchor[cinder::config::end]\",
0.205 | 30861: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_driver]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.205 | 30861: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_conf]/ensure: created\",
0.205 | 30861: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_conf]: Scheduling refresh of Anchor[cinder::config::end]\",
0.205 | 30861: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_conf]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.205 | 30861: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_user]/ensure: created\",
0.205 | 30861: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_user]: Scheduling refresh of Anchor[cinder::config::end]\",
0.205 | 30861: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_user]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.205 | 30861: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_chunk_size]/ensure: created\",
0.205 | 30861: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_chunk_size]: Scheduling refresh of Anchor[cinder::config::end]\",
0.205 | 30861: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_chunk_size]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.205 | 30861: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_pool]/ensure: created\",
0.205 | 30861: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_pool]: Scheduling refresh of Anchor[cinder::config::end]\",
0.205 | 30861: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_pool]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.205 | 30861: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ce
0.282 | 30862: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ph_stripe_unit]/ensure: created\",
0.282 | 30862: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_unit]: Scheduling refresh of Anchor[cinder::config::end]\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_unit]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.282 | 30862: \"Notice: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]/ensure: created\",
0.282 | 30862: \"Info: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]: Scheduling refresh of Anchor[cinder::config::end]\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Backup::Ceph/Cinder_config[DEFAULT/backup_ceph_stripe_count]: The container Class[Cinder::Backup::Ceph] will propagate my refresh event\",
0.282 | 30862: \"Debug: Class[Cinder::Backup::Ceph]: The container Stage[main] will propagate my refresh event\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Nothing to manage: no ensure and the resource doesn't exist\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_ionice]: Nothing to manage: no ensure and the resource doesn't exist\",
0.282 | 30862: \"Notice: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]/ensure: created\",
0.282 | 30862: \"Info: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: Scheduling refresh of Anchor[cinder::config::end]\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Backends/Cinder_config[DEFAULT/enabled_backends]: The container Class[Cinder::Backends] will propagate my refresh event\",
0.282 | 30862: \"Debug: Class[Cinder::Backends]: The container Stage[main] will propagate my refresh event\",
0.282 | 30862: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the
0.197 | 30863: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.197 | 30863: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn
0.237 | 30864: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: 't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.237 | 30864: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_stderr]: Nothing to mana
0.211 | 30865: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ge: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]: Nothing to manage: no ensure and the resource doesn't exist\",
0.211 | 30865: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Nothing to manage: no ensure and
0.183 | 30866: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_failover_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_missing_consumer_retry_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_interval_max]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_login_method]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_backoff]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.183 | 30866: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_transient_queues_ttl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config:
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_qos_prefetch_count]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.203 | 30867: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_am
0.173 | 30868: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: qp/server_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/rpc_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/notify_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/multicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/unicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/anycast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notification_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_rpc_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/pre_settled]: Nothing to manage: no ensure and the resource doesn't exist\",
0.173 | 30868: \"Debu
0.182 | 30869: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: g: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/container_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]: Nothing to manage: no ensure and the resource doesn't exist\",
0.182 | 30869: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]: Nothing to manage: no ensure and the resource doesn't
0.247 | 30870: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]: Nothing to manage: no ensure and the resource doesn't exist\",
0.247 | 30870: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_backend_name]/ensure: created\",
0.247 | 30870: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_con
0.077 | 30871: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: fig[tripleo_ceph/volume_backend_name]: Scheduling refresh of Anchor[cinder::config::end]\",
0.077 | 30871: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_backend_name]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.077 | 30871: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]/ensure: created\",
0.077 | 30871: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]: Scheduling refresh of Anchor[cinder::config::end]\",
0.077 | 30871: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/volume_driver]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.077 | 30871: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]/ensure: created\",
0.077 | 30871: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]: Scheduling refresh of Anchor[cinder::config::end]\",
0.077 | 30871: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_ceph_conf]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.077 | 30871: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_user]/ensure: created\",
0.077 | 30871: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_user]: Scheduling refresh of Anchor[cinder::config::end]\",
0.077 | 30871: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph
0.102 | 30872: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ]/Cinder_config[tripleo_ceph/rbd_user]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.102 | 30872: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]/ensure: created\",
0.102 | 30872: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]: Scheduling refresh of Anchor[cinder::config::end]\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_pool]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_max_clone_depth]: Nothing to manage: no ensure and the resource doesn't exist\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_flatten_volume_from_snapshot]: Nothing to manage: no ensure and the resource doesn't exist\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_secret_uuid]: Nothing to manage: no ensure and the resource doesn't exist\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connect_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.102 | 30872: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_retries]: Nothing to manage: no ens
0.105 | 30873: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ure and the resource doesn't exist\",
0.105 | 30873: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_store_chunk_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.105 | 30873: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]/ensure: created\",
0.105 | 30873: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]: Scheduling refresh of Anchor[cinder::config::end]\",
0.105 | 30873: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/backend_host]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.105 | 30873: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Triggered 'refresh' from 14 events\",
0.105 | 30873: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: Scheduling refresh of Anchor[cinder::service::begin]\",
0.105 | 30873: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::config::end]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.105 | 30873: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File[/etc/sysconfig/openstack-cinder-volume]/ensure: created\",
0.105 | 30873: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File[/etc/sysconfig/openstack-cinder-volume]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.105 | 30873: \"Notice: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]/ensure: created\",
0.105 | 30873: \"Info: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]: Scheduling refresh of Anchor[cinder::servi
0.209 | 30874: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: ce::begin]\",
0.209 | 30874: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/File_line[set initscript env tripleo_ceph]: The container Cinder::Backend::Rbd[tripleo_ceph] will propagate my refresh event\",
0.209 | 30874: \"Debug: Cinder::Backend::Rbd[tripleo_ceph]: The container Class[Tripleo::Profile::Base::Cinder::Volume::Rbd] will propagate my refresh event\",
0.209 | 30874: \"Debug: Class[Tripleo::Profile::Base::Cinder::Volume::Rbd]: The container Stage[main] will propagate my refresh event\",
0.209 | 30874: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-h8qtqt returned \",
0.209 | 30874: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-106446-h8qtqt property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.209 | 30874: \"Debug: property exists: property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.209 | 30874: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.209 | 30874: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.209 | 30874: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.209 | 30874: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.209 | 30874: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.209 | 30874: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource d
0.008 | 30875: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: oesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.008 | 30875: \"Debug: /Stage[ma

0.002 | 30883: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: g]/Tacker_config[keystone_authtoken/keyfile]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_conn_get_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_dead_retry]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_maxsize]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_socket_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_unused_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_secret_key]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_security_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_use_advanced_pool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 30883: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/
0.023 | 30884: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: Tacker_config[keystone_authtoken/memcached_servers]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/region_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/token_cache_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.023 | 30884: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_retrie
0.004 | 30885: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: s]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 30885: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: Triggered 'refresh'
0.316 | 30886: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: from 2 events\",
0.316 | 30886: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: Scheduling refresh of Service[cinder-backup]\",
0.316 | 30886: \"Info: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: Scheduling refresh of Service[cinder-volume]\",
0.316 | 30886: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::service::begin]: The container Class[Cinder::Deps] will propagate my refresh event\",
0.316 | 30886: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-backup'\",
0.316 | 30886: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-cinder-backup'\",
0.316 | 30886: \"Debug: /Stage[main]/Cinder::Backup/Service[cinder-backup]: Skipping restart; service is not running\",
0.316 | 30886: \"Notice: /Stage[main]/Cinder::Backup/Service[cinder-backup]: Triggered 'refresh' from 1 events\",
0.316 | 30886: \"Info: /Stage[main]/Cinder::Backup/Service[cinder-backup]: Scheduling refresh of Anchor[cinder::service::end]\",
0.316 | 30886: \"Debug: /Stage[main]/Cinder::Backup/Service[cinder-backup]: The container Class[Cinder::Backup] will propagate my refresh event\",
0.316 | 30886: \"Debug: Class[Cinder::Backup]: The container Stage[main] will propagate my refresh event\",
0.316 | 30886: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-volume'\",
0.316 | 30886: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-cinder-volume'\",
0.316 | 30886: \"Debug: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Skipping restart; service is not running\",
0.316 | 30886: \"Notice: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Triggered 'refresh' from 1 events\",
0.316 | 30886: \"Info: /Stage[main]/Cinder::Volume/Service[cinder-volume]: Scheduling refresh of Anchor[cinder::service::end]\",
0.316 | 30886: \"Debug: /Stage[main]/Cinder::Volume/Service[cinder-volume]: The container Class[Cinder::Volume] will propagate my refresh event\",
0.316 | 30886: \"Notice: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: Triggered 'refresh' from 2 events\",
0.316 | 30886: \"Debug: /Stage[main]/Cinder::Deps/Anchor[cinder::service::end]: The contain
0.274 | 30887: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: er Class[Cinder::Deps] will propagate my refresh event\",
0.274 | 30887: \"Debug: Class[Cinder::Deps]: The container Stage[main] will propagate my refresh event\",
0.274 | 30887: \"Debug: Class[Cinder::Volume]: The container Stage[main] will propagate my refresh event\",
0.274 | 30887: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-tacker-server'\",
0.274 | 30887: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-tacker-server'\",
0.274 | 30887: \"Debug: Prefetching iptables resources for firewall\",
0.274 | 30887: \"Debug: Puppet::Type::Firewall::ProviderIptables: [prefetch(resources)]\",
0.274 | 30887: \"Debug: Puppet::Type::Firewall::ProviderIptables: [instances]\",
0.274 | 30887: \"Debug: Executing: '/usr/sbin/iptables-save'\",
0.274 | 30887: \"Debug: Prefetching ip6tables resources for firewall\",
0.274 | 30887: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [prefetch(resources)]\",
0.274 | 30887: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [instances]\",
0.274 | 30887: \"Debug: Executing: '/usr/sbin/ip6tables-save'\",
0.274 | 30887: \"Debug: Finishing transaction 54723480\",
0.274 | 30887: \"Debug: Storing state\",
0.274 | 30887: \"Debug: Stored state in 0.07 seconds\",
0.274 | 30887: \"Notice: Applied catalog in 33.72 seconds\",
0.274 | 30887: \"Debug: Applying settings catalog for sections reporting, metrics\",
0.274 | 30887: \"Debug: Finishing transaction 105888360\",
0.274 | 30887: \"Debug: Received report to process from centos-7-rax-iad-0000787869.localdomain\",
0.274 | 30887: \"Debug: Processing report from centos-7-rax-iad-0000787869.localdomain with processor Puppet::Reports::Store\"
0.274 | 30887: ],
0.274 | 30887: \"failed_when_result\": false
0.274 | 30887: }
0.274 | 30887:
0.274 | 30887: TASK [Run docker-puppet tasks (generate config)] *******************************
0.274 | 30887: skipping: [localhost]
0.274 | 30887:
0.274 | 30887: TASK [debug] *******************************************************************
0.274 | 30887: ok: [localhost] => {
0.274 | 30887: \"(outputs.stderr|default('')).split('\
0.274 | 30887: ')|union(outputs.stdout_lines|default([]))\": [
0.274 | 30887: \"\"
0.274 | 30887: ],
0.274 | 30887: \"failed_when_result\": false
0.274 | 30887: }
0.274 | 30887:
0.274 | 30887: TASK [Check if /var/lib/hashed-tripleo-config/docker-container-star
0.000 | 30888: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: tup-config-step_4.json exists] ***
0.000 | 30888: ok: [localhost]
0.000 | 30888:
0.000 | 30888: TASK [Start containers for step 4] *********************************************
0.000 | 30888: ok: [localhost]
0.000 | 30888:
0.000 | 30888: TASK [debug] *******************************************************************
0.000 | 30888: ok: [localhost] => {
0.000 | 30888: \"(outputs.stderr|default('')).split('\
0.000 | 30888: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 30888: \"stdout: 6ae6fe49e2f13e65af25e6d92bf09940d5048c3d8257ac4cb68aad6a449b5155\",
0.000 | 30888: \"\",
0.000 | 30888: \"stderr: Unable to find image '192.168.24.1:8787/tripleomaster/centos-binary-aodh-evaluator:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' locally\",
0.000 | 30888: \"Trying to pull repository 192.168.24.1:8787/tripleomaster/centos-binary-aodh-evaluator ... \",
0.000 | 30888: \"3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d: Pulling from 192.168.24.1:8787/tripleomaster/centos-binary-aodh-evaluator\",
0.000 | 30888: \"d9aaf4d82f24: Already exists\",
0.000 | 30888: \"615fb2b6a1f1: Already exists\",
0.000 | 30888: \"3013007117c8: Already exists\",
0.000 | 30888: \"72133c850d33: Already exists\",
0.000 | 30888: \"c2baf92c99f8: Already exists\",
0.000 | 30888: \"c33a905d0cfb: Already exists\",
0.000 | 30888: \"0e2281a8f625: Already exists\",
0.000 | 30888: \"8db9532c7c2a: Already exists\",
0.000 | 30888: \"a2fdf405ce12: Already exists\",
0.000 | 30888: \"b4d23af701db: Already exists\",
0.000 | 30888: \"c0364d012ec6: Already exists\",
0.000 | 30888: \"5da3106f315c: Already exists\",
0.000 | 30888: \"7115c908a774: Already exists\",
0.000 | 30888: \"6bfb3cfd80b3: Already exists\",
0.000 | 30888: \"8d6928a9593d: Already exists\",
0.000 | 30888: \"26bc5dc8da6d: Already exists\",
0.000 | 30888: \"76a6f33737df: Already exists\",
0.000 | 30888: \"f200f3bea052: Already exists\",
0.000 | 30888: \"86a355b07979: Already exists\",
0.000 | 30888: \"f6c0fe59d156: Already exists\",
0.000 | 30888: \"2d2aa5dd2564: Already exists\",
0.000 | 30888: \"6478d58b62d6: Already exists\",
0.000 | 30888: \"a3747001f778: Already exists\",
0.000 | 30888: \"f50228f8bd7f: Already exists\",
0.000 | 30888: \"3f77d8c2dda3: Already exists\",
0.000 | 30888: \"984000356753: Already exists\",
0.000 | 30888: \"54bb31bcfca3: Already ex

0.000 | 31673: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_ssl_compression in JSON backend",
0.000 | 31674: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_request_timeout in JSON backend",
0.194 | 31675: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/backup.pp' in environment production",
0.214 | 31676: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::backup from cinder/backup into production",
0.241 | 31677: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::enabled in JSON backend",
0.187 | 31678: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::manage_service in JSON backend",
0.148 | 31679: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::package_ensure in JSON backend",
0.346 | 31680: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::backup_manager in JSON backend",
0.346 | 31681: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::backup_api_class in JSON backend",
0.345 | 31682: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::backup_name_template in JSON backend",
0.368 | 31683: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::backup_topic in JSON backend",
0.242 | 31684: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/backup/ceph.pp' in environment production",
0.314 | 31685: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::backup::ceph from cinder/backup/ceph into production",
0.434 | 31686: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_driver in JSON backend",
0.434 | 31687: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_conf in JSON backend",
0.434 | 31688: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_user in JSON backend",
0.434 | 31689: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_chunk_size in JSON backend",
0.434 | 31690: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_pool in JSON backend",
0.434 | 31691: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_stripe_unit in JSON backend",
0.434 | 31692: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::backup::ceph::backup_ceph_stripe_count in JSON backend",
0.000 | 31693: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/pacemaker/cinder/backup.pp' in environment production",

0.000 | 31721: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::step in JSON backend",
0.000 | 31722: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder_user_enabled_backends in JSON backend",
0.186 | 31723: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/volume.pp' in environment production",
0.256 | 31724: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::volume from cinder/volume into production",
0.175 | 31725: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::volume::package_ensure in JSON backend",
0.283 | 31726: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::volume::enabled in JSON backend",
0.210 | 31727: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::volume::manage_service in JSON backend",
0.357 | 31728: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::volume::volume_clear in JSON backend",
0.356 | 31729: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::volume::volume_clear_size in JSON backend",
0.357 | 31730: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::volume::volume_clear_ionice in JSON backend",
0.186 | 31731: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/volume/rbd.pp' in environment production",
0.090 | 31732: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported tripleo::profile::base::cinder::volume::rbd from tripleo/profile/base/cinder/volume/rbd into production",
0.247 | 31733: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::rbd::backend_name in JSON backend",
0.247 | 31734: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::rbd::cinder_rbd_backend_host in JSON backend",
0.044 | 31735: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up tripleo::profile::base::cinder::volume::rbd::cinder_rbd_pool_name in JSON backend",

0.000 | 32141: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/glance_api_insecure] with 'before'",
0.000 | 32142: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/glance_api_ssl_compression] with 'before'",
0.000 | 32143: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/glance_request_timeout] with 'before'",
0.221 | 32144: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_manager] with 'before'",
0.221 | 32145: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_api_class] with 'before'",
0.219 | 32146: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_name_template] with 'before'",
0.230 | 32147: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_driver] with 'before'",
0.230 | 32148: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_conf] with 'before'",
0.230 | 32149: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_user] with 'before'",
0.230 | 32150: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_chunk_size] with 'before'",
0.230 | 32151: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_pool] with 'before'",
0.230 | 32152: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_stripe_unit] with 'before'",
0.230 | 32153: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/backup_ceph_stripe_count] with 'before'",
0.046 | 32154: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Anchor[cinder::config::begin] to Cinder_config[DEFAULT/volume_clear] with 'before'",

0.000 | 32286: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/glance_api_insecure] to Anchor[cinder::config::end] with 'notify'",
0.000 | 32287: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/glance_api_ssl_compression] to Anchor[cinder::config::end] with 'notify'",
0.000 | 32288: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/glance_request_timeout] to Anchor[cinder::config::end] with 'notify'",
0.230 | 32289: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_manager] to Anchor[cinder::config::end] with 'notify'",
0.230 | 32290: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_api_class] to Anchor[cinder::config::end] with 'notify'",
0.229 | 32291: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_name_template] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32292: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_driver] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32293: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_conf] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32294: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_user] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32295: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_chunk_size] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32296: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_pool] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32297: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_stripe_unit] to Anchor[cinder::config::end] with 'notify'",
0.240 | 32298: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/backup_ceph_stripe_count] to Anchor[cinder::config::end] with 'notify'",
0.049 | 32299: Nov 8 21:07:14 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Cinder_config[DEFAULT/volume_clear] to Anchor[cinder::config::end] with 'notify'",

0.000 | 32828: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysctl.conf]/selrole: Found selrole default 'object_r' for /etc/sysctl.conf",
0.000 | 32829: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysctl.conf]/seltype: Found seltype default 'system_conf_t' for /etc/sysctl.conf",
0.000 | 32830: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysctl.conf]/selrange: Found selrange default 's0' for /etc/sysctl.conf",
0.316 | 32831: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/seluser: Found seluser default 'system_u' for /etc/sysconfig/openstack-cinder-volume",
0.242 | 32832: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/selrole: Found selrole default 'object_r' for /etc/sysconfig/openstack-cinder-volume",
0.316 | 32833: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/seltype: Found seltype default 'etc_t' for /etc/sysconfig/openstack-cinder-volume",
0.260 | 32834: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /File[/etc/sysconfig/openstack-cinder-volume]/selrange: Found selrange default 's0' for /etc/sysconfig/openstack-cinder-volume",
0.000 | 32835: Nov 8 21:07:15 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Firewall[000 accept related established rules ipv4]: [validate]",

0.000 | 33598: Nov 8 21:08:47 centos-7-rax-iad-0000787869 pengine[12541]: notice: Calculated transition 43, saving inputs in /var/lib/pacemaker/pengine/pe-input-42.bz2
0.078 | 33599: Nov 8 21:08:47 centos-7-rax-iad-0000787869 crmd[12542]: notice: Initiating start operation openstack-cinder-backup_start_0 locally on centos-7-rax-iad-0000787869
0.000 | 33600: Nov 8 21:08:47 centos-7-rax-iad-0000787869 systemd: Reloading.
0.403 | 33601: Nov 8 21:08:47 centos-7-rax-iad-0000787869 puppet-user[119948]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]/ensure) created
0.000 | 33602: Nov 8 21:08:47 centos-7-rax-iad-0000787869 systemd: [/etc/systemd/system/ceph-mon@.service:8] Executable path is not absolute, ignoring: $(command -v mkdir) -p /etc/ceph /var/lib/ceph/mon
0.000 | 33603: Nov 8 21:08:47 centos-7-rax-iad-0000787869 systemd: [/usr/lib/systemd/system/ip6tables.service:3] Failed to add dependency on syslog.target,iptables.service, ignoring: Invalid argument
0.000 | 33604: Nov 8 21:08:47 centos-7-rax-iad-0000787869 systemd: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 33605: Nov 8 21:08:47 centos-7-rax-iad-0000787869 systemd: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.520 | 33606: Nov 8 21:08:48 centos-7-rax-iad-0000787869 systemd: Started Cluster Controlled openstack-cinder-backup.
0.529 | 33607: Nov 8 21:08:48 centos-7-rax-iad-0000787869 systemd: Starting Cluster Controlled openstack-cinder-backup...
0.134 | 33608: Nov 8 21:08:50 centos-7-rax-iad-0000787869 crmd[12542]: notice: Result of start operation for openstack-cinder-backup on centos-7-rax-iad-0000787869: 0 (ok)

0.000 | 33654: Nov 8 21:08:59 centos-7-rax-iad-0000787869 pengine[12541]: notice: Calculated transition 46, saving inputs in /var/lib/pacemaker/pengine/pe-input-45.bz2
0.078 | 33655: Nov 8 21:08:59 centos-7-rax-iad-0000787869 crmd[12542]: notice: Initiating start operation openstack-cinder-volume_start_0 locally on centos-7-rax-iad-0000787869
0.000 | 33656: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: Reloading.
0.209 | 33657: Nov 8 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: (/Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]/ensure) created
0.000 | 33658: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: [/etc/systemd/system/ceph-mon@.service:8] Executable path is not absolute, ignoring: $(command -v mkdir) -p /etc/ceph /var/lib/ceph/mon
0.000 | 33659: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: [/usr/lib/systemd/system/ip6tables.service:3] Failed to add dependency on syslog.target,iptables.service, ignoring: Invalid argument
0.000 | 33660: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.000 | 33661: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: Configuration file /etc/systemd/system/glean@.service.d/override.conf is marked executable. Please remove executable permission bits. Proceeding anyway.
0.477 | 33662: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: Started Cluster Controlled openstack-cinder-volume.
0.472 | 33663: Nov 8 21:08:59 centos-7-rax-iad-0000787869 systemd: Starting Cluster Controlled openstack-cinder-volume...
0.000 | 33664: Nov 8 21:08:59 centos-7-rax-iad-0000787869 puppet-user[119948]: Applied catalog in 39.99 seconds

0.040 | 34437: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: up snmp::trap_service_hasstatus in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::trap_service_hasrestart in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::template_snmpd_conf in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::template_snmpd_sysconfig in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::template_snmptrapd in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::template_snmptrapd_sysconfig in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::openmanage_enable in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::master in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::agentx_perms in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::agentx_ping_interval in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::agentx_socket in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::agentx_timeout in JSON backend\",
0.040 | 34437: \"Debug: hiera(): Looking up snmp::agentx_retries in JSON backend\",
0.040 | 34437: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmpd.conf.erb\",
0.040 | 34437: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.conf.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmpd.conf.erb in 0.01 seconds\",
0.040 | 34437: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.conf.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmpd.conf.erb in 0.00 seconds\",
0.040 | 34437: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmpd.sysconfig-RedHat.erb\",
0.040 | 34437: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb in 0.00 seconds\",
0.040 | 34437: \"Debug: template[/etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmpd.sysconfig-RedHat.erb in 0.00 seconds\",
0.040 | 34437: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmptrapd.conf.erb\",
0.053 | 34438: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config:
0.053 | 34438: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.conf.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmptrapd.conf.erb in 0.00 seconds\",
0.053 | 34438: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.conf.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmptrapd.conf.erb in 0.00 seconds\",
0.053 | 34438: \"Debug: Scope(Class[Snmp]): Retrieving template snmp/snmptrapd.sysconfig-RedHat.erb\",
0.053 | 34438: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb]: Bound template variables for /etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb in 0.00 seconds\",
0.053 | 34438: \"Debug: template[/etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb]: Interpolated template /etc/puppet/modules/snmp/templates/snmptrapd.sysconfig-RedHat.erb in 0.00 seconds\",
0.053 | 34438: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/sshd.pp' in environment production\",
0.053 | 34438: \"Debug: Automatically imported tripleo::profile::base::sshd from tripleo/profile/base/sshd into production\",
0.053 | 34438: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::bannertext in JSON backend\",
0.053 | 34438: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::motd in JSON backend\",
0.053 | 34438: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::options in JSON backend\",
0.053 | 34438: \"Debug: hiera(): Looking up tripleo::profile::base::sshd::port in JSON backend\",
0.053 | 34438: \"Debug: hiera(): Looking up ssh:server::options in JSON backend\",
0.053 | 34438: \"Debug: importing '/etc/puppet/modules/ssh/manifests/init.pp' in environment production\",
0.053 | 34438: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server.pp' in environment production\",
0.053 | 34438: \"Debug: Automatically imported ssh::server from ssh/server into production\",
0.053 | 34438: \"Debug: importing '/etc/puppet/modules/ssh/manifests/params.pp' in environment production\",
0.053 | 34438: \"Debug: Automatically imported ssh::params from ssh/params into
0.015 | 34439: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: production\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::ensure in JSON backend\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::validate_sshd_file in JSON backend\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::use_augeas in JSON backend\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::options_absent in JSON backend\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::match_block in JSON backend\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::use_issue_net in JSON backend\",
0.015 | 34439: \"Debug: hiera(): Looking up ssh::server::options in JSON backend\",
0.015 | 34439: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/install.pp' in environment production\",
0.015 | 34439: \"Debug: Automatically imported ssh::server::install from ssh/server/install into production\",
0.015 | 34439: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/config.pp' in environment production\",
0.015 | 34439: \"Debug: Automatically imported ssh::server::config from ssh/server/config into production\",
0.015 | 34439: \"Debug: importing '/etc/puppet/modules/concat/manifests/init.pp' in environment production\",
0.015 | 34439: \"Debug: importing '/etc/puppet/modules/stdlib/manifests/init.pp' in environment production\",
0.015 | 34439: \"Debug: Automatically imported concat from concat into production\",
0.015 | 34439: \"Debug: Scope(Class[Ssh::Server::Config]): Retrieving template ssh/sshd_config.erb\",
0.015 | 34439: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Bound template variables for /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.015 | 34439: \"Debug: template[/etc/puppet/modules/ssh/templates/sshd_config.erb]: Interpolated template /etc/puppet/modules/ssh/templates/sshd_config.erb in 0.00 seconds\",
0.015 | 34439: \"Debug: importing '/etc/puppet/modules/concat/manifests/fragment.pp' in environment production\",
0.015 | 34439: \"Debug: Automatically imported concat::fragment from concat/fragment into production\",
0.015 | 34439: \"Debug: importing '/etc/puppet/modules/ssh/manifests/server/se
0.214 | 34440: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: rvice.pp' in environment production\",
0.214 | 34440: \"Debug: Automatically imported ssh::server::service from ssh/server/service into production\",
0.214 | 34440: \"Debug: hiera(): Looking up ssh::server::service::ensure in JSON backend\",
0.214 | 34440: \"Debug: hiera(): Looking up ssh::server::service::enable in JSON backend\",
0.214 | 34440: \"Debug: importing '/etc/puppet/modules/timezone/manifests/init.pp' in environment production\",
0.214 | 34440: \"Debug: Automatically imported timezone from timezone into production\",
0.214 | 34440: \"Debug: importing '/etc/puppet/modules/timezone/manifests/params.pp' in environment production\",
0.214 | 34440: \"Debug: Automatically imported timezone::params from timezone/params into production\",
0.214 | 34440: \"Debug: hiera(): Looking up timezone::ensure in JSON backend\",
0.214 | 34440: \"Debug: hiera(): Looking up timezone::timezone in JSON backend\",
0.214 | 34440: \"Debug: hiera(): Looking up timezone::hwutc in JSON backend\",
0.214 | 34440: \"Debug: hiera(): Looking up timezone::autoupgrade in JSON backend\",
0.214 | 34440: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup/ceph.pp' in environment production\",
0.214 | 34440: \"Debug: Automatically imported tripleo::profile::base::cinder::backup::ceph from tripleo/profile/base/cinder/backup/ceph into production\",
0.214 | 34440: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::ceph::step in JSON backend\",
0.214 | 34440: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder/backup.pp' in environment production\",
0.214 | 34440: \"Debug: Automatically imported tripleo::profile::base::cinder::backup from tripleo/profile/base/cinder/backup into production\",
0.214 | 34440: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::backup::step in JSON backend\",
0.214 | 34440: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/profile/base/cinder.pp' in environment production\",
0.214 | 34440: \"Debug: Automatically imported tripleo::profile::base::cinder from tripleo/profile/base/cinder into production\",
0.214 | 34440: \"De
0.067 | 34441: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: bug: hiera(): Looking up tripleo::profile::base::cinder::bootstrap_node in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::cinder_enable_db_purge in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::step in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_proto in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_hosts in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_password in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_port in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_rpc_username in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_proto in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_hosts in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_password in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_port in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_notify_username in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up tripleo::profile::base::cinder::oslomsg_use_ssl in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up bootstrap_nodeid in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up messaging_rpc_service_name in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up rabbitmq_node_names in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up cinder::rabbit_password in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up cinder::rabbit_port in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up cinder::rabbit_userid in JSON backend\",
0.067 | 34441: \"Debug: hiera(): Looking up messaging_notify_serv

0.000 | 34460: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: ): Looking up firewall::ensure_v6 in JSON backend\",
0.000 | 34460: \"Debug: hiera(): Looking up firewall::pkg_ensure in JSON backend\",
0.000 | 34460: \"Debug: hiera(): Looking up firewall::service_name in JSON backend\",
0.000 | 34460: \"Debug: hiera(): Looking up firewall::service_name_v6 in JSON backend\",
0.000 | 34460: \"Debug: hiera(): Looking up firewall::package_name in JSON backend\",
0.000 | 34460: \"Debug: hiera(): Looking up firewall::ebtables_manage in JSON backend\",
0.000 | 34460: \"Debug: importing '/etc/puppet/modules/firewall/manifests/linux.pp' in environment production\",
0.000 | 34460: \"Debug: Automatically imported firewall::linux from firewall/linux into production\",
0.000 | 34460: \"Debug: importing '/etc/puppet/modules/firewall/manifests/linux/redhat.pp' in environment production\",
0.000 | 34460: \"Debug: Automatically imported firewall::linux::redhat from firewall/linux/redhat into production\",
0.000 | 34460: \"Debug: hiera(): Looking up firewall::linux::redhat::package_ensure in JSON backend\",
0.000 | 34460: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/firewall/rule.pp' in environment production\",
0.000 | 34460: \"Debug: Automatically imported tripleo::firewall::rule from tripleo/firewall/rule into production\",
0.000 | 34460: \"Debug: Resource class[tripleo::firewall::post] was not determined to be defined\",
0.000 | 34460: \"Debug: Create new resource class[tripleo::firewall::post] with params {\\\"firewall_settings\\\"=>{}}\",
0.000 | 34460: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/firewall/post.pp' in environment production\",
0.000 | 34460: \"Debug: Automatically imported tripleo::firewall::post from tripleo/firewall/post into production\",
0.000 | 34460: \"Debug: hiera(): Looking up tripleo::firewall::post::debug in JSON backend\",
0.000 | 34460: \"Notice: Scope(Class[Tripleo::Firewall::Post]): At this stage, all network traffic is blocked.\",
0.000 | 34460: \"Debug: hiera(): Looking up service_names in JSON backend\",
0.000 | 34460: \"Debug: importing '/etc/puppet/modules/tripleo/manifests/firewall/service_rules.pp' in environment production
0.070 | 34461: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: \",
0.070 | 34461: \"Debug: Automatically imported tripleo::firewall::service_rules from tripleo/firewall/service_rules into production\",
0.070 | 34461: \"Debug: Scope(Kmod::Load[nf_conntrack]): Retrieving template kmod/redhat.modprobe.erb\",
0.070 | 34461: \"Debug: template[/etc/puppet/modules/kmod/templates/redhat.modprobe.erb]: Bound template variables for /etc/puppet/modules/kmod/templates/redhat.modprobe.erb in 0.00 seconds\",
0.070 | 34461: \"Debug: template[/etc/puppet/modules/kmod/templates/redhat.modprobe.erb]: Interpolated template /etc/puppet/modules/kmod/templates/redhat.modprobe.erb in 0.00 seconds\",
0.070 | 34461: \"Debug: Scope(Kmod::Load[nf_conntrack_proto_sctp]): Retrieving template kmod/redhat.modprobe.erb\",
0.070 | 34461: \"Debug: importing '/etc/puppet/modules/sysctl/manifests/base.pp' in environment production\",
0.070 | 34461: \"Debug: Automatically imported sysctl::base from sysctl/base into production\",
0.070 | 34461: \"Debug: template[inline]: Interpolated template inline template in 0.07 seconds\",
0.070 | 34461: \"Debug: importing '/etc/puppet/modules/oslo/manifests/params.pp' in environment production\",
0.070 | 34461: \"Debug: Automatically imported oslo::params from oslo/params into production\",
0.070 | 34461: \"Debug: importing '/etc/puppet/modules/mysql/manifests/bindings.pp' in environment production\",
0.070 | 34461: \"Debug: Automatically imported mysql::bindings from mysql/bindings into production\",
0.070 | 34461: \"Debug: importing '/etc/puppet/modules/mysql/manifests/params.pp' in environment production\",
0.070 | 34461: \"Debug: Automatically imported mysql::params from mysql/params into production\",
0.070 | 34461: \"Debug: hiera(): Looking up mysql::bindings::install_options in JSON backend\",
0.070 | 34461: \"Debug: hiera(): Looking up mysql::bindings::java_enable in JSON backend\",
0.070 | 34461: \"Debug: hiera(): Looking up mysql::bindings::perl_enable in JSON backend\",
0.070 | 34461: \"Debug: hiera(): Looking up mysql::bindings::php_enable in JSON backend\",
0.070 | 34461: \"Debug: hiera(): Looking up mysql::bindings::python_enable in JSON backend\",
0.001 | 34462: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: \"Debug: hiera(): Looking up mysql::bindings::ruby_enable in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::client_dev in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::daemon_dev in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::java_package_ensure in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::java_package_name in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::java_package_provider in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::perl_package_ensure in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::perl_package_name in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::perl_package_provider in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::php_package_ensure in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::php_package_name in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::php_package_provider in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::python_package_ensure in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::python_package_name in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::python_package_provider in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::ruby_package_ensure in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::ruby_package_name in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::ruby_package_provider in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::client_dev_package_ensure in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::client_dev_package_name in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::client_dev_package_provider in JSON backend\",
0.001 | 34462: \"Debug: hiera(): Looking up mysql::bindings::daemon
0.204 | 34463: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: _dev_package_ensure in JSON backend\",
0.204 | 34463: \"Debug: hiera(): Looking up mysql::bindings::daemon_dev_package_name in JSON backend\",
0.204 | 34463: \"Debug: hiera(): Looking up mysql::bindings::daemon_dev_package_provider in JSON backend\",
0.204 | 34463: \"Debug: importing '/etc/puppet/modules/mysql/manifests/bindings/python.pp' in environment production\",
0.204 | 34463: \"Debug: Automatically imported mysql::bindings::python from mysql/bindings/python into production\",
0.204 | 34463: \"Debug: importing '/etc/puppet/modules/pacemaker/manifests/resource/systemd.pp' in environment production\",
0.204 | 34463: \"Debug: Automatically imported pacemaker::resource::systemd from pacemaker/resource/systemd into production\",
0.204 | 34463: \"Debug: Resource package[ceph-common] was not determined to be defined\",
0.204 | 34463: \"Debug: Create new resource package[ceph-common] with params {\\\"ensure\\\"=>\\\"present\\\", \\\"name\\\"=>\\\"ceph-common\\\", \\\"tag\\\"=>\\\"cinder-support-package\\\"}\",
0.204 | 34463: \"Debug: Resource file[/etc/sysconfig/openstack-cinder-volume] was not determined to be defined\",
0.204 | 34463: \"Debug: Create new resource file[/etc/sysconfig/openstack-cinder-volume] with params {\\\"ensure\\\"=>\\\"present\\\"}\",
0.204 | 34463: \"Debug: importing '/etc/puppet/modules/keystone/manifests/deps.pp' in environment production\",
0.204 | 34463: \"Debug: Automatically imported keystone::deps from keystone/deps into production\",
0.204 | 34463: \"Debug: importing '/etc/puppet/modules/oslo/manifests/cache.pp' in environment production\",
0.204 | 34463: \"Debug: Automatically imported oslo::cache from oslo/cache into production\",
0.204 | 34463: \"Debug: hiera(): Looking up tripleo.clustercheck.firewall_rules in JSON backend\",
0.204 | 34463: \"Debug: hiera(): Looking up tripleo.docker.firewall_rules in JSON backend\",
0.204 | 34463: \"Debug: hiera(): Looking up tripleo.kernel.firewall_rules in JSON backend\",
0.204 | 34463: \"Debug: hiera(): Looking up tripleo.keystone.firewall_rules in JSON backend\",
0.204 | 34463: \"Debug: hiera(): Looking up tripleo.glance_api.firewall_
0.001 | 34464: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.heat_api.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.heat_api_cfn.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.heat_engine.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.mysql.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.mysql_client.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.neutron_dhcp.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.neutron_l3.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.neutron_metadata.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.neutron_api.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.neutron_plugin_ml2.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.neutron_ovs_agent.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.rabbitmq.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.haproxy.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.memcached.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.pacemaker.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.nova_conductor.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.nova_api.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.nova_placement.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.nova_metadata.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.nova_scheduler.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.ntp.firewall_rules in JSON backend\",
0.001 | 34464: \"Debug: hiera(): Looking up tripleo.snmp.firewall_rules in

0.007 | 34561: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: g: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/before: subscribes to Sysctl[net.nf_conntrack_max]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[fs.inotify.max_user_instances]/Sysctl[fs.inotify.max_user_instances]/before: subscribes to Sysctl_runtime[fs.inotify.max_user_instances]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[fs.suid_dumpable]/Sysctl[fs.suid_dumpable]/before: subscribes to Sysctl_runtime[fs.suid_dumpable]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[kernel.dmesg_restrict]/Sysctl[kernel.dmesg_restrict]/before: subscribes to Sysctl_runtime[kernel.dmesg_restrict]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[kernel.pid_max]/Sysctl[kernel.pid_max]/before: subscribes to Sysctl_runtime[kernel.pid_max]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.core.netdev_max_backlog]/Sysctl[net.core.netdev_max_backlog]/before: subscribes to Sysctl_runtime[net.core.netdev_max_backlog]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.arp_accept]/Sysctl[net.ipv4.conf.all.arp_accept]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.arp_accept]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.log_martians]/Sysctl[net.ipv4.conf.all.log_martians]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.log_martians]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.secure_redirects]/Sysctl[net.ipv4.conf.all.secure_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.secure_redirects]\",
0.007 | 34561: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.all.send_redirects]/Sysctl[net.ipv4.conf.all.send_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.all.send_redirects]\",
0.008 | 34562: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.accept_redirects]/Sysctl[net.ipv4.conf.default.accept_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.accept_redirects]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.log_martians]/Sysctl[net.ipv4.conf.default.log_martians]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.log_martians]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.secure_redirects]/Sysctl[net.ipv4.conf.default.secure_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.secure_redirects]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.conf.default.send_redirects]/Sysctl[net.ipv4.conf.default.send_redirects]/before: subscribes to Sysctl_runtime[net.ipv4.conf.default.send_redirects]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.neigh.default.gc_thresh1]/Sysctl[net.ipv4.neigh.default.gc_thresh1]/before: subscribes to Sysctl_runtime[net.ipv4.neigh.default.gc_thresh1]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.neigh.default.gc_thresh2]/Sysctl[net.ipv4.neigh.default.gc_thresh2]/before: subscribes to Sysctl_runtime[net.ipv4.neigh.default.gc_thresh2]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.neigh.default.gc_thresh3]/Sysctl[net.ipv4.neigh.default.gc_thresh3]/before: subscribes to Sysctl_runtime[net.ipv4.neigh.default.gc_thresh3]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_intvl]/Sysctl[net.ipv4.tcp_keepalive_intvl]/before: subscribes to Sysctl_runtime[net.ipv4.tcp_keepalive_intvl]\",
0.008 | 34562: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_probes]/Sysctl[net.ipv4.tcp_keepalive_probes]/before: subscribes to Sysctl_runtime[net.ipv4.tcp_ke
0.007 | 34563: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: epalive_probes]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv4.tcp_keepalive_time]/Sysctl[net.ipv4.tcp_keepalive_time]/before: subscribes to Sysctl_runtime[net.ipv4.tcp_keepalive_time]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.accept_ra]/Sysctl[net.ipv6.conf.all.accept_ra]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.accept_ra]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.accept_redirects]/Sysctl[net.ipv6.conf.all.accept_redirects]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.accept_redirects]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.autoconf]/Sysctl[net.ipv6.conf.all.autoconf]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.autoconf]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.all.disable_ipv6]/Sysctl[net.ipv6.conf.all.disable_ipv6]/before: subscribes to Sysctl_runtime[net.ipv6.conf.all.disable_ipv6]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.accept_ra]/Sysctl[net.ipv6.conf.default.accept_ra]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.accept_ra]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.accept_redirects]/Sysctl[net.ipv6.conf.default.accept_redirects]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.accept_redirects]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.autoconf]/Sysctl[net.ipv6.conf.default.autoconf]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.autoconf]\",
0.007 | 34563: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.ipv6.conf.default.disable_ipv6]/Sysctl[net.ipv6.conf.default.disable_ipv6]/before: subscribes to Sysctl_runtime[net.ipv6.conf.default.disable_ipv6]\",
0.007 | 34563: \"Debug: /Stage[ma
0.253 | 34564: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: in]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.netfilter.nf_conntrack_max]/Sysctl[net.netfilter.nf_conntrack_max]/before: subscribes to Sysctl_runtime[net.netfilter.nf_conntrack_max]\",
0.253 | 34564: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Sysctl::Value[net.nf_conntrack_max]/Sysctl[net.nf_conntrack_max]/before: subscribes to Sysctl_runtime[net.nf_conntrack_max]\",
0.253 | 34564: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/require: subscribes to Package[snmpd]\",
0.253 | 34564: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/require: subscribes to File[var-net-snmp]\",
0.253 | 34564: \"Debug: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/before: subscribes to Service[snmpd]\",
0.253 | 34564: \"Debug: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/Concat_file[/etc/ssh/sshd_config]/before: subscribes to File[/etc/ssh/sshd_config]\",
0.253 | 34564: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]/notify: subscribes to Anchor[cinder::config::end]\",
0.253 | 34564: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]/notify: subscribes to Anchor[cinder::config::end]\",
0.253 | 34564: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection]/notify: subscribes to Anchor[cinder::config::end]\",
0.253 | 34564: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]/notify: subscribes to Anchor[cinder::config::end]\",
0.253 | 34564: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]/notify: subscribes to Anchor[cinder::config::end]\",
0.253 | 34564: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.253 | 34564: \"Debug: /Stage[main]/C
0.018 | 34565: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: inder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_retries]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_db_reconnect]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]/notify: subscribes to Anchor[cinder::config::end]\",
0.018 | 34565: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retries]/notify: subscribes to Anchor[cinder::config::end]\",

0.036 | 34633: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: Augeas[docker-sysconfig-network](provider=augeas): Will attempt to save and only run if files changed\",
0.036 | 34633: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): sending command 'rm' with params [\\\"/files/etc/sysconfig/docker-network/DOCKER_NETWORK_OPTIONS\\\"]\",
0.036 | 34633: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Skipping because no files were changed\",
0.036 | 34633: \"Debug: Augeas[docker-sysconfig-network](provider=augeas): Closed the augeas connection\",
0.036 | 34633: \"Debug: Executing: '/usr/bin/systemctl is-active docker'\",
0.036 | 34633: \"Debug: Executing: '/usr/bin/systemctl is-enabled docker'\",
0.036 | 34633: \"Debug: Exec[directory-create-etc-my.cnf.d](provider=posix): Executing check 'test -d /etc/my.cnf.d'\",
0.036 | 34633: \"Debug: Executing: 'test -d /etc/my.cnf.d'\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Opening augeas with root /, lens path , flags 64\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Augeas version 1.4.0 is installed\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Will attempt to save and only run if files changed\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'set' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/bind-address\\\", \\\"192.168.24.15\\\"]\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl\\\"]\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): sending command 'rm' with params [\\\"/files/etc/my.cnf.d/tripleo.cnf/tripleo/ssl-ca\\\"]\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Skipping because no files were changed\",
0.036 | 34633: \"Debug: Augeas[tripleo-mysql-client-conf](provider=augeas): Closed the augeas connection\",
0.036 | 34633: \"Debug: Executing: '/usr/bin/systemctl is-active pcsd'\",
0.036 | 34633: \"Debug: Executing: '/usr/bin/systemctl is-enabled pcsd'\",
0.028 | 34634: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: \"Debug: Exec[Create Cluster tripleo_cluster](provider=posix): Executing check '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/test -f /etc/corosync/corosync.conf'\",
0.028 | 34634: \"Debug: Exec[Start Cluster tripleo_cluster](provider=posix): Executing check '/sbin/pcs status >/dev/null 2>&1'\",
0.028 | 34634: \"Debug: Executing: '/sbin/pcs status >/dev/null 2>&1'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-enabled corosync'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-enabled pacemaker'\",
0.028 | 34634: \"Debug: Exec[wait-for-settle](provider=posix): Executing check '/sbin/pcs status | grep -q 'partition with quorum' > /dev/null 2>&1'\",
0.028 | 34634: \"Debug: Executing: '/sbin/pcs status | grep -q 'partition with quorum' > /dev/null 2>&1'\",
0.028 | 34634: \"Debug: defaults exists resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.028 | 34634: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-144lovy returned \",
0.028 | 34634: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-144lovy resource defaults | grep '^resource-stickiness: INFINITY$'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-active chronyd'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-enabled chronyd'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-active ntpd'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-enabled ntpd'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-active snmptrapd'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmptrapd'\",
0.028 | 34634: \"Debug: /Stage[main]/Tacker::Server/Tacker_config[DEFAULT/bind_port]: Nothing to manage: no ensure and the resource doesn't exist\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-active firewalld'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-enabled firewalld'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-active iptables'\",
0.028 | 34634: \"Debug: Executing: '/usr/bin/systemctl is-
0.009 | 34635: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: enabled iptables'\",
0.009 | 34635: \"Debug: Executing: '/usr/bin/systemctl is-active ip6tables'\",
0.009 | 34635: \"Debug: Executing: '/usr/bin/systemctl is-enabled ip6tables'\",
0.009 | 34635: \"Debug: Exec[modprobe nf_conntrack](provider=posix): Executing check 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.009 | 34635: \"Debug: Executing: 'egrep -q '^nf_conntrack ' /proc/modules'\",
0.009 | 34635: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing check 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.009 | 34635: \"Debug: Executing: 'egrep -q '^nf_conntrack_proto_sctp ' /proc/modules'\",
0.009 | 34635: \"Debug: Exec[modprobe nf_conntrack_proto_sctp](provider=posix): Executing 'modprobe nf_conntrack_proto_sctp'\",
0.009 | 34635: \"Debug: Executing: 'modprobe nf_conntrack_proto_sctp'\",
0.009 | 34635: \"Notice: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]/returns: executed successfully\",
0.009 | 34635: \"Debug: /Stage[main]/Tripleo::Profile::Base::Kernel/Kmod::Load[nf_conntrack_proto_sctp]/Exec[modprobe nf_conntrack_proto_sctp]: The container Kmod::Load[nf_conntrack_proto_sctp] will propagate my refresh event\",
0.009 | 34635: \"Debug: Kmod::Load[nf_conntrack_proto_sctp]: The container Class[Tripleo::Profile::Base::Kernel] will propagate my refresh event\",
0.009 | 34635: \"Debug: Prefetching parsed resources for sysctl\",
0.009 | 34635: \"Debug: Prefetching sysctl_runtime resources for sysctl_runtime\",
0.009 | 34635: \"Debug: Executing: '/usr/sbin/sysctl -a'\",
0.009 | 34635: \"Debug: Class[Tripleo::Profile::Base::Kernel]: The container Stage[main] will propagate my refresh event\",
0.009 | 34635: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1jb77z7 returned \",
0.009 | 34635: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1jb77z7 property show | grep stonith-enabled | grep false > /dev/null 2>&1\",
0.009 | 34635: \"Debug: property exists: property show | grep stonith-enabled | grep false > /dev/nul
0.344 | 34636: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: l 2>&1 -> \",
0.344 | 34636: \"Debug: Executing: '/usr/bin/systemctl is-active snmpd'\",
0.344 | 34636: \"Debug: Executing: '/usr/bin/systemctl is-enabled snmpd'\",
0.344 | 34636: \"Debug: Executing: '/usr/bin/systemctl is-active sshd'\",
0.344 | 34636: \"Debug: Executing: '/usr/bin/systemctl is-enabled sshd'\",
0.344 | 34636: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xq7458 returned \",
0.344 | 34636: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xq7458 property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.344 | 34636: \"Debug: property exists: property show | grep cinder-backup-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.344 | 34636: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/report_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/service_down_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/allow_availability_zone_fallback]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/image_conversion_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder/Cinder_config[DEFAULT/backend_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_num_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAULT/glance_api_ssl_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.344 | 34636: \"Debug: /Stage[main]/Cinder::Glance/Cinder_config[DEFAUL
0.337 | 34637: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: T/glance_request_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: Prefetching crontab resources for cron\",
0.337 | 34637: \"Debug: looking for crontabs in /var/spool/cron\",
0.337 | 34637: \"Notice: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]/ensure: created\",
0.337 | 34637: \"Debug: Flushing cron provider target cinder\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Cron::Db_purge/Cron[cinder-manage db purge]: The container Class[Cinder::Cron::Db_purge] will propagate my refresh event\",
0.337 | 34637: \"Debug: Class[Cinder::Cron::Db_purge]: The container Stage[main] will propagate my refresh event\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_manager]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_api_class]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Backup/Cinder_config[DEFAULT/backup_name_template]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Volume/Cinder_config[DEFAULT/volume_clear_ionice]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.337 | 34637: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/slave_connection]: Nothing to manage: no ensure and the resour
0.202 | 34638: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: ce doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.202 | 34638: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resour
0.217 | 34639: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: ce doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Db/Oslo::Db[cinder_config]/Cinder_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/watch_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.217 | 34639: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_d
0.213 | 34640: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: efault_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/instance_uuid_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder::Logging/Oslo::Log[cinder_config]/Cinder_config[DEFAULT/fatal_deprecations]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/amqp_durable_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/heartbeat_rate]: Nothing to manage: no ensure and the resource doesn't exist\",
0.213 | 34640: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kom
0.187 | 34641: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: bu_compression]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_failover_strategy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_missing_consumer_retry_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/kombu_reconnect_delay]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_interval_max]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_login_method]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_backoff]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_transient_queues_ttl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_virtual_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.187 | 34641: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_hosts]: Nothing to man
0.195 | 34642: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: age: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_qos_prefetch_count]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_host]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/rabbit_ha_queues]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Rabbit[cinder_config]/Cinder_config[oslo_messaging_rabbit/ssl_version]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/addressing_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/server_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/broadcast_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.195 | 34642: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp
0.189 | 34643: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: [cinder_config]/Cinder_config[oslo_messaging_amqp/group_request_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/rpc_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/notify_address_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/multicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/unicast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/anycast_address]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notification_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_rpc_exchange]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/pre_settled]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/container_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.189 | 34643: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/idle_timeout]: Nothing to manage: no ensure and the resour
0.185 | 34644: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: ce doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_ca_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_cert_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/ssl_key_password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/allow_insecure_clients]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_mechanisms]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_dir]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_config_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.185 | 34644: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/sasl_default_realm]: Nothing t
0.261 | 34645: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: o manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/username]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/password]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_send_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Amqp[cinder_config]/Cinder_config[oslo_messaging_amqp/default_notify_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder/Oslo::Messaging::Default[cinder_config]/Cinder_config[DEFAULT/rpc_response_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder/Oslo::Concurrency[cinder_config]/Cinder_config[oslo_concurrency/disable_process_locking]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Cinder::Ceilometer/Oslo::Messaging::Notifications[cinder_config]/Cinder_config[oslo_messaging_notifications/topics]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_max_clone_depth]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_flatten_volume_from_snapshot]: Nothing to manage: no ensure and the resource doesn't exist\",
0.261 | 34645: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_secret_uuid]: Nothing to manage: no ensu
0.287 | 34646: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: re and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connect_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rados_connection_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tripleo::Profile::Base::Cinder::Volume::Rbd/Cinder::Backend::Rbd[tripleo_ceph]/Cinder_config[tripleo_ceph/rbd_store_chunk_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-h0z0aj returned \",
0.287 | 34646: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-h0z0aj property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1\",
0.287 | 34646: \"Debug: property exists: property show | grep cinder-volume-role | grep centos-7-rax-iad-0000787869 | grep true > /dev/null 2>&1 -> \",
0.287 | 34646: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_config_append]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/log_date_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.287 | 34646: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/watc
0.006 | 34647: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: h_log_file]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_syslog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_journal]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/syslog_log_facility]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/use_stderr]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_context_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_default_format_string]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_debug_format_suffix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_exception_prefix]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/logging_user_identity_format]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/default_log_levels]: Nothing to manage: no ensure and the resource doesn't exist\",
0.006 | 34647: \"Debug: /Stage[main]/Tacker::Logging/Oslo::Log[tacker_config]/Tacker_config[DEFAULT/publish_errors]: Nothing to manage: no ensure and the resource

0.004 | 34655: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: _retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/include_service_catalog]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/keyfile]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_conn_get_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_dead_retry]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_maxsize]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_socket_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_pool_unused_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_secret_key]: Nothing to manage: no ensure and the resource doesn't exist\",
0.004 | 34655: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_security_strate
0.013 | 34656: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: gy]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcache_use_advanced_pool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/memcached_servers]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/region_name]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/token_cache_time]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Keystone::Authtoken/Keystone::Resource::Authtoken[tacker_config]/Tacker_config[keystone_authtoken/insecure]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/sqlite_synchronous]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/backend]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/slave_connection]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/mysql_sql_mode]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/idle_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.013 | 34656: \"Debug: /Stage[main]/Tac
0.002 | 34657: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: ker::Db/Oslo::Db[tacker_config]/Tacker_config[database/min_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_pool_size]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/max_overflow]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_debug]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/connection_trace]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/pool_timeout]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_db_reconnect]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_inc_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retry_interval]: Nothing to manage: no ensure and the resource doesn't exist\",
0.002 | 34657: \"Debug: /Stage[mai
0.323 | 34658: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: n]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/db_max_retries]: Nothing to manage: no ensure and the resource doesn't exist\",
0.323 | 34658: \"Debug: /Stage[main]/Tacker::Db/Oslo::Db[tacker_config]/Tacker_config[database/use_tpool]: Nothing to manage: no ensure and the resource doesn't exist\",
0.323 | 34658: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-backup'\",
0.323 | 34658: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-cinder-volume'\",
0.323 | 34658: \"Debug: Executing: '/usr/bin/systemctl is-active openstack-tacker-server'\",
0.323 | 34658: \"Debug: Executing: '/usr/bin/systemctl is-enabled openstack-tacker-server'\",
0.323 | 34658: \"Debug: Prefetching iptables resources for firewall\",
0.323 | 34658: \"Debug: Puppet::Type::Firewall::ProviderIptables: [prefetch(resources)]\",
0.323 | 34658: \"Debug: Puppet::Type::Firewall::ProviderIptables: [instances]\",
0.323 | 34658: \"Debug: Executing: '/usr/sbin/iptables-save'\",
0.323 | 34658: \"Debug: Prefetching ip6tables resources for firewall\",
0.323 | 34658: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [prefetch(resources)]\",
0.323 | 34658: \"Debug: Puppet::Type::Firewall::ProviderIp6tables: [instances]\",
0.323 | 34658: \"Debug: Executing: '/usr/sbin/ip6tables-save'\",
0.323 | 34658: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1gzm9jb returned \",
0.323 | 34658: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1gzm9jb constraint list | grep location-openstack-cinder-backup > /dev/null 2>&1\",
0.323 | 34658: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xbogxx returned \",
0.323 | 34658: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1xbogxx resource show openstack-cinder-backup > /dev/null 2>&1\",
0.323 | 34658: \"Debug: Exists: resource exists false location exists false\",
0.323 | 34658: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-13qk7pu returned \",
0.105 | 34659: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config:
0.105 | 34659: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-13qk7pu constraint list | grep location-openstack-cinder-backup > /dev/null 2>&1\",
0.105 | 34659: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1g5ohgt returned \",
0.105 | 34659: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1g5ohgt resource show openstack-cinder-backup > /dev/null 2>&1\",
0.105 | 34659: \"Debug: Create: resource exists false location exists false\",
0.105 | 34659: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 returned \",
0.105 | 34659: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 resource create openstack-cinder-backup systemd:openstack-cinder-backup op start timeout=200s stop timeout=200s --disabled\",
0.105 | 34659: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2 diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-mlm1o2.orig returned 0 -> CIB updated\",
0.105 | 34659: \"Debug: location_rule_create: constraint location openstack-cinder-backup rule resource-discovery=exclusive score=0 cinder-backup-role eq true\",
0.105 | 34659: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e returned \",
0.105 | 34659: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e constraint location openstack-cinder-backup rule resource-discovery=exclusive score=0 cinder-backup-role eq true\",
0.105 | 34659: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11p3s5e.orig returned 0 -> CIB updated\",
0.105 | 34659: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 returned \",
0.105 | 34659: \"Debug:
0.239 | 34660: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 resource enable openstack-cinder-backup\",
0.239 | 34660: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1 diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-a51ft1.orig returned 0 -> CIB updated\",
0.239 | 34660: \"Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]/ensure: created\",
0.239 | 34660: \"Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Backup/Pacemaker::Resource::Service[openstack-cinder-backup]/Pacemaker::Resource::Systemd[openstack-cinder-backup]/Pcmk_resource[openstack-cinder-backup]: The container Pacemaker::Resource::Systemd[openstack-cinder-backup] will propagate my refresh event\",
0.239 | 34660: \"Debug: Pacemaker::Resource::Systemd[openstack-cinder-backup]: The container Pacemaker::Resource::Service[openstack-cinder-backup] will propagate my refresh event\",
0.239 | 34660: \"Debug: Pacemaker::Resource::Service[openstack-cinder-backup]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Backup] will propagate my refresh event\",
0.239 | 34660: \"Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Backup]: The container Stage[main] will propagate my refresh event\",
0.239 | 34660: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-870vjb returned \",
0.239 | 34660: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-870vjb constraint list | grep location-openstack-cinder-volume > /dev/null 2>&1\",
0.239 | 34660: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-31an3f returned \",
0.239 | 34660: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-31an3f resource show openstack-cinder-volume > /dev/null 2>&1\",
0.239 | 34660: \"Debug: backup_cib:
0.086 | 34661: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1to0ytg returned \",
0.086 | 34661: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1to0ytg constraint list | grep location-openstack-cinder-volume > /dev/null 2>&1\",
0.086 | 34661: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-7oghsb returned \",
0.086 | 34661: \"Debug: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-7oghsb resource show openstack-cinder-volume > /dev/null 2>&1\",
0.086 | 34661: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb returned \",
0.086 | 34661: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb resource create openstack-cinder-volume systemd:openstack-cinder-volume op start timeout=200s stop timeout=200s --disabled\",
0.086 | 34661: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-1fidyb.orig returned 0 -> CIB updated\",
0.086 | 34661: \"Debug: location_rule_create: constraint location openstack-cinder-volume rule resource-discovery=exclusive score=0 cinder-volume-role eq true\",
0.086 | 34661: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm returned \",
0.086 | 34661: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm constraint location openstack-cinder-volume rule resource-discovery=exclusive score=0 cinder-volume-role eq true\",
0.086 | 34661: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-lwn8hm.orig returned 0 -> CIB updated\",
0.086 | 34661: \"Debug: backup_cib: /usr/sbin/pcs cluster cib /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut returned
0.255 | 34662: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: \",
0.255 | 34662: \"Debug: try 1/10: /usr/sbin/pcs -f /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut resource enable openstack-cinder-volume\",
0.255 | 34662: \"Debug: push_cib: /usr/sbin/pcs cluster cib-push /var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut diff-against=/var/lib/pacemaker/cib/puppet-cib-backup20171108-119948-11le3ut.orig returned 0 -> CIB updated\",
0.255 | 34662: \"Notice: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]/ensure: created\",
0.255 | 34662: \"Debug: /Stage[main]/Tripleo::Profile::Pacemaker::Cinder::Volume/Pacemaker::Resource::Service[openstack-cinder-volume]/Pacemaker::Resource::Systemd[openstack-cinder-volume]/Pcmk_resource[openstack-cinder-volume]: The container Pacemaker::Resource::Systemd[openstack-cinder-volume] will propagate my refresh event\",
0.255 | 34662: \"Debug: Pacemaker::Resource::Systemd[openstack-cinder-volume]: The container Pacemaker::Resource::Service[openstack-cinder-volume] will propagate my refresh event\",
0.255 | 34662: \"Debug: Pacemaker::Resource::Service[openstack-cinder-volume]: The container Class[Tripleo::Profile::Pacemaker::Cinder::Volume] will propagate my refresh event\",
0.255 | 34662: \"Debug: Class[Tripleo::Profile::Pacemaker::Cinder::Volume]: The container Stage[main] will propagate my refresh event\",
0.255 | 34662: \"Debug: Finishing transaction 59572100\",
0.255 | 34662: \"Debug: Storing state\",
0.255 | 34662: \"Debug: Stored state in 0.07 seconds\",
0.255 | 34662: \"Notice: Applied catalog in 39.99 seconds\",
0.255 | 34662: \"Debug: Applying settings catalog for sections reporting, metrics\",
0.255 | 34662: \"Debug: Finishing transaction 107867560\",
0.255 | 34662: \"Debug: Received report to process from centos-7-rax-iad-0000787869.localdomain\",
0.255 | 34662: \"Debug: Processing report from centos-7-rax-iad-0000787869.localdomain with processor Puppet::Reports::Store\"
0.255 | 34662: ],
0.255 | 34662: \"failed_when_result\": false
0.255 | 34662: }
0.255 | 34662:
0.255 | 34662: TASK [R
0.000 | 34663: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: un docker-puppet tasks (generate config)] *******************************
0.000 | 34663: skipping: [localhost]
0.000 | 34663:
0.000 | 34663: TASK [debug] *******************************************************************
0.000 | 34663: ok: [localhost] => {
0.000 | 34663: \"(outputs.stderr|default('')).split('\
0.000 | 34663: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 34663: \"\"
0.000 | 34663: ],
0.000 | 34663: \"failed_when_result\": false
0.000 | 34663: }
0.000 | 34663:
0.000 | 34663: TASK [Check if /var/lib/hashed-tripleo-config/docker-container-startup-config-step_5.json exists] ***
0.000 | 34663: ok: [localhost]
0.000 | 34663:
0.000 | 34663: TASK [Start containers for step 5] *********************************************
0.000 | 34663: ok: [localhost]
0.000 | 34663:
0.000 | 34663: TASK [debug] *******************************************************************
0.000 | 34663: ok: [localhost] => {
0.000 | 34663: \"(outputs.stderr|default('')).split('\
0.000 | 34663: ')|union(outputs.stdout_lines|default([]))\": [
0.000 | 34663: \"stdout: 0e6617e3d7c758431a965971ff733027296d33bb028a062c0addfe76f3545538\",
0.000 | 34663: \"\",
0.000 | 34663: \"stderr: Unable to find image '192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-metricd:3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d' locally\",
0.000 | 34663: \"Trying to pull repository 192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-metricd ... \",
0.000 | 34663: \"3b85715c6488a692fcdf4e4c6a589d77c60eb5a6_a3792c8d: Pulling from 192.168.24.1:8787/tripleomaster/centos-binary-gnocchi-metricd\",
0.000 | 34663: \"d9aaf4d82f24: Already exists\",
0.000 | 34663: \"615fb2b6a1f1: Already exists\",
0.000 | 34663: \"3013007117c8: Already exists\",
0.000 | 34663: \"72133c850d33: Already exists\",
0.000 | 34663: \"c2baf92c99f8: Already exists\",
0.000 | 34663: \"c33a905d0cfb: Already exists\",
0.000 | 34663: \"0e2281a8f625: Already exists\",
0.000 | 34663: \"8db9532c7c2a: Already exists\",
0.000 | 34663: \"a2fdf405ce12: Already exists\",
0.000 | 34663: \"b4d23af701db: Already exists\",
0.000 | 34663: \"c0364d012ec6: Already exists\",
0.000 | 34663: \"5da3106f315c: Already exists\",
0.000 | 34663: \"7115c908a774: Already exists\",
0.000 | 34663: \"6bfb3cfd80b3: Already exists\",
0.000 | 34663: \"8d6928a9593d: Already exists\",
0.000 | 34663: \"26bc5dc8da6d: Already exists\",
0.000 | 34663: \"76a6f33737df: Already exists\", \

0.000 | 35449: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_insecure in JSON backend",
0.000 | 35450: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_api_ssl_compression in JSON backend",
0.000 | 35451: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::glance::glance_request_timeout in JSON backend",
0.324 | 35452: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/cinder/manifests/cron/db_purge.pp' in environment production",
0.219 | 35453: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported cinder::cron::db_purge from cinder/cron/db_purge into production",
0.228 | 35454: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cron::db_purge::minute in JSON backend",
0.203 | 35455: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cron::db_purge::hour in JSON backend",
0.228 | 35456: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cron::db_purge::monthday in JSON backend",
0.228 | 35457: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cron::db_purge::month in JSON backend",
0.228 | 35458: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cron::db_purge::weekday in JSON backend",
0.115 | 35459: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: hiera(): Looking up cinder::cron::db_purge::user in JSON backend",

0.000 | 35760: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/mysql/manifests/bindings/python.pp' in environment production",
0.000 | 35761: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported mysql::bindings::python from mysql/bindings/python into production",
0.146 | 35762: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: importing '/etc/puppet/modules/pacemaker/manifests/resource/systemd.pp' in environment production",
0.201 | 35763: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Automatically imported pacemaker::resource::systemd from pacemaker/resource/systemd into production",
0.000 | 35764: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Resource package[ceph-common] was not determined to be defined",

0.000 | 35901: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Service[pacemaker] to Exec[wait-for-settle] with 'before'",
0.000 | 35902: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from File[etc-pacemaker] to File[etc-pacemaker-authkey] with 'before'",
0.000 | 35903: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from File[etc-pacemaker-authkey] to Exec[Create Cluster tripleo_cluster] with 'before'",
0.315 | 35904: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Exec[wait-for-settle] to Pcmk_resource[openstack-cinder-backup] with 'before'",
0.256 | 35905: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Exec[wait-for-settle] to Pcmk_resource[openstack-cinder-volume] with 'before'",
0.000 | 35906: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: Adding relationship from Exec[wait-for-settle] to Pcmk_property[property--stonith-enabled] with 'before'",

0.000 | 36780: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[Start Cluster tripleo_cluster]/before: subscribes to Service[pacemaker]",
0.000 | 36781: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Stage[main]/Pacemaker::Corosync/File[etc-pacemaker]/before: subscribes to File[etc-pacemaker-authkey]",
0.000 | 36782: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Stage[main]/Pacemaker::Corosync/File[etc-pacemaker-authkey]/before: subscribes to Exec[Create Cluster tripleo_cluster]",
0.302 | 36783: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/before: subscribes to Pcmk_resource[openstack-cinder-backup]",
0.217 | 36784: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/before: subscribes to Pcmk_resource[openstack-cinder-volume]",
0.000 | 36785: Nov 8 21:09:39 centos-7-rax-iad-0000787869 os-collect-config: "Debug: /Stage[main]/Pacemaker::Corosync/Exec[wait-for-settle]/before: subscribes to Pcmk_property[property--stonith-enabled]",

logs/syslog.txt.gz
0.000 | 0506: Nov 08 21:31:58 centos-7-rax-iad-0000787514 sudo[23396]: zuul : TTY=unknown ; PWD=/opt/stack/new/tripleo-quickstart ; USER=root ; COMMAND=/bin/mkdir -p /opt/stack/new
0.000 | 0507: Nov 08 21:31:58 centos-7-rax-iad-0000787514 sudo[23398]: zuul : TTY=unknown ; PWD=/opt/stack/new/tripleo-quickstart ; USER=root ; COMMAND=/bin/cp -Rf /home/zuul/workspace/logs/undercloud/home/zuul/tempest /opt/stack/new
0.000 | 0508: Nov 08 21:31:58 centos-7-rax-iad-0000787514 sudo[23400]: zuul : TTY=unknown ; PWD=/opt/stack/new/tripleo-quickstart ; USER=root ; COMMAND=/bin/gzip -d -r /opt/stack/new/tempest/.testrepository
0.209 | 0509: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23449]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/opt/stack/new/devstack/tools/worlddump.py -d /opt/stack/logs
0.037 | 0510: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23460]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/ip netns exec qdhcp-35ab1a0b-e5c9-4f4a-b561-f643af1fffc8 ip addr

0.168 | 0515: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23471]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 show br-ctlplane
0.000 | 0516: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23473]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 show br-ex
0.000 | 0517: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23475]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 show br-int
0.240 | 0518: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23477]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-ports-desc br-ctlplane
0.000 | 0519: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23479]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-ports-desc br-ex
0.000 | 0520: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23481]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-ports-desc br-int
0.240 | 0521: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23483]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-ports br-ctlplane
0.000 | 0522: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23485]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-ports br-ex
0.000 | 0523: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23487]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-ports br-int
0.240 | 0524: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23489]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-flows br-ctlplane
0.000 | 0525: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23491]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-flows br-ex
0.000 | 0526: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23493]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/bin/ovs-ofctl --protocols=OpenFlow10,OpenFlow11,OpenFlow12,OpenFlow13 dump-flows br-int
0.309 | 0527: Nov 08 21:32:19 centos-7-rax-iad-0000787514 sudo[23495]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/iptables --line-numbers -L -nv -t filter
0.309 | 0528: Nov 08 21:32:20 centos-7-rax-iad-0000787514 sudo[23497]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/iptables --line-numbers -L -nv -t nat
0.309 | 0529: Nov 08 21:32:20 centos-7-rax-iad-0000787514 sudo[23499]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/iptables --line-numbers -L -nv -t mangle
0.330 | 0530: Nov 08 21:32:20 centos-7-rax-iad-0000787514 sudo[23501]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/ebtables -t filter -L
0.279 | 0531: Nov 08 21:32:20 centos-7-rax-iad-0000787514 kernel: Ebtables v2.0 registered
0.330 | 0532: Nov 08 21:32:20 centos-7-rax-iad-0000787514 sudo[23508]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/ebtables -t nat -L
0.000 | 0533: Nov 08 21:32:20 centos-7-rax-iad-0000787514 sudo[23512]: zuul : TTY=unknown ; PWD=/home/zuul/workspace ; USER=root ; COMMAND=/sbin/ebtables -t broute -L

logs/subnode-2/var/log/extra/docker/containers/cinder_scheduler/docker_info.log.txt.gz
0.000 | 0009: KiB Swap: 7999020 total, 7840888 free, 158132 used. 1334080 avail Mem
0.000 | 0010:
0.000 | 0011: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
0.433 | 0012: 1 cinder 20 0 188 0 0 S 0.0 0.0 0:00.03 kolla_start
0.170 | 0013: 9 cinder 20 0 797560 119632 9356 S 0.0 1.5 0:07.75 cinder-schedule

logs/subnode-2/etc/puppet/hieradata/service_names.json.txt.gz
0.000 | 0001: "sensu::subscriptions": [
0.000 | 0002: "overcloud-pacemaker",
0.293 | 0003: "overcloud-cinder-backup",
0.293 | 0004: "overcloud-cinder-volume"
0.000 | 0005: ],

logs/subnode-2/var/log/extra/docker/containers/galera-bundle-docker-0/docker_info.log.txt.gz
0.000 | 0012: KiB Swap: 7999020 total, 7824720 free, 174300 used. 1356716 avail Mem
0.000 | 0013:
0.000 | 0014: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
0.582 | 0015: 14599 root 20 0 12032 1964 1276 S 60.0 0.0 0:00.34 galera
0.000 | 0016: 1 root 20 0 75500 1968 1824 S 0.0 0.0 0:00.05 pacemake+

logs/delorean_logs/a7/7d/a77d26728da9220ffdb5b95f75ac5cd6cb56b0b0_dev/root.log.txt.gz
0.000 | 2976: DEBUG util.py:417: hard linking environments/predictable-placement/custom-hostnames.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/predictable-placement
0.000 | 2977: DEBUG util.py:417: hard linking environments/services/barbican.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.116 | 2978: DEBUG util.py:417: hard linking environments/services/ceilometer-api.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.258 | 2979: DEBUG util.py:417: hard linking environments/services/ceilometer-collector.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.258 | 2980: DEBUG util.py:417: hard linking environments/services/ceilometer-expirer.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.000 | 2981: DEBUG util.py:417: hard linking environments/services/ceph-mds.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services

logs/delorean_logs/a7/7d/a77d26728da9220ffdb5b95f75ac5cd6cb56b0b0_dev/rpmbuild.log.txt.gz
0.000 | 3542: hard linking environments/predictable-placement/custom-hostnames.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/predictable-placement
0.000 | 3543: hard linking environments/services/barbican.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.106 | 3544: hard linking environments/services/ceilometer-api.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.246 | 3545: hard linking environments/services/ceilometer-collector.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.240 | 3546: hard linking environments/services/ceilometer-expirer.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services
0.000 | 3547: hard linking environments/services/ceph-mds.yaml -> tripleo-heat-templates-8.0.0.0b2.dev144/environments/services

logs/subnode-2/pip2-freeze.txt.gz
0.000 | 0034: futures==3.0.3
0.000 | 0035: futurist==1.4.0
0.000 | 0036: glean==1.10.1
0.308 | 0037: google-api-python-client==1.4.2
0.000 | 0038: greenlet==0.4.9

0.000 | 0088: oslo.messaging==5.33.1
0.000 | 0089: oslo.middleware==3.32.1
0.000 | 0090: oslo.policy==1.29.0
0.400 | 0091: oslo.privsep==1.23.0
0.000 | 0092: oslo.reports==1.24.0

0.000 | 0094: oslo.serialization==2.21.1
0.000 | 0095: oslo.service==1.26.0
0.000 | 0096: oslo.utils==3.30.0
0.400 | 0097: oslo.versionedobjects==1.29.0
0.000 | 0098: osprofiler==1.13.0

0.000 | 0129: PySocks==1.5.6
0.000 | 0130: pystache==0.5.3
0.000 | 0131: python-barbicanclient==4.5.2
0.574 | 0132: python-cinderclient==3.2.0
0.000 | 0133: python-dateutil==2.4.2
0.000 | 0134: python-editor==0.4
0.574 | 0135: python-gflags==2.0
0.000 | 0136: python-glanceclient==2.8.0

logs/subnode-2/var/log/extra/docker/containers/nova_virtlogd/stdout.log.txt.gz
0.000 | 0022: INFO:__main__:Deleting /etc/issue
0.000 | 0023: INFO:__main__:Copying /var/lib/kolla/config_files/src/etc/issue to /etc/issue
0.000 | 0024: INFO:__main__:Writing out command to execute
0.490 | 0025: INFO:__main__:Setting permission for /var/log/nova
0.000 | 0026: ++ cat /run_command

logs/subnode-2/var/log/config-data/sensu/etc/sensu/conf.d/client.json.txt.gz
0.000 | 0038: },
0.000 | 0039: "subscriptions": [
0.000 | 0040: "overcloud-pacemaker",
0.293 | 0041: "overcloud-cinder-backup",
0.293 | 0042: "overcloud-cinder-volume"
0.000 | 0043: ]

logs/subnode-2/var/log/extra/pip.txt.gz
0.000 | 0035: futures (3.0.3)
0.000 | 0036: futurist (1.4.0)
0.000 | 0037: glean (1.10.1)
0.309 | 0038: google-api-python-client (1.4.2)
0.000 | 0039: greenlet (0.4.9)

0.000 | 0134: python-cinderclient (3.2.0)
0.000 | 0135: python-dateutil (2.4.2)
0.000 | 0136: python-editor (0.4)
0.631 | 0137: python-gflags (2.0)
0.000 | 0138: python-glanceclient (2.8.0)

logs/subnode-2/var/log/cluster/corosync.log.txt.gz
0.000 | 3315: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: --- 0.40.0 2
0.000 | 3316: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: +++ 0.41.0 (null)
0.000 | 3317: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: + /cib: @epoch=41
0.514 | 3318: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ /cib/configuration/resources: <primitive class="systemd" id="openstack-cinder-backup" type="openstack-cinder-backup"/>
0.209 | 3319: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <meta_attributes id="openstack-cinder-backup-meta_attributes">
0.103 | 3320: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <nvpair id="openstack-cinder-backup-meta_attributes-target-role" name="target-role" value="Stopped"/>

0.000 | 3350: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: ip-192.168.24.13 (ocf::heartbeat:IPaddr2): Started centos-7-rax-iad-0000787869
0.000 | 3351: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: container_print: Docker container: haproxy-bundle [192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest]
0.000 | 3352: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: haproxy-bundle-docker-0 (ocf::heartbeat:docker): Started centos-7-rax-iad-0000787869
0.497 | 3353: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-backup (systemd:openstack-cinder-backup): Stopped (disabled)
0.000 | 3354: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: Promoting galera:0 (Master galera-bundle-0)

0.000 | 3368: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave ip-192.168.24.14 (Started centos-7-rax-iad-0000787869)
0.000 | 3369: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave ip-192.168.24.13 (Started centos-7-rax-iad-0000787869)
0.000 | 3370: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave haproxy-bundle-docker-0 (Started centos-7-rax-iad-0000787869)
0.323 | 3371: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave openstack-cinder-backup (Stopped)
0.000 | 3372: Nov 08 21:08:43 [12541] centos-7-rax-iad-0000787869 pengine: notice: process_pe_message: Calculated transition 41, saving inputs in /var/lib/pacemaker/pengine/pe-input-40.bz2

0.000 | 3381: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: --- 0.41.0 2
0.000 | 3382: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: +++ 0.41.1 (null)
0.000 | 3383: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: + /cib: @num_updates=1
0.426 | 3384: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources: <lrm_resource id="openstack-cinder-backup" type="openstack-cinder-backup" class="systemd"/>
0.120 | 3385: Nov 08 21:08:43 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <lrm_rsc_op id="openstack-cinder-backup_last_0" operation_key="openstack-cinder-backup_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.12" transition-key="14:41:7:fb774f5a-3e3c-4b5a-b425-d3d6a6be210f" transition-magic="0:7;14:41:7:fb774f5a-3e3c-4b5a-b425-d3d6a6be210f" on_node="centos-7-rax-iad-0000787869" call-i

0.000 | 3393: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: --- 0.41.1 2
0.000 | 3394: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: +++ 0.42.0 (null)
0.000 | 3395: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: + /cib: @epoch=42, @num_updates=0
0.268 | 3396: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ /cib/configuration/constraints: <rsc_location id="location-openstack-cinder-backup" resource-discovery="exclusive" rsc="openstack-cinder-backup"/>
0.132 | 3397: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <rule id="location-openstack-cinder-backup-rule" score="0">
0.290 | 3398: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <expression attribute="cinder-backup-role" id="location-openstack-cinder-backup-rule-expr" operation="eq" value="true"/>
0.000 | 3399: Nov 08 21:08:45 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ </rule>

0.000 | 3453: Nov 08 21:08:47 [12537] centos-7-rax-iad-0000787869 cib: info: cib_process_request: Forwarding cib_apply_diff operation for section 'all' to all (origin=local/cibadmin/2)
0.000 | 3454: Nov 08 21:08:47 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: --- 0.42.0 2
0.000 | 3455: Nov 08 21:08:47 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: +++ 0.43.0 (null)
0.358 | 3456: Nov 08 21:08:47 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: -- /cib/configuration/resources/primitive[@id='openstack-cinder-backup']/meta_attributes[@id='openstack-cinder-backup-meta_attributes']
0.000 | 3457: Nov 08 21:08:47 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: + /cib: @epoch=43
0.000 | 3458: Nov 08 21:08:47 [12537] centos-7-rax-iad-0000787869 cib: info: cib_process_request: Completed cib_apply_diff operation for section 'all': OK (rc=0, origin=centos-7-rax-iad-0000787869/cibadmin/2, version=0.43.0)
0.256 | 3459: Nov 08 21:08:47 [12542] centos-7-rax-iad-0000787869 crmd: info: abort_transition_graph: Transition aborted by deletion of meta_attributes[@id='openstack-cinder-backup-meta_attributes']: Configuration change | cib=0.43.0 source=te_update_diff:456 path=/cib/configuration/resources/primitive[@id='openstack-cinder-backup']/meta_attributes[@id='openstack-cinder-backup-meta_attributes'] complete=true
0.000 | 3460: Nov 08 21:08:47 [12542] centos-7-rax-iad-0000787869 crmd: notice: do_state_transition: State transition S_IDLE -> S_POLICY_ENGINE | input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph

0.000 | 3480: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: ip-192.168.24.13 (ocf::heartbeat:IPaddr2): Started centos-7-rax-iad-0000787869
0.000 | 3481: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: container_print: Docker container: haproxy-bundle [192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest]
0.000 | 3482: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: haproxy-bundle-docker-0 (ocf::heartbeat:docker): Started centos-7-rax-iad-0000787869
0.552 | 3483: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-backup (systemd:openstack-cinder-backup): Stopped
0.000 | 3484: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: Promoting galera:0 (Master galera-bundle-0)
0.000 | 3485: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: galera-bundle-master: Promoted 1 instances of a possible 1 to master
0.000 | 3486: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: Promoting redis:0 (Master redis-bundle-0)
0.000 | 3487: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: redis-bundle-master: Promoted 1 instances of a possible 1 to master
0.250 | 3488: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: RecurringOp: Start recurring monitor (60s) for openstack-cinder-backup on centos-7-rax-iad-0000787869
0.000 | 3489: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave rabbitmq-bundle-docker-0 (Started centos-7-rax-iad-0000787869)

0.000 | 3498: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave ip-192.168.24.14 (Started centos-7-rax-iad-0000787869)
0.000 | 3499: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave ip-192.168.24.13 (Started centos-7-rax-iad-0000787869)
0.000 | 3500: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave haproxy-bundle-docker-0 (Started centos-7-rax-iad-0000787869)
0.276 | 3501: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: notice: LogAction: * Start openstack-cinder-backup ( centos-7-rax-iad-0000787869 )
0.000 | 3502: Nov 08 21:08:47 [12541] centos-7-rax-iad-0000787869 pengine: notice: process_pe_message: Calculated transition 43, saving inputs in /var/lib/pacemaker/pengine/pe-input-42.bz2

0.065 | 3505: Nov 08 21:08:47 [12542] centos-7-rax-iad-0000787869 crmd: notice: te_rsc_command: Initiating start operation openstack-cinder-backup_start_0 locally on centos-7-rax-iad-0000787869 | action 110
0.114 | 3506: Nov 08 21:08:47 [12542] centos-7-rax-iad-0000787869 crmd: info: do_lrm_rsc_op: Performing key=110:43:0:fb774f5a-3e3c-4b5a-b425-d3d6a6be210f op=openstack-cinder-backup_start_0
0.168 | 3507: Nov 08 21:08:47 [12539] centos-7-rax-iad-0000787869 lrmd: info: log_execute: executing - rsc:openstack-cinder-backup action:start call_id:57
0.541 | 3508: Nov 08 21:08:47 [12539] centos-7-rax-iad-0000787869 lrmd: info: systemd_exec_result: Call to start passed: /org/freedesktop/systemd1/job/3127
0.202 | 3509: Nov 08 21:08:50 [12542] centos-7-rax-iad-0000787869 crmd: notice: process_lrm_event: Result of start operation for openstack-cinder-backup on centos-7-rax-iad-0000787869: 0 (ok) | call=57 key=openstack-cinder-backup_start_0 confirmed=true cib-update=473
0.000 | 3510: Nov 08 21:08:50 [12537] centos-7-rax-iad-0000787869 cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/473)

0.096 | 3516: Nov 08 21:08:50 [12542] centos-7-rax-iad-0000787869 crmd: info: match_graph_event: Action openstack-cinder-backup_start_0 (110) confirmed on centos-7-rax-iad-0000787869 (rc=0)
0.000 | 3517: Nov 08 21:08:50 [12542] centos-7-rax-iad-0000787869 crmd: notice: te_rsc_command: Initiating monitor operation openstack-cinder-backup_monitor_60000 locally on centos-7-rax-iad-0000787869 | action 111
0.000 | 3518: Nov 08 21:08:50 [12542] centos-7-rax-iad-0000787869 crmd: info: do_lrm_rsc_op: Performing key=111:43:0:fb774f5a-3e3c-4b5a-b425-d3d6a6be210f op=openstack-cinder-backup_monitor_60000
0.207 | 3519: Nov 08 21:08:50 [12542] centos-7-rax-iad-0000787869 crmd: info: process_lrm_event: Result of monitor operation for openstack-cinder-backup on centos-7-rax-iad-0000787869: 0 (ok) | call=58 key=openstack-cinder-backup_monitor_60000 confirmed=false cib-update=474
0.000 | 3520: Nov 08 21:08:50 [12537] centos-7-rax-iad-0000787869 cib: info: cib_process_request: Forwarding cib_modify operation for section status to all (origin=local/crmd/474)

0.000 | 3531: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: --- 0.43.2 2
0.000 | 3532: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: +++ 0.44.0 (null)
0.000 | 3533: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: + /cib: @epoch=44, @num_updates=0
0.257 | 3534: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ /cib/configuration/resources: <primitive class="systemd" id="openstack-cinder-volume" type="openstack-cinder-volume"/>
0.000 | 3535: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <meta_attributes id="openstack-cinder-volume-meta_attributes">

0.000 | 3566: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: ip-192.168.24.13 (ocf::heartbeat:IPaddr2): Started centos-7-rax-iad-0000787869
0.000 | 3567: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: container_print: Docker container: haproxy-bundle [192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest]
0.000 | 3568: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: haproxy-bundle-docker-0 (ocf::heartbeat:docker): Started centos-7-rax-iad-0000787869
0.571 | 3569: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-backup (systemd:openstack-cinder-backup): Started centos-7-rax-iad-0000787869
0.248 | 3570: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-volume (systemd:openstack-cinder-volume): Stopped (disabled)
0.000 | 3571: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: Promoting galera:0 (Master galera-bundle-0)

0.000 | 3585: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave ip-192.168.24.14 (Started centos-7-rax-iad-0000787869)
0.000 | 3586: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave ip-192.168.24.13 (Started centos-7-rax-iad-0000787869)
0.000 | 3587: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave haproxy-bundle-docker-0 (Started centos-7-rax-iad-0000787869)
0.350 | 3588: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave openstack-cinder-backup (Started centos-7-rax-iad-0000787869)
0.032 | 3589: Nov 08 21:08:54 [12541] centos-7-rax-iad-0000787869 pengine: info: LogActions: Leave openstack-cinder-volume (Stopped)

0.000 | 3599: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: --- 0.44.0 2
0.000 | 3600: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: Diff: +++ 0.44.1 (null)
0.000 | 3601: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: + /cib: @num_updates=1
0.234 | 3602: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ /cib/status/node_state[@id='1']/lrm[@id='1']/lrm_resources: <lrm_resource id="openstack-cinder-volume" type="openstack-cinder-volume" class="systemd"/>
0.120 | 3603: Nov 08 21:08:54 [12537] centos-7-rax-iad-0000787869 cib: info: cib_perform_op: ++ <lrm_rsc_op id="openstack-cinder-volume_last_0" operation_key="openstack-cinder-volume_monitor_0" operation="monitor" crm-debug-origin="do_update_resource" crm_feature_set="3.0.12" transition-key="15:44:7:fb774f5a-3e3c-4b5a-b425-d3d6a6be210f" transition-magic="0:7;15:44:7:fb774f5a-3e3c-4b5a-b425-d3d6a6be210f" on_node="centos-7-rax-iad-0000787869" call-i

0.000 | 3699: Nov 08 21:08:59 [12541] centos-7-rax-iad-0000787869 pengine: info: container_print: Docker container: haproxy-bundle [192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest]
0.000 | 3700: Nov 08 21:08:59 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: haproxy-bundle-docker-0 (ocf::heartbeat:docker): Started centos-7-rax-iad-0000787869
0.000 | 3701: Nov 08 21:08:59 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-backup (systemd:openstack-cinder-backup): Started centos-7-rax-iad-0000787869
0.269 | 3702: Nov 08 21:08:59 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-volume (systemd:openstack-cinder-volume): Stopped
0.000 | 3703: Nov 08 21:08:59 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: Promoting galera:0 (Master galera-bundle-0)

0.000 | 4007: Nov 08 21:24:01 [12541] centos-7-rax-iad-0000787869 pengine: info: container_print: Docker container: haproxy-bundle [192.168.24.1:8787/tripleomaster/centos-binary-haproxy:pcmklatest]
0.000 | 4008: Nov 08 21:24:01 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: haproxy-bundle-docker-0 (ocf::heartbeat:docker): Started centos-7-rax-iad-0000787869
0.000 | 4009: Nov 08 21:24:01 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-backup (systemd:openstack-cinder-backup): Started centos-7-rax-iad-0000787869
0.278 | 4010: Nov 08 21:24:01 [12541] centos-7-rax-iad-0000787869 pengine: info: common_print: openstack-cinder-volume (systemd:openstack-cinder-volume): Started centos-7-rax-iad-0000787869
0.000 | 4011: Nov 08 21:24:01 [12541] centos-7-rax-iad-0000787869 pengine: info: master_color: Promoting galera:0 (Master galera-bundle-0)

logs/subnode-2/var/log/extra/docker/containers/nova_compute/stdout.log.txt.gz
0.000 | 0027: INFO:__main__:Copying /var/lib/kolla/config_files/src-iscsid/etc/iscsi/initiatorname.iscsi to /etc/iscsi/initiatorname.iscsi
0.000 | 0028: INFO:__main__:Copying /var/lib/kolla/config_files/src-ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
0.000 | 0029: INFO:__main__:Copying /var/lib/kolla/config_files/src-ceph/ceph.client.admin.keyring to /etc/ceph/ceph.client.admin.keyring
0.340 | 0030: INFO:__main__:Deleting /etc/ceph/rbdmap
0.104 | 0031: INFO:__main__:Copying /var/lib/kolla/config_files/src-ceph/rbdmap to /etc/ceph/rbdmap

logs/undercloud/var/log/secure.txt.gz
0.000 | 1096: Nov 8 19:39:48 centos-7-rax-iad-0000787514 sshd[21123]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.89.90.28
0.000 | 1097: Nov 8 19:39:50 centos-7-rax-iad-0000787514 sshd[21123]: Failed password for invalid user test from 103.89.90.28 port 63350 ssh2
0.000 | 1098: Nov 8 19:39:53 centos-7-rax-iad-0000787514 sshd[21123]: Connection reset by 103.89.90.28 port 63350 [preauth]
0.236 | 1099: Nov 8 19:40:05 centos-7-rax-iad-0000787514 unix_chkpwd[21127]: password check failed for user (nobody)
0.136 | 1100: Nov 8 19:40:05 centos-7-rax-iad-0000787514 sshd[21125]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.89.90.28 user=nobody
0.358 | 1101: Nov 8 19:40:06 centos-7-rax-iad-0000787514 sshd[21125]: Failed password for nobody from 103.89.90.28 port 63044 ssh2
0.000 | 1102: Nov 8 19:40:07 centos-7-rax-iad-0000787514 sshd[21125]: Connection reset by 103.89.90.28 port 63044 [preauth]

0.000 | 2555: Nov 8 21:28:22 centos-7-rax-iad-0000787514 sudo: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgeplrnglzdgcexqpdtctrvceytgcnpf; /usr/bin/python
0.000 | 2556: Nov 8 21:28:53 centos-7-rax-iad-0000787514 sshd[21577]: pam_unix(sshd:auth): check pass; user unknown
0.000 | 2557: Nov 8 21:28:53 centos-7-rax-iad-0000787514 sshd[21577]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=serversiro.tk
0.214 | 2558: Nov 8 21:29:26 centos-7-rax-iad-0000787514 unix_chkpwd[21591]: password check failed for user (apache)
0.121 | 2559: Nov 8 21:29:26 centos-7-rax-iad-0000787514 sshd[21589]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=serversiro.tk user=apache

logs/undercloud/home/zuul/tempest-setup.sh.txt.gz
0.000 | 0093: ${TEMPESTCONF} --out etc/tempest.conf \
0.000 | 0094: --network-id $public_net_id \
0.000 | 0095: --deployer-input ~/tempest-deployer-input.conf \
0.441 | 0096: --image http://download.cirros-cloud.net/0.3.5/cirros-0.3.5-x86_64-disk.img \
0.000 | 0097: --debug \

logs/subnode-2/var/log/config-data/neutron/etc/neutron/plugins/ml2/ml2_conf.ini.txt.gz
0.000 | 0258:
0.000 | 0259: # Driver for security groups firewall in the L2 agent (string value)
0.000 | 0260: #firewall_driver = <None>
0.427 | 0261: firewall_driver=openvswitch
0.000 | 0262:

logs/subnode-2/var/log/extra/docker/containers/neutron_ovs_agent/docker_info.log.txt.gz
0.000 | 0013:
0.000 | 0014: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
0.000 | 0015: 205990 rabbitmq 20 0 969140 29584 2856 S 105.9 0.4 0:00.48 beam.smp
0.311 | 0016: 206147 root 20 0 111188 15776 4192 R 82.4 0.2 0:00.14 ceilometer-root
0.000 | 0017: 206060 root 20 0 85776 2728 2072 S 11.8 0.0 0:00.34 sudo

0.000 | 0484: 202413 root 20 0 11596 1496 1280 S 0.0 0.0 0:00.03 sh
0.000 | 0485: 205886 ceilome+ 20 0 435552 53068 1656 S 0.0 0.6 0:00.01 ceilometer-poll
0.000 | 0486: 205961 root 20 0 11900 1808 1268 S 0.0 0.0 0:00.30 rabbitmq-cluste
0.534 | 0487: 205978 root 20 0 11636 1452 1232 S 0.0 0.0 0:00.00 rabbitmqctl
0.000 | 0488: 205989 root 20 0 79800 2088 1556 S 0.0 0.0 0:00.00 su

logs/subnode-2/var/log/extra/rpm-list.txt.gz
0.000 | 0017: bind-utils-9.9.4-51.el7.x86_64
0.000 | 0018: binutils-2.25.1-32.base.el7_4.1.x86_64
0.000 | 0019: boost-iostreams-1.53.0-27.el7.x86_64
0.368 | 0020: boost-program-options-1.53.0-27.el7.x86_64
0.000 | 0021: boost-random-1.53.0-27.el7.x86_64
0.368 | 0022: boost-regex-1.53.0-27.el7.x86_64
0.000 | 0023: boost-system-1.53.0-27.el7.x86_64

0.000 | 0029: centos-release-qemu-ev-1.0-2.el7.noarch
0.000 | 0030: centos-release-storage-common-1-2.el7.centos.noarch
0.000 | 0031: centos-release-virt-common-1-1.el7.centos.noarch
0.275 | 0032: ceph-common-10.2.7-0.el7.x86_64
0.000 | 0033: checkpolicy-2.5-4.el7.x86_64

logs/subnode-2/var/log/extra/docker/containers/aodh_api/log/httpd/aodh_wsgi_access.log.txt.gz
0.000 | 0218: 192.168.24.3 - - [08/Nov/2017:21:13:59 +0000] "OPTIONS / HTTP/1.0" 200 465 "-" "-"
0.000 | 0219: 192.168.24.3 - - [08/Nov/2017:21:14:01 +0000] "OPTIONS / HTTP/1.0" 200 465 "-" "-"
0.000 | 0220: 192.168.24.3 - - [08/Nov/2017:21:14:03 +0000] "OPTIONS / HTTP/1.0" 200 465 "-" "-"
0.417 | 0221: 192.168.24.3 - - [08/Nov/2017:21:14:04 +0000] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.4.6 (CentOS) OpenSSL/1.0.2k-fips mod_wsgi/3.4 Python/2.7.5 (internal dummy connection)"
0.000 | 0222: 192.168.24.3 - - [08/Nov/2017:21:14:05 +0000] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.4.6 (CentOS) OpenSSL/1.0.2k-fips mod_wsgi/3.4 Python/2.7.5 (internal dummy connection)"

logs/subnode-2/var/log/containers/httpd/aodh-api/aodh_wsgi_access.log.txt.gz
0.000 | 0218: 192.168.24.3 - - [08/Nov/2017:21:13:59 +0000] "OPTIONS / HTTP/1.0" 200 465 "-" "-"
0.000 | 0219: 192.168.24.3 - - [08/Nov/2017:21:14:01 +0000] "OPTIONS / HTTP/1.0" 200 465 "-" "-"
0.000 | 0220: 192.168.24.3 - - [08/Nov/2017:21:14:03 +0000] "OPTIONS / HTTP/1.0" 200 465 "-" "-"
0.417 | 0221: 192.168.24.3 - - [08/Nov/2017:21:14:04 +0000] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.4.6 (CentOS) OpenSSL/1.0.2k-fips mod_wsgi/3.4 Python/2.7.5 (internal dummy connection)"
0.000 | 0222: 192.168.24.3 - - [08/Nov/2017:21:14:05 +0000] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.4.6 (CentOS) OpenSSL/1.0.2k-fips mod_wsgi/3.4 Python/2.7.5 (internal dummy connection)"

logs/undercloud/var/log/journal.txt.gz
0.000 | 3930: Nov 08 19:37:26 centos-7-rax-iad-0000787514 sshd[21041]: Connection reset by 103.89.90.28 port 63145 [preauth]
0.156 | 3931: Nov 08 19:37:32 centos-7-rax-iad-0000787514 unix_chkpwd[21090]: password check failed for user (sync)
0.054 | 3932: Nov 08 19:37:32 centos-7-rax-iad-0000787514 sshd[21056]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.89.90.28 user=sync
0.251 | 3933: Nov 08 19:37:34 centos-7-rax-iad-0000787514 sshd[21056]: Failed password for sync from 103.89.90.28 port 63162 ssh2
0.000 | 3934: Nov 08 19:37:35 centos-7-rax-iad-0000787514 sshd[21056]: Connection reset by 103.89.90.28 port 63162 [preauth]

0.000 | 4011: Nov 08 19:39:48 centos-7-rax-iad-0000787514 sshd[21123]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.89.90.28
0.000 | 4012: Nov 08 19:39:50 centos-7-rax-iad-0000787514 sshd[21123]: Failed password for invalid user test from 103.89.90.28 port 63350 ssh2
0.000 | 4013: Nov 08 19:39:53 centos-7-rax-iad-0000787514 sshd[21123]: Connection reset by 103.89.90.28 port 63350 [preauth]
0.229 | 4014: Nov 08 19:40:05 centos-7-rax-iad-0000787514 unix_chkpwd[21127]: password check failed for user (nobody)
0.095 | 4015: Nov 08 19:40:05 centos-7-rax-iad-0000787514 sshd[21125]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.89.90.28 user=nobody
0.345 | 4016: Nov 08 19:40:06 centos-7-rax-iad-0000787514 sshd[21125]: Failed password for nobody from 103.89.90.28 port 63044 ssh2
0.000 | 4017: Nov 08 19:40:07 centos-7-rax-iad-0000787514 sshd[21125]: Connection reset by 103.89.90.28 port 63044 [preauth]

0.000 | 37267: Nov 08 20:39:49 centos-7-rax-iad-0000787514 unix_chkpwd[14785]: password check failed for user (root)
0.000 | 37268: Nov 08 20:39:49 centos-7-rax-iad-0000787514 sshd[14783]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=serversiro.tk user=root
0.000 | 37269: Nov 08 20:39:50 centos-7-rax-iad-0000787514 sshd[14783]: Failed password for root from 94.177.218.11 port 37659 ssh2
0.330 | 37270: Nov 08 20:39:50 centos-7-rax-iad-0000787514 sshd[14783]: Received disconnect from 94.177.218.11 port 37659:11: Normal Shutdown, Thank you for playing [preauth]
0.000 | 37271: Nov 08 20:39:50 centos-7-rax-iad-0000787514 sshd[14783]: Disconnected from 94.177.218.11 port 37659 [preauth]

0.000 | 39049: Nov 08 21:04:41 centos-7-rax-iad-0000787514 object-server[6469]: Starting object reconstruction pass.
0.000 | 39050: Nov 08 21:04:41 centos-7-rax-iad-0000787514 object-server[6469]: Nothing reconstructed for 0.000617980957031 seconds.
0.000 | 39051: Nov 08 21:04:41 centos-7-rax-iad-0000787514 object-server[6469]: Object reconstruction complete. (0.00 minutes)
0.207 | 39052: Nov 08 21:04:41 centos-7-rax-iad-0000787514 unix_chkpwd[17652]: password check failed for user (jenkins)
0.082 | 39053: Nov 08 21:04:41 centos-7-rax-iad-0000787514 sshd[17650]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=serversiro.tk user=jenkins
0.317 | 39054: Nov 08 21:04:42 centos-7-rax-iad-0000787514 sshd[17650]: Failed password for jenkins from 94.177.218.11 port 33376 ssh2
0.000 | 39055: Nov 08 21:04:43 centos-7-rax-iad-0000787514 sshd[17650]: Received disconnect from 94.177.218.11 port 33376:11: Normal Shutdown, Thank you for playing [preauth]

0.000 | 40750: Nov 08 21:25:06 centos-7-rax-iad-0000787514 systemd[1]: Started Session 328 of user zuul.
0.000 | 40751: Nov 08 21:25:06 centos-7-rax-iad-0000787514 sshd[19562]: pam_unix(sshd:session): session opened for user zuul by (uid=0)
0.000 | 40752: Nov 08 21:25:06 centos-7-rax-iad-0000787514 systemd[1]: Starting Session 328 of user zuul.
0.209 | 40753: Nov 08 21:25:06 centos-7-rax-iad-0000787514 ansible-command[19576]: Invoked with warn=True executable=None _uses_shell=True _raw_params=tail -10 tempest_output.log; exit 1 removes=None creates=None chdir=None
0.000 | 40754: Nov 08 21:25:06 centos-7-rax-iad-0000787514 sshd[19565]: Received disconnect from 127.0.0.1 port 56374:11: disconnected by user

logs/subnode-2/var/log/dmesg.txt.gz
0.000 | 0589: [ 4.581114] EDAC sbridge: Ver: 1.1.1
0.000 | 0590: [ 7.610812] type=1305 audit(1510168497.428:4): audit_pid=479 old=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:auditd_t:s0 res=1
0.000 | 0591: [ 7.872036] ISO 9660 Extensions: Microsoft Joliet Level 3
0.388 | 0592: [ 7.873165] nf_conntrack version 0.5.0 (65536 buckets, 262144 max)
0.000 | 0593: [ 7.881455] ISO 9660 Extensions: RRIP_1991A

logs/undercloud/home/zuul/overcloud_deploy.log.txt.gz
0.000 | 0465: 2017-11-08 20:44:30 | 2017-11-08 20:36:34Z [overcloud.AllNodesDeploySteps.ControllerArtifactsDeploy]: CREATE_COMPLETE Stack CREATE completed successfully
0.000 | 0466: 2017-11-08 20:44:30 | 2017-11-08 20:36:35Z [overcloud.AllNodesDeploySteps.ControllerArtifactsDeploy]: CREATE_COMPLETE state changed
0.119 | 0467: 2017-11-08 20:44:30 | 2017-11-08 20:37:15Z [0]: SIGNAL_IN_PROGRESS Signal: deployment 17846122-46f7-4695-8228-f341c4e17eae succeeded
0.266 | 0468: 2017-11-08 20:44:30 | 2017-11-08 20:37:16Z [0]: CREATE_COMPLETE state changed
0.000 | 0469: 2017-11-08 20:44:30 | 2017-11-08 20:37:17Z [overcloud.AllNodesDeploySteps.ControllerHostPrepDeployment]: CREATE_COMPLETE Stack CREATE completed successfully

logs/subnode-2/rpm-qa.txt.gz
0.000 | 0017: bind-utils-9.9.4-51.el7.x86_64
0.000 | 0018: binutils-2.25.1-32.base.el7_4.1.x86_64
0.000 | 0019: boost-iostreams-1.53.0-27.el7.x86_64
0.367 | 0020: boost-program-options-1.53.0-27.el7.x86_64
0.000 | 0021: boost-random-1.53.0-27.el7.x86_64
0.367 | 0022: boost-regex-1.53.0-27.el7.x86_64
0.000 | 0023: boost-system-1.53.0-27.el7.x86_64

0.000 | 0029: centos-release-qemu-ev-1.0-2.el7.noarch
0.000 | 0030: centos-release-storage-common-1-2.el7.centos.noarch
0.000 | 0031: centos-release-virt-common-1-1.el7.centos.noarch
0.263 | 0032: ceph-common-10.2.7-0.el7.x86_64
0.000 | 0033: checkpolicy-2.5-4.el7.x86_64

0.000 | 0138: glibc-common-2.17-196.el7.x86_64
0.000 | 0139: glibc-devel-2.17-196.el7.x86_64
0.000 | 0140: glibc-headers-2.17-196.el7.x86_64
0.201 | 0141: glusterfs-3.8.4-18.4.el7.centos.x86_64
0.000 | 0142: glusterfs-api-3.8.4-18.4.el7.centos.x86_64
0.273 | 0143: glusterfs-client-xlators-3.8.4-18.4.el7.centos.x86_64
0.291 | 0144: glusterfs-libs-3.8.4-18.4.el7.centos.x86_64
0.000 | 0145: gmp-6.0.0-15.el7.x86_64

0.000 | 0322: openssh-server-7.4p1-13.el7_4.x86_64
0.000 | 0323: openssl-1.0.2k-8.el7.x86_64
0.000 | 0324: openssl-libs-1.0.2k-8.el7.x86_64
0.267 | 0325: openstack-cinder-12.0.0-0.20171107135501.fb27334.el7.centos.noarch
0.000 | 0326: openstack-selinux-0.8.11-0.20171013194326.ce13ba7.el7.centos.noarch

0.000 | 0627: python-netaddr-0.7.18-1.el7.noarch
0.000 | 0628: python-netifaces-0.10.4-3.el7.x86_64
0.000 | 0629: python-networkx-1.10-1.el7.noarch
0.258 | 0630: python-networkx-core-1.10-1.el7.noarch
0.000 | 0631: python-nose-1.3.7-7.el7.noarch

logs/subnode-2/var/log/extra/yum-list-installed.txt.gz
0.000 | 0023: bind-utils.x86_64 32:9.9.4-51.el7 @updates
0.000 | 0024: binutils.x86_64 2.25.1-32.base.el7_4.1 @updates
0.000 | 0025: boost-iostreams.x86_64 1.53.0-27.el7 @quickstart-centos-base
0.310 | 0026: boost-program-options.x86_64 1.53.0-27.el7 @quickstart-centos-base
0.000 | 0027: boost-random.x86_64 1.53.0-27.el7 @quickstart-centos-base
0.310 | 0028: boost-regex.x86_64 1.53.0-27.el7 @quickstart-centos-base
0.000 | 0029: boost-system.x86_64 1.53.0-27.el7 @quickstart-centos-base

0.000 | 0648: python-kazoo.noarch 2.2.1-1.el7 @delorean-queens-deps
0.000 | 0649: python-keyring.noarch 5.7.1-1.el7 @delorean-queens-testing
0.000 | 0650: python-kitchen.noarch 1.1.1-5.el7 @base
0.259 | 0651: python-kmod.x86_64 0.9-4.el7 @quickstart-centos-base
0.000 | 0652: python-libs.x86_64 2.7.5-58.el7 @base

logs/undercloud/var/log/neutron/dhcp-agent.log.txt.gz
0.000 | 0511: 2017-11-08 20:25:54.737 1876 DEBUG neutron.agent.linux.dhcp [req-42ac3418-7957-46e0-a067-f8d2504ef915 413e6e63e2364d448b63d2aa8a4751a9 1aee66fe88ea43ad9892206dfd8ab59e - - -] Building host file: /var/lib/neutron/dhcp/35ab1a0b-e5c9-4f4a-b561-f643af1fffc8/host _output_hosts_file /usr/lib/python2.7/site-packages/neutron/agent/linux/dhcp.py:689
0.000 | 0512: 2017-11-08 20:25:54.739 1876 DEBUG neutron.agent.linux.dhcp [req-42ac3418-7957-46e0-a067-f8d2504ef915 413e6e63e2364d448b63d2aa8a4751a9 1aee66fe88ea43ad9892206dfd8ab59e - - -] Done building host file /var/lib/neutron/dhcp/35ab1a0b-e5c9-4f4a-b561-f643af1fffc8/host _output_hosts_file /usr/lib/python2.7/site-packages/neutron/agent/linux/dhcp.py:728
0.000 | 0513: 2017-11-08 20:25:54.741 1876 DEBUG neutron.agent.linux.utils [req-42ac3418-7957-46e0-a067-f8d2504ef915 413e6e63e2364d448b63d2aa8a4751a9 1aee66fe88ea43ad9892206dfd8ab59e - - -] Running command (rootwrap daemon): ['kill', '-HUP', '7464'] execute_rootwrap_daemon /usr/lib/python2.7/site-packages/neutron/agent/linux/utils.py:108
0.357 | 0514: 2017-11-08 20:25:54.743 1876 WARNING oslo_rootwrap.client [req-42ac3418-7957-46e0-a067-f8d2504ef915 413e6e63e2364d448b63d2aa8a4751a9 1aee66fe88ea43ad9892206dfd8ab59e - - -] Leaving behind already spawned process with pid 7401, root should kill it if it's still there (I can't): error: [Errno 2] ENOENT
0.000 | 0515: 2017-11-08 20:25:54.754 1876 DEBUG oslo_rootwrap.client [req-42ac3418-7957-46e0-a067-f8d2504ef915 413e6e63e2364d448b63d2aa8a4751a9 1aee66fe88ea43ad9892206dfd8ab59e - - -] Popen for ['sudo', 'neutron-rootwrap-daemon', '/etc/neutron/rootwrap.conf'] command has been instantiated _initialize /usr/lib/python2.7/site-packages/oslo_rootwrap/client.py:68

logs/undercloud/var/log/neutron/openvswitch-agent.log.txt.gz
0.000 | 1551: 2017-11-08 20:07:24.193 2014 DEBUG oslo_concurrency.lockutils [req-f9e29852-5a2a-47db-bd23-42cb5968ea5f - - - - -] Acquired semaphore "iptables" lock /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:212
0.000 | 1552: 2017-11-08 20:07:24.194 2014 DEBUG oslo_concurrency.lockutils [req-f9e29852-5a2a-47db-bd23-42cb5968ea5f - - - - -] Acquired external semaphore "iptables" lock /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:219
0.000 | 1553: 2017-11-08 20:07:24.194 2014 DEBUG neutron.agent.linux.utils [req-f9e29852-5a2a-47db-bd23-42cb5968ea5f - - - - -] Running command (rootwrap daemon): ['iptables-save'] execute_rootwrap_daemon /usr/lib/python2.7/site-packages/neutron/agent/linux/utils.py:108
0.347 | 1554: 2017-11-08 20:07:24.195 2014 WARNING oslo_rootwrap.client [req-f9e29852-5a2a-47db-bd23-42cb5968ea5f - - - - -] Leaving behind already spawned process with pid 2106, root should kill it if it's still there (I can't): error: [Errno 32] Broken pipe
0.000 | 1555: 2017-11-08 20:07:24.204 2014 DEBUG oslo_rootwrap.client [req-f9e29852-5a2a-47db-bd23-42cb5968ea5f - - - - -] Popen for ['sudo', 'neutron-rootwrap-daemon', '/etc/neutron/rootwrap.conf'] command has been instantiated _initialize /usr/lib/python2.7/site-packages/oslo_rootwrap/client.py:68

logs/subnode-2/var/log/extra/docker/containers/neutron_l3_agent/docker_info.log.txt.gz
0.000 | 0012: KiB Swap: 7999020 total, 7840888 free, 158132 used. 1333420 avail Mem
0.000 | 0013:
0.000 | 0014: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
0.316 | 0015: 206945 root 20 0 112312 17220 4264 R 94.1 0.2 0:00.26 ceilometer-root
0.000 | 0016: 112150 ceilome+ 20 0 4948064 93892 2772 S 11.8 1.1 1:44.88 ceilometer-agen

logs/subnode-2/var/log/extra/docker/containers/neutron_metadata_agent/docker_info.log.txt.gz
0.000 | 0010: KiB Swap: 7999020 total, 7837892 free, 161128 used. 1338396 avail Mem
0.000 | 0011:
0.000 | 0012: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
0.315 | 0013: 208017 root 20 0 140044 22120 4516 S 88.9 0.3 0:00.38 ceilometer-root
0.000 | 0014: 69 root 20 0 0 0 0 S 5.6 0.0 0:13.50 kswapd0

logs/subnode-2/syslog.txt.gz
0.000 | 1354: Nov 08 21:13:47 centos-7-rax-iad-0000787869 kernel: device qr-7544bf4c-7f entered promiscuous mode
0.000 | 1355: Nov 08 21:13:47 centos-7-rax-iad-0000787869 sudo[147923]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool sdr info
0.000 | 1356: Nov 08 21:13:48 centos-7-rax-iad-0000787869 sudo[148034]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.211 | 1357: Nov 08 21:13:49 centos-7-rax-iad-0000787869 sudo[148202]: cinder : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/bin/cinder-rootwrap /etc/cinder/rootwrap.conf env LC_ALL=C qemu-img info /var/lib/cinder/conversion/tmpP_BqpGhostgroup@tripleo_ceph
0.000 | 1358: Nov 08 21:13:49 centos-7-rax-iad-0000787869 sudo[148207]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00

0.000 | 2429: Nov 08 21:27:15 centos-7-rax-iad-0000787869 sudo[213253]: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/sbin/pcs stonith show --full
0.000 | 2430: Nov 08 21:27:16 centos-7-rax-iad-0000787869 sudo[213256]: ceilometer : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf ipmitool raw 0x0a 0x2c 0x00
0.000 | 2431: Nov 08 21:27:16 centos-7-rax-iad-0000787869 sudo[213289]: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/sbin/crm_verify -L -VVVVVV
0.385 | 2432: Nov 08 21:27:16 centos-7-rax-iad-0000787869 sudo[213296]: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/ceph status
0.000 | 2433: Nov 08 21:27:16 centos-7-rax-iad-0000787869 sudo[213324]: zuul : TTY=unknown ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/facter

logs/undercloud/var/log/messages.txt.gz
0.000 | 34253: Nov 8 20:36:39 centos-7-rax-iad-0000787514 object-server: Starting object reconstruction pass.
0.000 | 34254: Nov 8 20:36:39 centos-7-rax-iad-0000787514 object-server: Nothing reconstructed for 0.000438928604126 seconds.
0.000 | 34255: Nov 8 20:36:39 centos-7-rax-iad-0000787514 object-server: Object reconstruction complete. (0.00 minutes)
0.363 | 34256: Nov 8 20:36:41 centos-7-rax-iad-0000787514 sshd[14710]: Connection reset by 103.79.143.32 port 52117 [preauth]
0.000 | 34257: Nov 8 20:36:41 centos-7-rax-iad-0000787514 kernel: iptables dropped: IN=eth0 OUT= MAC=bc:76:4e:20:34:1a:e8:ed:f3:3a:52:41:08:00 SRC=117.58.241.70 DST=104.130.226.81 LEN=48 TOS=0x00 PREC=0x00 TTL=108 ID=21893 DF PROTO=TCP SPT=64604 DPT=445 WINDOW=8192 RES=0x00 SYN URGP=0

0.000 | 34370: Nov 8 20:39:39 centos-7-rax-iad-0000787514 object-server: Nothing reconstructed for 0.000584125518799 seconds.
0.000 | 34371: Nov 8 20:39:39 centos-7-rax-iad-0000787514 object-server: Object reconstruction complete. (0.00 minutes)
0.000 | 34372: Nov 8 20:39:50 centos-7-rax-iad-0000787514 sshd[14783]: Failed password for root from 94.177.218.11 port 37659 ssh2
0.288 | 34373: Nov 8 20:39:50 centos-7-rax-iad-0000787514 sshd[14783]: Received disconnect from 94.177.218.11 port 37659:11: Normal Shutdown, Thank you for playing [preauth]
0.000 | 34374: Nov 8 20:39:50 centos-7-rax-iad-0000787514 sshd[14783]: Disconnected from 94.177.218.11 port 37659 [preauth]

0.000 | 36049: Nov 8 21:04:41 centos-7-rax-iad-0000787514 object-server: Starting object reconstruction pass.
0.000 | 36050: Nov 8 21:04:41 centos-7-rax-iad-0000787514 object-server: Nothing reconstructed for 0.000617980957031 seconds.
0.000 | 36051: Nov 8 21:04:41 centos-7-rax-iad-0000787514 object-server: Object reconstruction complete. (0.00 minutes)
0.297 | 36052: Nov 8 21:04:42 centos-7-rax-iad-0000787514 sshd[17650]: Failed password for jenkins from 94.177.218.11 port 33376 ssh2
0.000 | 36053: Nov 8 21:04:43 centos-7-rax-iad-0000787514 sshd[17650]: Received disconnect from 94.177.218.11 port 33376:11: Normal Shutdown, Thank you for playing [preauth]

0.000 | 37589: Nov 8 21:25:06 centos-7-rax-iad-0000787514 systemd-logind: New session 328 of user zuul.
0.000 | 37590: Nov 8 21:25:06 centos-7-rax-iad-0000787514 systemd: Started Session 328 of user zuul.
0.000 | 37591: Nov 8 21:25:06 centos-7-rax-iad-0000787514 systemd: Starting Session 328 of user zuul.
0.213 | 37592: Nov 8 21:25:06 centos-7-rax-iad-0000787514 ansible-command: Invoked with warn=True executable=None _uses_shell=True _raw_params=tail -10 tempest_output.log; exit 1 removes=None creates=None chdir=None
0.000 | 37593: Nov 8 21:25:06 centos-7-rax-iad-0000787514 sshd[19565]: Received disconnect from 127.0.0.1 port 56374:11: disconnected by user

0.000 | 38247: Nov 8 21:29:23 centos-7-rax-iad-0000787514 ironic-inspector: 2017-11-08 21:29:23.791 5682 DEBUG ironic_inspector.pxe_filter.base [-] The PXE filter driver IptablesFilter, state=initialized enters the fsm_reset_on_error context fsm_reset_on_error /usr/lib/python2.7/site-packages/ironic_inspector/pxe_filter/base.py:137
0.000 | 38248: Nov 8 21:29:23 centos-7-rax-iad-0000787514 ironic-inspector: 2017-11-08 21:29:23.796 5682 DEBUG ironic_inspector.pxe_filter.iptables [-] DHCP is already disabled, not updating _disable_dhcp /usr/lib/python2.7/site-packages/ironic_inspector/pxe_filter/iptables.py:176
0.000 | 38249: Nov 8 21:29:23 centos-7-rax-iad-0000787514 ironic-inspector: 2017-11-08 21:29:23.797 5682 DEBUG ironic_inspector.pxe_filter.base [-] The PXE filter driver IptablesFilter, state=initialized left the fsm_reset_on_error context fsm_reset_on_error /usr/lib/python2.7/site-packages/ironic_inspector/pxe_filter/base.py:153
0.236 | 38250: Nov 8 21:29:28 centos-7-rax-iad-0000787514 sshd[21589]: Failed password for apache from 94.177.218.11 port 47387 ssh2
0.000 | 38251: Nov 8 21:29:28 centos-7-rax-iad-0000787514 sshd[21589]: Received disconnect from 94.177.218.11 port 47387:11: Normal Shutdown, Thank you for playing [preauth]

logs/subnode-2/var/log/extra/docker/containers/neutron_dhcp/docker_info.log.txt.gz
0.000 | 0488: 208234 root 20 0 376736 9976 5244 S 0.0 0.1 0:00.01 docker-current
0.000 | 0489: 208245 root 20 0 339288 1852 1340 S 0.0 0.0 0:00.00 docker-containe
0.000 | 0490: 208269 root 20 0 347484 1836 1276 S 0.0 0.0 0:00.00 docker-containe
0.278 | 0491: 208279 root 20 0 196744 4320 3128 S 0.0 0.1 0:00.00 docker-runc-cur
0.000 | 0492: + docker inspect neutron_dhcp

logs/undercloud/var/log/heat/heat-engine.log.txt.gz
0.000 | 1492: 2017-11-08 20:25:34.600 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::ControllerConfig -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/controller-config.yaml
0.069 | 1493: 2017-11-08 20:25:34.601 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::CinderBackup -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/services/pacemaker/cinder-backup.yaml
0.000 | 1494: 2017-11-08 20:25:34.601 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::Apache -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/services/apache.yaml
0.209 | 1495: 2017-11-08 20:25:34.602 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::CeilometerExpirer -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/services/disabled/ceilometer-expirer-disabled.yaml
0.000 | 1496: 2017-11-08 20:25:34.602 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/common/services.yaml

0.000 | 1587: 2017-11-08 20:25:34.639 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::NeutronApi -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/docker/services/neutron-api.yaml
0.000 | 1588: 2017-11-08 20:25:34.639 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::NeutronL3Agent -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/docker/services/neutron-l3.yaml
0.000 | 1589: 2017-11-08 20:25:34.640 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Controller::Net::SoftwareConfig -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/ci/common/net-config-multinode.yaml
0.209 | 1590: 2017-11-08 20:25:34.640 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::CeilometerApi -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/services/disabled/ceilometer-api-disabled.yaml
0.000 | 1591: 2017-11-08 20:25:34.641 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::SwiftProxy -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/docker/services/swift-proxy.yaml

0.000 | 1619: 2017-11-08 20:25:34.653 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::NovaVncProxy -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/docker/services/nova-vnc-proxy.yaml
0.000 | 1620: 2017-11-08 20:25:34.653 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Controller::NodeUserData -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/firstboot/userdata_default.yaml
0.000 | 1621: 2017-11-08 20:25:34.654 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::CephClient -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/docker/services/ceph-ansible/ceph-client.yaml
0.209 | 1622: 2017-11-08 20:25:34.654 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::CeilometerCollector -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/services/disabled/ceilometer-collector-disabled.yaml
0.074 | 1623: 2017-11-08 20:25:34.654 2405 INFO heat.engine.environment [req-6256e079-833f-44c5-b9e5-3099837d3c30 admin admin - default default] overcloud Registered: [Template](User:True) OS::TripleO::Services::CinderVolume -> http://192.168.24.1:8080/v1/AUTH_1aee66fe88ea43ad9892206dfd8ab59e/overcloud/puppet/services/pacemaker/cinder-volume.yaml

logs/log-size.txt.gz
0.000 | 0188: 124K /home/zuul/workspace/logs/delorean_logs/a7/7d/a77d26728da9220ffdb5b95f75ac5cd6cb56b0b0_dev
0.000 | 0189: 116K /home/zuul/workspace/logs/undercloud/etc/selinux/targeted/active/modules/100/base
0.000 | 0190: 116K /home/zuul/workspace/logs/subnode-2/etc/selinux/targeted/active/modules/100/base
0.223 | 0191: 116K /home/zuul/workspace/logs/docs/build/.doctrees
0.000 | 0192: 116K /home/zuul/workspace/logs/ara_oooq/reports

logs/subnode-2/var/log/pacemaker/bundles/rabbitmq-bundle-0/rabbitmq/rabbit@centos-7-rax-iad-0000787869.log.txt.gz
0.000 | 0544: accepting AMQP connection <0.12701.0> (192.168.24.3:59026 -> 192.168.24.15:5672)
0.000 | 0545:
0.000 | 0546: =INFO REPORT==== 8-Nov-2017::21:08:51 ===
0.219 | 0547: Connection <0.12701.0> (192.168.24.3:59026 -> 192.168.24.15:5672) has a client-provided name: cinder-backup:124342:7dc58edd-5119-41ca-b2d7-444308b63a8c
0.000 | 0548:

logs/devstack-gate-cleanup-host.txt
0.000 | 0806: 2017-11-08 21:32:33.764 | + /home/zuul/workspace/devstack-gate/functions.sh:save_file:L544: [[ -z old/tempest.log ]]
0.000 | 0807: 2017-11-08 21:32:33.767 | + /home/zuul/workspace/devstack-gate/functions.sh:save_file:L551: [[ -f /opt/stack/old/tempest/tempest.log ]]
0.000 | 0808: 2017-11-08 21:32:33.770 | + /home/zuul/workspace/devstack-gate/functions.sh:cleanup_host:L782: '[' -d /var/log/ceph ']'
0.213 | 0809: 2017-11-08 21:32:33.774 | + /home/zuul/workspace/devstack-gate/functions.sh:cleanup_host:L783: sudo cp -r /var/log/ceph /opt/stack/logs/
0.000 | 0810: 2017-11-08 21:32:33.778 | + /home/zuul/workspace/devstack-gate/functions.sh:cleanup_host:L785: save_file /etc/ceph/ceph.conf


Unmatched file in previous success logs