report.html

Report generated on 31-Oct-2020 at 01:14:09 by pytest-html v2.1.1

Environment

389-ds-base 2.0.0.0-20201031gitcdaa81c.fc32
Packages {"pluggy": "0.13.1", "py": "1.9.0", "pytest": "5.4.3"}
Platform Linux-5.7.7-200.fc32.x86_64-x86_64-with-glibc2.2.5
Plugins {"html": "2.1.1", "libfaketime": "0.1.2", "metadata": "1.10.0"}
Python 3.8.6
cyrus-sasl 2.1.27-4.fc32
nspr 4.29.0-1.fc32
nss 3.57.0-1.fc32
openldap 2.4.47-5.fc32

Summary

2058 tests ran in 17860.54 seconds.

1968 passed, 21 skipped, 62 failed, 11 errors, 20 expected failures, 8 unexpected passes

Results

Result Test Duration (s) Links
Error suites/replication/cleanallruv_test.py::test_clean_restart::teardown 129.89
    def fin():
        try:
            # Restart the masters and rerun cleanallruv
            for inst in topology_m4.ms.values():
                inst.restart()

            cruv_task = CleanAllRUVTask(topology_m4.ms["master1"])
            cruv_task.create(properties={
                'replica-id': m4rid,
                'replica-base-dn': DEFAULT_SUFFIX,
                'replica-force-cleaning': 'no',
                })
            cruv_task.wait()
        except ldap.UNWILLING_TO_PERFORM:
            # In some cases we already cleaned rid4, so if we fail, it's okay
            pass
        restore_master4(topology_m4)
        # Make sure everything works.
>       repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:179:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c33baa60>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

    def wait_for_replication(self, from_instance, to_instance, timeout=20):
        """Wait for a replication event to occur from instance to instance. This
        shows that some point of synchronisation has occurred.

        :param from_instance: The instance whose state we want to check from.
        :type from_instance: lib389.DirSrv
        :param to_instance: The instance whose state we want to check matches from.
        :type to_instance: lib389.DirSrv
        :param timeout: Fail after timeout seconds.
        :type timeout: int

        """
        # Touch something then wait_for_replication.
        from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
        to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
        from_group = from_groups.get('replication_managers')
        to_group = to_groups.get('replication_managers')

        change = str(uuid.uuid4())

        from_group.replace('description', change)

        for i in range(0, timeout):
            desc = to_group.get_attr_val_utf8('description')
            if change == desc:
                self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
                return True
            self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
            time.sleep(1)
        self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>       raise Exception("Replication did not sync in time!")
E       Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
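
The probe that fails here is lib389's generic replication check: write a unique value to the 'description' attribute of the replication_managers group on the supplier side, then poll the consumer side once per second until the same value appears. A minimal sketch of driving that probe directly, using only the lib389 calls quoted above (topology_m4 is the suite's four-master topology fixture; the DEFAULT_SUFFIX import path is assumed to be the usual lib389._constants location):

    from lib389.replica import ReplicationManager
    from lib389._constants import DEFAULT_SUFFIX

    repl = ReplicationManager(DEFAULT_SUFFIX)
    # Probe one direction: touch 'description' on master1, then poll master2
    # for up to 20 seconds until the value replicates over.
    repl.wait_for_replication(topology_m4.ms["master1"], topology_m4.ms["master2"], timeout=20)
    # Or probe every ordered pair of masters, as the m4rid fixture does.
    repl.test_replication_topology(topology_m4.ms.values())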
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91517cc5-1389-40b8-88af-eac4acee701e / got description=6d98fd23-b029-4cf7-8af3-09045d403f8e)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect d7dfc440-5cba-400a-baa6-d62d20360992 / got description=91517cc5-1389-40b8-88af-eac4acee701e)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect fc7f5abf-263d-477b-a236-df919dbbbd59 / got description=d7dfc440-5cba-400a-baa6-d62d20360992)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 372422fe-c134-4c3c-8705-175a01634b94 / got description=fc7f5abf-263d-477b-a236-df919dbbbd59)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect cb394ea4-8f02-4f24-ae77-f1b194014754 / got description=372422fe-c134-4c3c-8705-175a01634b94)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 02865025-4885-4426-b358-1e7e0e27a0da / got description=cb394ea4-8f02-4f24-ae77-f1b194014754)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect bcf89d0c-87de-43fa-aa3c-c0ce8cbbcaf6 / got description=02865025-4885-4426-b358-1e7e0e27a0da)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 88c8ceb7-8d27-4975-9e03-d162125bb04a / got description=7122f6e1-0858-4ef6-8914-3a5ec860abf9)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6855ea6f-d6f4-44af-a217-b472a9d43cff / got description=88c8ceb7-8d27-4975-9e03-d162125bb04a)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c76fa1b1-ad36-4f92-af89-07f2e5fcc370 / got description=6855ea6f-d6f4-44af-a217-b472a9d43cff)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 8590b280-2a37-4696-9200-2904fe082e4d / got description=c76fa1b1-ad36-4f92-af89-07f2e5fcc370)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:181 -------------- FINISH RESET of m4 -----------------
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:256 Running test_clean_restart...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:259 test_clean: disable master 4...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:71 test_clean: remove all the agreements to master 4...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:267 test_clean: run the cleanAllRUV task...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:292 test_clean_restart: check all the masters have been cleaned...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:1 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:2 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:3 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:296 test_clean_restart PASSED, restoring master 4...
-----------------------------Captured log teardown------------------------------
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect f81d3861-9742-49d5-bcad-ebbf818829b8 / got description=8590b280-2a37-4696-9200-2904fe082e4d)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 07b8080a-d4cf-4886-a8e5-187a5f947882 / got description=f81d3861-9742-49d5-bcad-ebbf818829b8)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 07b8080a-d4cf-4886-a8e5-187a5f947882 / got description=f81d3861-9742-49d5-bcad-ebbf818829b8)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 07b8080a-d4cf-4886-a8e5-187a5f947882 / got description=f81d3861-9742-49d5-bcad-ebbf818829b8)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:148 Master 4 has been successfully restored.
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a6a03463-12e2-4255-976f-82e8fd123a21 / got description=f81d3861-9742-49d5-bcad-ebbf818829b8)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect ba956b87-611e-41a9-8cb1-15b3dfab2e06 / got description=a6a03463-12e2-4255-976f-82e8fd123a21)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect a57e26e0-909b-4c87-bb18-3713e10b26e9 / got description=ba956b87-611e-41a9-8cb1-15b3dfab2e06)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 37dddf1e-abce-4436-8c75-11fb1fd51402 / got description=a57e26e0-909b-4c87-bb18-3713e10b26e9)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect c1e28090-b576-431f-bb07-4a479870ce82 / got description=37dddf1e-abce-4436-8c75-11fb1fd51402)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 2a250435-96d1-4335-86e4-a5271980d1c7 / got description=c1e28090-b576-431f-bb07-4a479870ce82)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 29f6e719-c58d-4c70-9f52-7931a52c473f / got description=2a250435-96d1-4335-86e4-a5271980d1c7)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0321a86f-6e48-4231-a690-05d5b741bc2b / got description=29f6e719-c58d-4c70-9f52-7931a52c473f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect d2d998b3-180e-43c4-b92d-cb19ea408323 / got description=0321a86f-6e48-4231-a690-05d5b741bc2b)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 18845081-24ff-421f-9611-8627e5119a80 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
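
For reference, the teardown above retries cleanAllRUV through lib389's CleanAllRUVTask wrapper before restoring master 4. A short sketch of that call pattern, lifted from the fin() code in this entry; the comment on replica-force-cleaning reflects cleanAllRUV's documented behaviour (m4rid is the replica ID being retired):

    from lib389.tasks import CleanAllRUVTask

    # Ask master1 to propagate a cleanAllRUV task that removes the retired
    # replica ID from the RUV of every replica in the topology.
    cruv_task = CleanAllRUVTask(topology_m4.ms["master1"])
    cruv_task.create(properties={
        'replica-id': m4rid,                # RID to scrub from the RUVs
        'replica-base-dn': DEFAULT_SUFFIX,  # the replicated suffix
        'replica-force-cleaning': 'no',     # wait for all replicas to be online first
    })
    cruv_task.wait()  # block until the task entry reports completion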
Error suites/replication/cleanallruv_test.py::test_clean_force::setup 34.45
request = <SubRequest 'm4rid' for <Function test_clean_force>>
topology_m4 = <lib389.topologies.TopologyMain object at 0x7f61c3381160>

@pytest.fixture()
def m4rid(request, topology_m4):
    log.debug("Wait a bit before the reset - it is required for the slow machines")
    time.sleep(5)
    log.debug("-------------- BEGIN RESET of m4 -----------------")
    repl = ReplicationManager(DEFAULT_SUFFIX)
>   repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c3443d00>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

    def wait_for_replication(self, from_instance, to_instance, timeout=20):
        """Wait for a replication event to occur from instance to instance. This
        shows that some point of synchronisation has occurred.

        :param from_instance: The instance whose state we want to check from.
        :type from_instance: lib389.DirSrv
        :param to_instance: The instance whose state we want to check matches from.
        :type to_instance: lib389.DirSrv
        :param timeout: Fail after timeout seconds.
        :type timeout: int

        """
        # Touch something then wait_for_replication.
        from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
        to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
        from_group = from_groups.get('replication_managers')
        to_group = to_groups.get('replication_managers')

        change = str(uuid.uuid4())

        from_group.replace('description', change)

        for i in range(0, timeout):
            desc = to_group.get_attr_val_utf8('description')
            if change == desc:
                self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
                return True
            self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
            time.sleep(1)
        self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>       raise Exception("Replication did not sync in time!")
E       Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5c1c52de-2f74-4f07-90c6-68cb76cc0e18 / got description=d2d998b3-180e-43c4-b92d-cb19ea408323)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect e589923e-57e2-4660-ada5-c07e12e003bb / got description=5c1c52de-2f74-4f07-90c6-68cb76cc0e18)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 3c4bc354-579d-437e-8918-376b61ae43a2 / got description=e589923e-57e2-4660-ada5-c07e12e003bb)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ef6df2a4-25ea-4cc0-b4c3-e27e5e11db5a / got description=3c4bc354-579d-437e-8918-376b61ae43a2)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect c99400dc-be92-4ece-96dc-56c743c0856a / got description=ef6df2a4-25ea-4cc0-b4c3-e27e5e11db5a)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 3cbc0740-440e-4303-9c43-cf8646911ae7 / got description=c99400dc-be92-4ece-96dc-56c743c0856a)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0da93b2b-2c2d-49d2-b1fb-cb4705aea145 / got description=3cbc0740-440e-4303-9c43-cf8646911ae7)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3ee23333-aca0-4810-ab72-db2dea40d941 / got description=0da93b2b-2c2d-49d2-b1fb-cb4705aea145)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 3b015067-066f-426a-88cb-96ec70ff30bd / got description=3ee23333-aca0-4810-ab72-db2dea40d941)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d883a309-407a-4218-978e-3218d2c67987 / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
Error suites/replication/cleanallruv_test.py::test_abort::setup 34.37
request = <SubRequest 'm4rid' for <Function test_abort>>
topology_m4 = <lib389.topologies.TopologyMain object at 0x7f61c3381160>

@pytest.fixture()
def m4rid(request, topology_m4):
    log.debug("Wait a bit before the reset - it is required for the slow machines")
    time.sleep(5)
    log.debug("-------------- BEGIN RESET of m4 -----------------")
    repl = ReplicationManager(DEFAULT_SUFFIX)
>   repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c34d82e0>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

    def wait_for_replication(self, from_instance, to_instance, timeout=20):
        """Wait for a replication event to occur from instance to instance. This
        shows that some point of synchronisation has occurred.

        :param from_instance: The instance whose state we want to check from.
        :type from_instance: lib389.DirSrv
        :param to_instance: The instance whose state we want to check matches from.
        :type to_instance: lib389.DirSrv
        :param timeout: Fail after timeout seconds.
        :type timeout: int

        """
        # Touch something then wait_for_replication.
        from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
        to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
        from_group = from_groups.get('replication_managers')
        to_group = to_groups.get('replication_managers')

        change = str(uuid.uuid4())

        from_group.replace('description', change)

        for i in range(0, timeout):
            desc = to_group.get_attr_val_utf8('description')
            if change == desc:
                self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
                return True
            self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
            time.sleep(1)
        self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>       raise Exception("Replication did not sync in time!")
E       Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5f80fa0e-9299-4e6a-9d85-98335ac1311f / got description=3b015067-066f-426a-88cb-96ec70ff30bd)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect f4c73853-72ee-48be-aa01-bc5fa53aaee5 / got description=5f80fa0e-9299-4e6a-9d85-98335ac1311f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 39885e74-87f3-49d0-ad16-ba5fb974334d / got description=f4c73853-72ee-48be-aa01-bc5fa53aaee5)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect dff44408-79db-4580-bd67-4590b651e596 / got description=39885e74-87f3-49d0-ad16-ba5fb974334d)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect de80f882-c2b4-45b8-b246-77057d2b6cb8 / got description=dff44408-79db-4580-bd67-4590b651e596)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect e1524d5e-e071-452f-a1df-3195a8f3f390 / got description=de80f882-c2b4-45b8-b246-77057d2b6cb8)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect fcdc9d0a-7120-4dea-9137-efae043f555e / got description=e1524d5e-e071-452f-a1df-3195a8f3f390)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect cb945856-6417-465a-b8be-6da91ec33b00 / got description=fcdc9d0a-7120-4dea-9137-efae043f555e)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 550c3245-9522-49cb-a236-1da13581c0e7 / got description=cb945856-6417-465a-b8be-6da91ec33b00)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 89951600-d951-45f1-a737-b5c90d22b2ab / got description=550c3245-9522-49cb-a236-1da13581c0e7)
Error suites/replication/cleanallruv_test.py::test_abort_restart::setup 34.39
request = <SubRequest 'm4rid' for <Function test_abort_restart>>
topology_m4 = <lib389.topologies.TopologyMain object at 0x7f61c3381160>

@pytest.fixture()
def m4rid(request, topology_m4):
    log.debug("Wait a bit before the reset - it is required for the slow machines")
    time.sleep(5)
    log.debug("-------------- BEGIN RESET of m4 -----------------")
    repl = ReplicationManager(DEFAULT_SUFFIX)
>   repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c33bc730>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

    def wait_for_replication(self, from_instance, to_instance, timeout=20):
        """Wait for a replication event to occur from instance to instance. This
        shows that some point of synchronisation has occurred.

        :param from_instance: The instance whose state we want to check from.
        :type from_instance: lib389.DirSrv
        :param to_instance: The instance whose state we want to check matches from.
        :type to_instance: lib389.DirSrv
        :param timeout: Fail after timeout seconds.
        :type timeout: int

        """
        # Touch something then wait_for_replication.
        from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
        to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
        from_group = from_groups.get('replication_managers')
        to_group = to_groups.get('replication_managers')

        change = str(uuid.uuid4())

        from_group.replace('description', change)

        for i in range(0, timeout):
            desc = to_group.get_attr_val_utf8('description')
            if change == desc:
                self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
                return True
            self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
            time.sleep(1)
        self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
>       raise Exception("Replication did not sync in time!")
E       Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4bb16770-28d0-423f-9b05-c7a41913757c / got description=550c3245-9522-49cb-a236-1da13581c0e7)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 60558c5b-58e4-48bc-8c50-9c86b2721e65 / got description=4bb16770-28d0-423f-9b05-c7a41913757c)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect cedbcb00-5d65-47fc-9c1c-e0630815c6d6 / got description=60558c5b-58e4-48bc-8c50-9c86b2721e65)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect c982518d-6e28-4d16-9030-ea059a9c907b / got description=cedbcb00-5d65-47fc-9c1c-e0630815c6d6)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect f3af31db-b1ad-4329-958f-5d6811bd98a7 / got description=c982518d-6e28-4d16-9030-ea059a9c907b)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect e9eb867c-e8bf-4e5c-bc37-b08726b70396 / got description=f3af31db-b1ad-4329-958f-5d6811bd98a7)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 91445b86-c443-48cd-a52b-a553c40a2f20 / got description=e9eb867c-e8bf-4e5c-bc37-b08726b70396)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 7b093249-dcb2-4832-b27e-fcdde08eae69 / got description=91445b86-c443-48cd-a52b-a553c40a2f20)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 4047eea7-9963-4679-b8b1-5047b9560a75 / got description=7b093249-dcb2-4832-b27e-fcdde08eae69)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3deea14a-fda7-4ca1-8a96-72bd1e8c8f99 / got description=4047eea7-9963-4679-b8b1-5047b9560a75)
[the Retry line above repeats once per second until the 20 s timeout elapses]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3deea14a-fda7-4ca1-8a96-72bd1e8c8f99 / got description=4047eea7-9963-4679-b8b1-5047b9560a75)
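
The four cleanallruv errors in this report share one signature: the probe UUID written to the replication_managers group's description attribute never propagates from master4 to master1 within the 20 second timeout, so the master4 -> master1 agreement is the broken link after each restart/clean cycle. The probe can be rerun by hand against the surviving topology. A minimal sketch, assuming the suite's topology_m4 fixture object is in scope; the "master4" dictionary key is inferred by analogy with "master1":

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

# Re-probe only the link that timed out (master4 -> master1).
# wait_for_replication() raises the same "did not sync" Exception on timeout.
repl = ReplicationManager(DEFAULT_SUFFIX)
m4 = topology_m4.ms["master4"]  # assumed key, by analogy with "master1"
m1 = topology_m4.ms["master1"]
repl.wait_for_replication(m4, m1, timeout=60)
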
Error suites/replication/cleanallruv_test.py::test_abort_certify::setup 34.42
request = <SubRequest 'm4rid' for <Function test_abort_certify>>
topology_m4 = <lib389.topologies.TopologyMain object at 0x7f61c3381160>

@pytest.fixture()
def m4rid(request, topology_m4):
log.debug("Wait a bit before the reset - it is required for the slow machines")
time.sleep(5)
log.debug("-------------- BEGIN RESET of m4 -----------------")
repl = ReplicationManager(DEFAULT_SUFFIX)
> repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c3319340>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ce796e76-38a4-47be-a943-b9d7ca717e5d / got description=4047eea7-9963-4679-b8b1-5047b9560a75)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 690eafd8-e83e-43aa-af6c-4369efe32c5d / got description=ce796e76-38a4-47be-a943-b9d7ca717e5d)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect edae28a8-5561-4725-91a7-2b7b6f54e645 / got description=690eafd8-e83e-43aa-af6c-4369efe32c5d)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3d1d02a1-759e-4265-b526-7dbefcec2ebe / got description=edae28a8-5561-4725-91a7-2b7b6f54e645)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect f64c9a84-9a92-4976-9ff2-9dbef6d9267c / got description=3d1d02a1-759e-4265-b526-7dbefcec2ebe)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 86223196-8c74-4a47-af5d-8818b539addf / got description=f64c9a84-9a92-4976-9ff2-9dbef6d9267c)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2df1be45-b319-48bf-bf6a-f5ad17d8de94 / got description=86223196-8c74-4a47-af5d-8818b539addf)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9df1534a-7309-4dd7-a60e-5dcb88b09ba9 / got description=2df1be45-b319-48bf-bf6a-f5ad17d8de94)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 4c162ff2-1aab-4ab8-bb75-34b7b1ab0003 / got description=9df1534a-7309-4dd7-a60e-5dcb88b09ba9)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8296b0bf-cace-4a37-b9b9-d83432a17e45 / got description=4c162ff2-1aab-4ab8-bb75-34b7b1ab0003)
[the Retry line above repeats once per second until the 20 s timeout elapses]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8296b0bf-cace-4a37-b9b9-d83432a17e45 / got description=4c162ff2-1aab-4ab8-bb75-34b7b1ab0003)
Error suites/replication/cleanallruv_test.py::test_stress_clean::setup 34.46
request = <SubRequest 'm4rid' for <Function test_stress_clean>>
topology_m4 = <lib389.topologies.TopologyMain object at 0x7f61c3381160>

@pytest.fixture()
def m4rid(request, topology_m4):
log.debug("Wait a bit before the reset - it is required for the slow machines")
time.sleep(5)
log.debug("-------------- BEGIN RESET of m4 -----------------")
repl = ReplicationManager(DEFAULT_SUFFIX)
> repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c35eb580>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d94acbab-535a-48df-93a9-f31a01d94965 / got description=4c162ff2-1aab-4ab8-bb75-34b7b1ab0003)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect b8e8e19c-8c69-44fd-a3a4-d662de6dd618 / got description=d94acbab-535a-48df-93a9-f31a01d94965)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 2f75686c-d982-473b-ab7a-dab89d8ebfa6 / got description=b8e8e19c-8c69-44fd-a3a4-d662de6dd618)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ac6912c8-d1d4-4304-9f1e-c6ec3badd164 / got description=2f75686c-d982-473b-ab7a-dab89d8ebfa6)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 91514488-ca17-4126-b21a-801e136eb886 / got description=ac6912c8-d1d4-4304-9f1e-c6ec3badd164)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 8bd8a5f9-7596-46f1-8b6d-ecf7a31f936f / got description=91514488-ca17-4126-b21a-801e136eb886)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b8bde283-b8e7-41c5-acd3-af546e2ded9b / got description=8bd8a5f9-7596-46f1-8b6d-ecf7a31f936f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect da499a95-2e30-4ba3-94e4-876027ebc477 / got description=b8bde283-b8e7-41c5-acd3-af546e2ded9b)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect b4d9435d-f820-4d8a-b6aa-53061a36a15e / got description=da499a95-2e30-4ba3-94e4-876027ebc477)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 714d6400-cec3-40df-b93c-c426f376fc37 / got description=b4d9435d-f820-4d8a-b6aa-53061a36a15e)
[the Retry line above repeats once per second until the 20 s timeout elapses]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 714d6400-cec3-40df-b93c-c426f376fc37 / got description=b4d9435d-f820-4d8a-b6aa-53061a36a15e)
Error suites/replication/cleanallruv_test.py::test_multiple_tasks_with_force::setup 34.37
request = <SubRequest 'm4rid' for <Function test_multiple_tasks_with_force>>
topology_m4 = <lib389.topologies.TopologyMain object at 0x7f61c3381160>

@pytest.fixture()
def m4rid(request, topology_m4):
log.debug("Wait a bit before the reset - it is required for the slow machines")
time.sleep(5)
log.debug("-------------- BEGIN RESET of m4 -----------------")
repl = ReplicationManager(DEFAULT_SUFFIX)
> repl.test_replication_topology(topology_m4.ms.values())

suites/replication/cleanallruv_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2531: in test_replication_topology
self.test_replication(a, b, timeout)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:2517: in test_replication
self.wait_for_replication(from_instance, to_instance, timeout)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.replica.ReplicationManager object at 0x7f61c3627400>
from_instance = <lib389.DirSrv object at 0x7f61c33ec940>
to_instance = <lib389.DirSrv object at 0x7f61c3395640>, timeout = 20

def wait_for_replication(self, from_instance, to_instance, timeout=20):
"""Wait for a replication event to occur from instance to instance. This
shows some point of synchronisation has occured.

:param from_instance: The instance whos state we we want to check from
:type from_instance: lib389.DirSrv
:param to_instance: The instance whos state we want to check matches from.
:type to_instance: lib389.DirSrv
:param timeout: Fail after timeout seconds.
:type timeout: int

"""
# Touch something then wait_for_replication.
from_groups = Groups(from_instance, basedn=self._suffix, rdn=None)
to_groups = Groups(to_instance, basedn=self._suffix, rdn=None)
from_group = from_groups.get('replication_managers')
to_group = to_groups.get('replication_managers')

change = str(uuid.uuid4())

from_group.replace('description', change)

for i in range(0, timeout):
desc = to_group.get_attr_val_utf8('description')
if change == desc:
self._log.info("SUCCESS: Replication from %s to %s is working" % (from_instance.ldapuri, to_instance.ldapuri))
return True
self._log.info("Retry: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
time.sleep(1)
self._log.info("FAIL: Replication from %s to %s is NOT working (expect %s / got description=%s)" % (from_instance.ldapuri, to_instance.ldapuri, change, desc))
> raise Exception("Replication did not sync in time!")
E Exception: Replication did not sync in time!

/usr/local/lib/python3.8/site-packages/lib389/replica.py:2501: Exception
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 -----------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect faa45c72-ab46-469f-80e2-fb2b6187ab83 / got description=b4d9435d-f820-4d8a-b6aa-53061a36a15e)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 9b5f4d8b-6723-4c22-9811-093dbce1f205 / got description=faa45c72-ab46-469f-80e2-fb2b6187ab83)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 018fdced-bb9b-4200-834a-1c76b9842b5f / got description=9b5f4d8b-6723-4c22-9811-093dbce1f205)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 649ade7a-8a10-47c0-b47b-6e9dee3b4efd / got description=018fdced-bb9b-4200-834a-1c76b9842b5f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect ea4327c6-fba5-4c60-b906-fef593764077 / got description=649ade7a-8a10-47c0-b47b-6e9dee3b4efd)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect f2dcc85f-4c78-445a-abc3-4e0687f79202 / got description=ea4327c6-fba5-4c60-b906-fef593764077)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d94df254-0546-4b5c-a11d-2d1a9eb9bb89 / got description=f2dcc85f-4c78-445a-abc3-4e0687f79202)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 44933e07-5fe4-4b1a-9db2-ac1a1d5db039 / got description=d94df254-0546-4b5c-a11d-2d1a9eb9bb89)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 1e079897-22f0-4c7f-b98a-6f4531ebdbcd / got description=44933e07-5fe4-4b1a-9db2-ac1a1d5db039)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 723be627-c308-4510-b12b-1ecc56d26d60 / got description=1e079897-22f0-4c7f-b98a-6f4531ebdbcd)
[the Retry line above repeats once per second until the 20 s timeout elapses]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 723be627-c308-4510-b12b-1ecc56d26d60 / got description=1e079897-22f0-4c7f-b98a-6f4531ebdbcd)
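
Reading the four setup logs together: each run's first "got description=" value equals the previous run's last "expect" value (4047eea7... carries from the first log into the second, 4c162ff2... into the third, b4d9435d... into the fourth), which confirms the probe writes do replicate across the healthy links and only the master4 -> master1 direction is stuck. The probe attribute can also be read directly on both ends of that link; a short sketch, again assuming the suite's topology_m4 object and the inferred "master4" key:

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.group import Groups

# Print the last probe UUID seen on each end of the broken link. If master4's
# value never appears on master1, its outbound agreement is not progressing.
for name in ("master1", "master4"):
    inst = topology_m4.ms[name]
    grp = Groups(inst, basedn=DEFAULT_SUFFIX, rdn=None).get('replication_managers')
    print(name, grp.get_attr_val_utf8('description'))
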
Error tickets/ticket48973_test.py::test_ticket48973_init::setup 1.39
request = <SubRequest 'topology' for <Function test_ticket48973_init>>

@pytest.fixture(scope="module")
def topology(request):
# Creating standalone instance ...
standalone = DirSrv(verbose=False)
args_instance[SER_HOST] = HOST_STANDALONE
args_instance[SER_PORT] = PORT_STANDALONE
args_instance[SER_SERVERID_PROP] = SERVERID_STANDALONE
args_instance[SER_CREATION_SUFFIX] = DEFAULT_SUFFIX
args_standalone = args_instance.copy()
standalone.allocate(args_standalone)
instance_standalone = standalone.exists()
if instance_standalone:
standalone.delete()
> standalone.create()

/export/tests/tickets/ticket48973_test.py:52:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:838: in create
self._createDirsrv(version)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:808: in _createDirsrv
sds.create_from_args(general, slapd, backends, None)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:663: in create_from_args
self._prepare_ds(general, slapd, backends)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:594: in _prepare_ds
assert_c(slapd['root_dn'] is not None, "Configuration root_dn in section [slapd] not found")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

condition = False, msg = 'Configuration root_dn in section [slapd] not found'

def assert_c(condition, msg="Assertion Failed"):
"""This is the same as assert, but assert is compiled out
when optimisation is enabled. This prevents compiling out.
"""
if not condition:
> raise AssertionError(msg)
E AssertionError: Configuration root_dn in section [slapd] not found

/usr/local/lib/python3.8/site-packages/lib389/utils.py:1243: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
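
All four ticket48973 errors are a single failure: the module-scoped topology fixture dies in _prepare_ds() because the legacy args_instance dictionary handed to DirSrv.create() never defines a root DN, so the generated [slapd] section has no root_dn and assert_c() aborts the install (the lone "Starting installation..." line above is as far as it gets). A plausible fix, sketched under the assumption that the legacy dictionary path is kept rather than porting the test: supply the root DN and password properties before allocate(). SER_ROOT_DN, SER_ROOT_PW, DN_DM and PW_DM are standard lib389 names, though the exact import paths below are assumed and the test module's star imports may already provide them.

from lib389._constants import DN_DM, PW_DM
from lib389.properties import SER_ROOT_DN, SER_ROOT_PW

args_instance[SER_ROOT_DN] = DN_DM  # "cn=Directory Manager"
args_instance[SER_ROOT_PW] = PW_DM
args_standalone = args_instance.copy()
standalone.allocate(args_standalone)
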
Error tickets/ticket48973_test.py::test_ticket48973_ces_not_indexed::setup 0.00
request = <SubRequest 'topology' for <Function test_ticket48973_init>>

@pytest.fixture(scope="module")
def topology(request):
# Creating standalone instance ...
standalone = DirSrv(verbose=False)
args_instance[SER_HOST] = HOST_STANDALONE
args_instance[SER_PORT] = PORT_STANDALONE
args_instance[SER_SERVERID_PROP] = SERVERID_STANDALONE
args_instance[SER_CREATION_SUFFIX] = DEFAULT_SUFFIX
args_standalone = args_instance.copy()
standalone.allocate(args_standalone)
instance_standalone = standalone.exists()
if instance_standalone:
standalone.delete()
> standalone.create()

/export/tests/tickets/ticket48973_test.py:52:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:838: in create
self._createDirsrv(version)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:808: in _createDirsrv
sds.create_from_args(general, slapd, backends, None)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:663: in create_from_args
self._prepare_ds(general, slapd, backends)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:594: in _prepare_ds
assert_c(slapd['root_dn'] is not None, "Configuration root_dn in section [slapd] not found")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

condition = False, msg = 'Configuration root_dn in section [slapd] not found'

def assert_c(condition, msg="Assertion Failed"):
"""This is the same as assert, but assert is compiled out
when optimisation is enabled. This prevents compiling out.
"""
if not condition:
> raise AssertionError(msg)
E AssertionError: Configuration root_dn in section [slapd] not found

/usr/local/lib/python3.8/site-packages/lib389/utils.py:1243: AssertionError
Error tickets/ticket48973_test.py::test_ticket48973_homeDirectory_indexing::setup 0.00
request = <SubRequest 'topology' for <Function test_ticket48973_init>>

@pytest.fixture(scope="module")
def topology(request):
# Creating standalone instance ...
standalone = DirSrv(verbose=False)
args_instance[SER_HOST] = HOST_STANDALONE
args_instance[SER_PORT] = PORT_STANDALONE
args_instance[SER_SERVERID_PROP] = SERVERID_STANDALONE
args_instance[SER_CREATION_SUFFIX] = DEFAULT_SUFFIX
args_standalone = args_instance.copy()
standalone.allocate(args_standalone)
instance_standalone = standalone.exists()
if instance_standalone:
standalone.delete()
> standalone.create()

/export/tests/tickets/ticket48973_test.py:52:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:838: in create
self._createDirsrv(version)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:808: in _createDirsrv
sds.create_from_args(general, slapd, backends, None)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:663: in create_from_args
self._prepare_ds(general, slapd, backends)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:594: in _prepare_ds
assert_c(slapd['root_dn'] is not None, "Configuration root_dn in section [slapd] not found")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

condition = False, msg = 'Configuration root_dn in section [slapd] not found'

def assert_c(condition, msg="Assertion Failed"):
"""This is the same as assert, but assert is compiled out
when optimisation is enabled. This prevents compiling out.
"""
if not condition:
> raise AssertionError(msg)
E AssertionError: Configuration root_dn in section [slapd] not found

/usr/local/lib/python3.8/site-packages/lib389/utils.py:1243: AssertionError
Error tickets/ticket48973_test.py::test_ticket48973_homeDirectory_caseExactIA5Match_caseIgnoreIA5Match_indexing::setup 0.00
request = <SubRequest 'topology' for <Function test_ticket48973_init>>

@pytest.fixture(scope="module")
def topology(request):
# Creating standalone instance ...
standalone = DirSrv(verbose=False)
args_instance[SER_HOST] = HOST_STANDALONE
args_instance[SER_PORT] = PORT_STANDALONE
args_instance[SER_SERVERID_PROP] = SERVERID_STANDALONE
args_instance[SER_CREATION_SUFFIX] = DEFAULT_SUFFIX
args_standalone = args_instance.copy()
standalone.allocate(args_standalone)
instance_standalone = standalone.exists()
if instance_standalone:
standalone.delete()
> standalone.create()

/export/tests/tickets/ticket48973_test.py:52:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:838: in create
self._createDirsrv(version)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:808: in _createDirsrv
sds.create_from_args(general, slapd, backends, None)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:663: in create_from_args
self._prepare_ds(general, slapd, backends)
/usr/local/lib/python3.8/site-packages/lib389/instance/setup.py:594: in _prepare_ds
assert_c(slapd['root_dn'] is not None, "Configuration root_dn in section [slapd] not found")
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

condition = False, msg = 'Configuration root_dn in section [slapd] not found'

def assert_c(condition, msg="Assertion Failed"):
"""This is the same as assert, but assert is compiled out
when optimisation is enabled. This prevents compiling out.
"""
if not condition:
> raise AssertionError(msg)
E AssertionError: Configuration root_dn in section [slapd] not found

/usr/local/lib/python3.8/site-packages/lib389/utils.py:1243: AssertionError
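
Editorial note: all three ticket48973 errors above share a single cause: the legacy args_instance dict in the topology fixture never sets a root DN, so _prepare_ds() finds slapd['root_dn'] is None and assert_c() fires. A minimal sketch of the missing assignments, assuming the SER_ROOT_DN and SER_ROOT_PW keys that lib389 defines alongside the SER_HOST and SER_PORT keys the fixture already uses (DN_DM and PASSWORD are lib389 constants; the exact values used here are an assumption):

from lib389._constants import SER_ROOT_DN, SER_ROOT_PW, DN_DM, PASSWORD

# Give the installer a root DN so slapd['root_dn'] is populated.
args_instance[SER_ROOT_DN] = DN_DM        # typically 'cn=Directory Manager'
args_instance[SER_ROOT_PW] = PASSWORD     # test-only password constant
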
Failed suites/acl/keywords_part2_test.py::test_access_from_certain_network_only_ip 3.80
topo = <lib389.topologies.TopologyMain object at 0x7f61d5300b50>
add_user = None, aci_of_user = None

def test_access_from_certain_network_only_ip(topo, add_user, aci_of_user):
"""
User can access the data only when connecting from a certain network, as per the ACI.

:id: 4ec38296-7ac5-11e8-9816-8c16451d917b
:setup: Standalone Server
:steps:
1. Add test entry
2. Add ACI
3. User should follow ACI role
:expectedresults:
1. Entry should be added
2. Operation should succeed
3. Operation should succeed
"""
# Turn off access log buffering to make the test less time-consuming
topo.standalone.config.set('nsslapd-accesslog-logbuffering', 'off')

# Find the IP in the DS access logs, as we need to know the exact IP DS uses to run the instances.
# Wait till the access log is generated
topo.standalone.restart()

# Add ACI
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
domain.add("aci", f'(target = "ldap:///{IP_OU_KEY}")(targetattr=\"*\")(version 3.0; aci "IP aci"; '
f'allow(all)userdn = "ldap:///{NETSCAPEIP_KEY}" and ip = "::1" ;)')

# create a new connection for the test
conn = UserAccount(topo.standalone, NETSCAPEIP_KEY).bind(PW_DM)
# Perform Operation
org = OrganizationalUnit(conn, IP_OU_KEY)
> org.replace("seeAlso", "cn=1")

suites/acl/keywords_part2_test.py:76:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:280: in replace
self.set(key, value, action=ldap.MOD_REPLACE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61d6cb30a0>
func = <built-in method result4 of LDAP object at 0x7f61d537ec00>
args = (3, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 3, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'seeAlso' attribute of entry 'ou=ip,ou=keywords,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
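
Editorial note: the INSUFFICIENT_ACCESS above means the bind never satisfied the ACI's ip = "::1" clause, i.e. the client connection did not arrive over IPv6 loopback. A minimal sketch of pinning the client to the address the ACI expects, using plain python-ldap (the port and user DN are illustrative placeholders, not values from the suite):

import ldap

# Connect explicitly over ::1 so an ACI restricted to ip = "::1" can match.
conn = ldap.initialize('ldap://[::1]:38901')
conn.simple_bind_s('uid=test_user_1,ou=Keywords,dc=example,dc=com', 'password')
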
Failed suites/acl/keywords_part2_test.py::test_connectin_from_an_unauthorized_network 0.09
topo = <lib389.topologies.TopologyMain object at 0x7f61d5300b50>
add_user = None, aci_of_user = None

def test_connectin_from_an_unauthorized_network(topo, add_user, aci_of_user):
"""
User cannot access the data when connecting from an unauthorized network, as per the ACI.

:id: 52d1ecce-7ac5-11e8-9ad9-8c16451d917b
:setup: Standalone Server
:steps:
1. Add test entry
2. Add ACI
3. User should follow ACI role
:expectedresults:
1. Entry should be added
2. Operation should succeed
3. Operation should succeed
"""

# Add ACI
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
domain.add("aci", f'(target = "ldap:///{IP_OU_KEY}")'
f'(targetattr="*")(version 3.0; aci "IP aci"; '
f'allow(all) userdn = "ldap:///{NETSCAPEIP_KEY}" '
f'and ip != "::1" ;)')

# create a new connection for the test
conn = UserAccount(topo.standalone, NETSCAPEIP_KEY).bind(PW_DM)
# Perform Operation
org = OrganizationalUnit(conn, IP_OU_KEY)
with pytest.raises(ldap.INSUFFICIENT_ACCESS):
> org.replace("seeAlso", "cn=1")
E Failed: DID NOT RAISE <class 'ldap.INSUFFICIENT_ACCESS'>

suites/acl/keywords_part2_test.py:119: Failed
Failed suites/clu/repl_monitor_test.py::test_dsconf_replication_monitor 0.30
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61d3cf95b0>
set_log_file = None

@pytest.mark.ds50545
@pytest.mark.bz1739718
@pytest.mark.skipif(ds_is_older("1.4.0"), reason="Not implemented")
def test_dsconf_replication_monitor(topology_m2, set_log_file):
"""Test replication monitor that was ported from legacy tools

:id: ce48020d-7c30-41b7-8f68-144c9cd757f6
:setup: 2 MM topology
:steps:
1. Create DS instance
2. Run replication monitor with connections option
3. Run replication monitor with aliases option
4. Run replication monitor with --json option
5. Run replication monitor with .dsrc file created
:expectedresults:
1. Success
2. Success
3. Success
4. Success
5. Success
"""

m1 = topology_m2.ms["master1"]
m2 = topology_m2.ms["master2"]

alias_content = ['Supplier: M1 (' + m1.host + ':' + str(m1.port) + ')',
'Supplier: M2 (' + m2.host + ':' + str(m2.port) + ')']

connection_content = 'Supplier: '+ m1.host + ':' + str(m1.port)
content_list = ['Replica Root: dc=example,dc=com',
'Replica ID: 1',
'Replica Status: Available',
'Max CSN',
'Status For Agreement: "002" ('+ m2.host + ':' + str(m2.port) + ')',
'Replica Enabled: on',
'Update In Progress: FALSE',
'Last Update Start:',
'Last Update End:',
'Number Of Changes Sent:',
'Number Of Changes Skipped: None',
'Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded',
'Last Init Start:',
'Last Init End:',
'Last Init Status:',
'Reap Active: 0',
'Replication Status: In Synchronization',
'Replication Lag Time:',
'Supplier: ',
m2.host + ':' + str(m2.port),
'Replica Root: dc=example,dc=com',
'Replica ID: 2',
'Status For Agreement: "001" (' + m1.host + ':' + str(m1.port)+')']

json_list = ['type',
'list',
'items',
'name',
m1.host + ':' + str(m1.port),
'data',
'"replica_id": "1"',
'"replica_root": "dc=example,dc=com"',
'"replica_status": "Available"',
'maxcsn',
'agmts_status',
'agmt-name',
'002',
'replica',
m2.host + ':' + str(m2.port),
'replica-enabled',
'update-in-progress',
'last-update-start',
'last-update-end',
'number-changes-sent',
'number-changes-skipped',
'last-update-status',
'Error (0) Replica acquired successfully: Incremental update succeeded',
'last-init-start',
'last-init-end',
'last-init-status',
'reap-active',
'replication-status',
'In Synchronization',
'replication-lag-time',
'"replica_id": "2"',
'001',
m1.host + ':' + str(m1.port)]

dsrc_content = '[repl-monitor-connections]\n' \
'connection1 = ' + m1.host + ':' + str(m1.port) + ':' + DN_DM + ':' + PW_DM + '\n' \
'connection2 = ' + m2.host + ':' + str(m2.port) + ':' + DN_DM + ':' + PW_DM + '\n' \
'\n' \
'[repl-monitor-aliases]\n' \
'M1 = ' + m1.host + ':' + str(m1.port) + '\n' \
'M2 = ' + m2.host + ':' + str(m2.port)

connections = [m1.host + ':' + str(m1.port) + ':' + DN_DM + ':' + PW_DM,
m2.host + ':' + str(m2.port) + ':' + DN_DM + ':' + PW_DM]

aliases = ['M1=' + m1.host + ':' + str(m1.port),
'M2=' + m2.host + ':' + str(m2.port)]

args = FakeArgs()
args.connections = connections
args.aliases = None
args.json = False

log.info('Run replication monitor with connections option')
get_repl_monitor_info(m1, DEFAULT_SUFFIX, log, args)
check_value_in_log_and_reset(content_list, connection_content)

log.info('Run replication monitor with aliases option')
args.aliases = aliases
get_repl_monitor_info(m1, DEFAULT_SUFFIX, log, args)
> check_value_in_log_and_reset(content_list, alias_content)

suites/clu/repl_monitor_test.py:177:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

content_list = ['Replica Root: dc=example,dc=com', 'Replica ID: 1', 'Replica Status: Available', 'Max CSN', 'Status For Agreement: "002" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002)', 'Replica Enabled: on', ...]
second_list = ['Supplier: M1 (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001)', 'Supplier: M2 (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002)']
single_value = None

def check_value_in_log_and_reset(content_list, second_list=None, single_value=None):
with open(LOG_FILE, 'r+') as f:
file_content = f.read()

for item in content_list:
log.info('Check that "{}" is present'.format(item))
assert item in file_content

if second_list is not None:
log.info('Check for "{}"'.format(second_list))
for item in second_list:
> assert item in file_content
E AssertionError: assert 'Supplier: M1 (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001)' in 'Run replication monitor with aliases option\ndsrc path: /root/.dsrc\ndsrc container path: /data/config/container.inf\...t Init Status: unavailable\nReap Active: 0\nReplication Status: In Synchronization\nReplication Lag Time: 00:00:00\n\n'

suites/clu/repl_monitor_test.py:54: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect dc405130-f6e2-4b75-8f11-97072db44a96 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 4e92e0ea-263e-4cbf-b04d-8b6f87bb8f04 / got description=dc405130-f6e2-4b75-8f11-97072db44a96)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:170 Run replication monitor with connections option DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:76 dsrc path: /root/.dsrc DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:77 dsrc container path: /data/config/container.inf DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:85 dsrc instances: [] DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:210 dsrc completed with {'connections': None, 'aliases': None} INFO  tests.suites.clu.repl_monitor_test:replication.py:438 Supplier: localhost.localdomain:39001 INFO  tests.suites.clu.repl_monitor_test:replication.py:443 ------------------------------------- INFO  tests.suites.clu.repl_monitor_test:replication.py:455 Replica Root: dc=example,dc=com INFO  tests.suites.clu.repl_monitor_test:replication.py:456 Replica ID: 1 INFO  tests.suites.clu.repl_monitor_test:replication.py:457 Replica Status: Available INFO  tests.suites.clu.repl_monitor_test:replication.py:458 Max CSN: 5f9cb294000000010000 INFO  tests.suites.clu.repl_monitor_test:replication.py:461 Status For Agreement: "002" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002) Replica Enabled: on Update In Progress: FALSE Last Update Start: 20201031004053Z Last Update End: 20201031004053Z Number Of Changes Sent: 1:2/0 Number Of Changes Skipped: None Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded Last Init Start: 19700101000000Z Last Init End: 19700101000000Z Last Init Status: unavailable Reap Active: 0 Replication Status: In Synchronization Replication Lag Time: 00:00:00 INFO  tests.suites.clu.repl_monitor_test:replication.py:438 Supplier: ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  tests.suites.clu.repl_monitor_test:replication.py:443 ----------------------------------------------------------------- INFO  tests.suites.clu.repl_monitor_test:replication.py:455 Replica Root: dc=example,dc=com INFO  tests.suites.clu.repl_monitor_test:replication.py:456 Replica ID: 2 INFO  tests.suites.clu.repl_monitor_test:replication.py:457 Replica Status: Available INFO  tests.suites.clu.repl_monitor_test:replication.py:458 Max CSN: 5f9cb295000000020000 INFO  tests.suites.clu.repl_monitor_test:replication.py:461 Status For Agreement: "001" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001) Replica Enabled: on Update In Progress: FALSE Last Update Start: 20201031004053Z Last Update End: 20201031004053Z Number Of Changes Sent: 2:1/0 Number Of Changes Skipped: None Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded Last Init Start: 19700101000000Z Last Init End: 19700101000000Z Last Init Status: unavailable Reap Active: 0 Replication Status: In Synchronization Replication Lag Time: 00:00:00 INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Root: dc=example,dc=com" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica ID: 1" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Status: Available" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Max CSN" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Status For Agreement: "002" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002)" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Enabled: on" is present INFO  
tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Update In Progress: FALSE" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Update Start:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Update End:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Number Of Changes Sent:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Number Of Changes Skipped: None" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Init Start:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Init End:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Init Status:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Reap Active: 0" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replication Status: In Synchronization" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replication Lag Time:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Supplier: " is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Root: dc=example,dc=com" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica ID: 2" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Status For Agreement: "001" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001)" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:52 Check for "Supplier: ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001" INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:60 Reset log file INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:174 Run replication monitor with aliases option DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:76 dsrc path: /root/.dsrc DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:77 dsrc container path: /data/config/container.inf DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:85 dsrc instances: [] DEBUG  tests.suites.clu.repl_monitor_test:dsrc.py:210 dsrc completed with {'connections': None, 'aliases': None} INFO  tests.suites.clu.repl_monitor_test:replication.py:438 Supplier: localhost.localdomain:39001 INFO  tests.suites.clu.repl_monitor_test:replication.py:443 ------------------------------------- INFO  tests.suites.clu.repl_monitor_test:replication.py:455 Replica Root: dc=example,dc=com INFO  tests.suites.clu.repl_monitor_test:replication.py:456 Replica ID: 1 INFO  tests.suites.clu.repl_monitor_test:replication.py:457 Replica Status: Available INFO  tests.suites.clu.repl_monitor_test:replication.py:458 Max CSN: 5f9cb294000000010000 INFO  tests.suites.clu.repl_monitor_test:replication.py:461 Status For Agreement: "002" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002) Replica Enabled: on Update In Progress: FALSE Last Update Start: 20201031004053Z Last Update End: 20201031004053Z Number Of Changes 
Sent: 1:2/0 Number Of Changes Skipped: None Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded Last Init Start: 19700101000000Z Last Init End: 19700101000000Z Last Init Status: unavailable Reap Active: 0 Replication Status: In Synchronization Replication Lag Time: 00:00:00 INFO  tests.suites.clu.repl_monitor_test:replication.py:438 Supplier: M2 (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002) INFO  tests.suites.clu.repl_monitor_test:replication.py:443 ---------------------------------------------------------------------- INFO  tests.suites.clu.repl_monitor_test:replication.py:455 Replica Root: dc=example,dc=com INFO  tests.suites.clu.repl_monitor_test:replication.py:456 Replica ID: 2 INFO  tests.suites.clu.repl_monitor_test:replication.py:457 Replica Status: Available INFO  tests.suites.clu.repl_monitor_test:replication.py:458 Max CSN: 5f9cb295000000020000 INFO  tests.suites.clu.repl_monitor_test:replication.py:461 Status For Agreement: "001" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001) Replica Enabled: on Update In Progress: FALSE Last Update Start: 20201031004053Z Last Update End: 20201031004053Z Number Of Changes Sent: 2:1/0 Number Of Changes Skipped: None Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded Last Init Start: 19700101000000Z Last Init End: 19700101000000Z Last Init Status: unavailable Reap Active: 0 Replication Status: In Synchronization Replication Lag Time: 00:00:00 INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Root: dc=example,dc=com" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica ID: 1" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Status: Available" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Max CSN" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Status For Agreement: "002" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002)" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Enabled: on" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Update In Progress: FALSE" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Update Start:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Update End:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Number Of Changes Sent:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Number Of Changes Skipped: None" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Update Status: Error (0) Replica acquired successfully: Incremental update succeeded" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Init Start:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Init End:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Last Init Status:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Reap Active: 0" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replication Status: In Synchronization" is present INFO  
tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replication Lag Time:" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Supplier: " is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica Root: dc=example,dc=com" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Replica ID: 2" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:48 Check that "Status For Agreement: "001" (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001)" is present INFO  tests.suites.clu.repl_monitor_test:repl_monitor_test.py:52 Check for "['Supplier: M1 (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001)', 'Supplier: M2 (ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002)']"
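
Editorial note: the captured log shows the first supplier reported as "Supplier: localhost.localdomain:39001" while the alias map is keyed by the CI host's FQDN, so the expected "Supplier: M1 (host:port)" line never reaches the log file. A minimal sketch of normalising the reported host before the alias lookup, assuming only the standard socket module (the aliases dict shape is illustrative, not the tool's internal structure):

import socket

def resolve_supplier_label(host, port, aliases):
    """Try the reported host and its canonical FQDN against the alias map."""
    for candidate in (host, socket.getfqdn(host)):
        key = '%s:%s' % (candidate, port)
        if key in aliases:
            return 'Supplier: %s (%s)' % (aliases[key], key)
    return 'Supplier: %s:%s' % (host, port)
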
Failed suites/gssapi/simple_gssapi_test.py::test_gssapi_bind 0.28
topology_st_gssapi = <lib389.topologies.TopologyMain object at 0x7f61d07a45e0>
testuser = <lib389.idm.user.UserAccount object at 0x7f61d07a4340>

@gssapi_ack
def test_gssapi_bind(topology_st_gssapi, testuser):
"""Test that we can bind with GSSAPI

:id: 894a4c27-3d4c-4ba3-aa33-2910032e3783

:setup: standalone gssapi instance

:steps:
1. Bind with sasl/gssapi
:expectedresults:
1. Bind succeeds

"""
> conn = testuser.bind_gssapi()

suites/gssapi/simple_gssapi_test.py:53:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/idm/account.py:258: in bind_gssapi
inst_clone.open(saslmethod='gssapi')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:995: in open
self.sasl_interactive_bind_s("", sasl_auth, escapehatch='i am sure')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:476: in sasl_interactive_bind_s
return self._ldap_call(self._l.sasl_interactive_bind_s,who,auth,RequestControlTuples(serverctrls),RequestControlTuples(clientctrls),sasl_flags)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61d0a76c10>
func = <built-in method sasl_interactive_bind_s of LDAP object at 0x7f61c3f4bdb0>
args = ('', <ldap.sasl.gssapi object at 0x7f61c3f4baf0>, None, None, 2)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INVALID_CREDENTIALS: {'result': 49, 'desc': 'Invalid credentials', 'ctrls': [], 'info': 'SASL(-1): generic failure: GSSAPI Error: An invalid name was supplied (Included profile file could not be read)'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INVALID_CREDENTIALS
-----------------------------Captured stdout setup------------------------------
Kerberos master password: sYtlIl3tRayMqDm2HhLDKI4IfxCJbf.e38Y9eiuV0gmnJ2tmMdeqPxaKNEBtAxhdE Loading random data Initializing database '/var/kerberos/krb5kdc/principal' for realm 'HOSTED.UPSHIFT.RDU2.REDHAT.COM', master key name 'K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM' Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. Principal "ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM" created. Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/changepw@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM kiprop/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM krbtgt/HOSTED.UPSHIFT.RDU2.REDHAT.COM@HOSTED.UPSHIFT.RDU2.REDHAT.COM ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. Entry for principal ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/etc/krb5.keytab. Entry for principal ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/etc/krb5.keytab. Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. Principal "testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM" created. Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/changepw@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM kiprop/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM krbtgt/HOSTED.UPSHIFT.RDU2.REDHAT.COM@HOSTED.UPSHIFT.RDU2.REDHAT.COM ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM Authenticating as principal root/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. Entry for principal testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/testuser.keytab. Entry for principal testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/testuser.keytab.
-----------------------------Captured stderr setup------------------------------
No policy specified for ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM; defaulting to no policy
No policy specified for testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM; defaulting to no policy
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
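
Editorial note: "Included profile file could not be read" comes from libkrb5 failing to load its profile (krb5.conf) rather than from Directory Server; both GSSAPI failures in this run show the same message. A minimal sketch of pointing the client process at a known-good profile and keytab before the SASL bind, using python-ldap's ldap.sasl.gssapi helper (the paths and URI are illustrative; /tmp/testuser.keytab matches the captured setup output):

import os
import ldap
import ldap.sasl

# Tell libkrb5 where the test's profile and client keytab live.
os.environ['KRB5_CONFIG'] = '/etc/krb5.conf'              # assumed path
os.environ['KRB5_CLIENT_KTNAME'] = '/tmp/testuser.keytab'

conn = ldap.initialize('ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901')
conn.sasl_interactive_bind_s('', ldap.sasl.gssapi())
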
Failed suites/gssapi/simple_gssapi_test.py::test_support_mech 0.28
topology_st_gssapi = <lib389.topologies.TopologyMain object at 0x7f61d07a45e0>
testuser = <lib389.idm.user.UserAccount object at 0x7f61d07a4340>

@gssapi_ack
def test_support_mech(topology_st_gssapi, testuser):
"""Test allowed sasl mechs works when GSSAPI is allowed

:id: 6ec80aca-00c4-4141-b96b-3ae8837fc751

:setup: standalone gssapi instance

:steps:
1. Add GSSAPI to allowed sasl mechanisms.
2. Attempt to bind
:expectedresults:
1. The allowed mechs are changed.
2. The bind succeeds.
"""
topology_st_gssapi.standalone.config.set('nsslapd-allowed-sasl-mechanisms', 'GSSAPI EXTERNAL ANONYMOUS')
> conn = testuser.bind_gssapi()

suites/gssapi/simple_gssapi_test.py:125:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/idm/account.py:258: in bind_gssapi
inst_clone.open(saslmethod='gssapi')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:995: in open
self.sasl_interactive_bind_s("", sasl_auth, escapehatch='i am sure')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:476: in sasl_interactive_bind_s
return self._ldap_call(self._l.sasl_interactive_bind_s,who,auth,RequestControlTuples(serverctrls),RequestControlTuples(clientctrls),sasl_flags)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61d08fa2e0>
func = <built-in method sasl_interactive_bind_s of LDAP object at 0x7f61d17580c0>
args = ('', <ldap.sasl.gssapi object at 0x7f61d174ed00>, None, None, 2)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INVALID_CREDENTIALS: {'result': 49, 'desc': 'Invalid credentials', 'ctrls': [], 'info': 'SASL(-1): generic failure: GSSAPI Error: An invalid name was supplied (Included profile file could not be read)'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INVALID_CREDENTIALS
Failed suites/healthcheck/health_security_test.py::test_healthcheck_certif_expiring_within_30d 11.09
topology_st = <lib389.topologies.TopologyMain object at 0x7f61d06d7640>

@pytest.mark.ds50873
@pytest.mark.bz1685160
@pytest.mark.xfail(ds_is_older("1.4.1"), reason="Not implemented")
def test_healthcheck_certif_expiring_within_30d(topology_st):
"""Check if HealthCheck returns DSCERTLE0001 code

:id: c2165032-88ba-4978-a4ca-2fecfd8c35d8
:setup: Standalone instance
:steps:
1. Create DS instance
2. Use libfaketime to tell the process the date is within 30 days before certificate expiration
3. Use HealthCheck without --json option
4. Use HealthCheck with --json option
:expectedresults:
1. Success
2. Success
3. Healthcheck reports DSCERTLE0001 code and related details
4. Healthcheck reports DSCERTLE0001 code and related details
"""

RET_CODE = 'DSCERTLE0001'

standalone = topology_st.standalone

standalone.enable_tls()

# Cert is valid for two years from today, so we pick a date within 30 days before certificate expiration
date_future = datetime.now() + timedelta(days=701)

with libfaketime.fake_time(date_future):
time.sleep(1)
> run_healthcheck_and_flush_log(topology_st, standalone, RET_CODE, json=False)

suites/healthcheck/health_security_test.py:304:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology = <lib389.topologies.TopologyMain object at 0x7f61d06d7640>
instance = <lib389.DirSrv object at 0x7f61d07cd700>
searched_code = 'DSCERTLE0001', json = False, searched_code2 = None

def run_healthcheck_and_flush_log(topology, instance, searched_code, json, searched_code2=None):
args = FakeArgs()
args.instance = instance.serverid
args.verbose = instance.verbose
args.list_errors = False
args.list_checks = False
args.check = ['config', 'encryption', 'tls', 'fschecks']
args.dry_run = False

if json:
log.info('Use healthcheck with --json option')
args.json = json
health_check_run(instance, topology.logcap.log, args)
assert topology.logcap.contains(searched_code)
log.info('Healthcheck returned searched code: %s' % searched_code)

if searched_code2 is not None:
assert topology.logcap.contains(searched_code2)
log.info('Healthcheck returned searched code: %s' % searched_code2)
else:
log.info('Use healthcheck without --json option')
args.json = json
health_check_run(instance, topology.logcap.log, args)
> assert topology.logcap.contains(searched_code)
E AssertionError: assert False
E + where False = <bound method LogCapture.contains of <LogCapture (NOTSET)>>('DSCERTLE0001')
E + where <bound method LogCapture.contains of <LogCapture (NOTSET)>> = <LogCapture (NOTSET)>.contains
E + where <LogCapture (NOTSET)> = <lib389.topologies.TopologyMain object at 0x7f61d06d7640>.logcap

suites/healthcheck/health_security_test.py:67: AssertionError
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ...
INFO  LogCapture:health.py:99 Checking config:hr_timestamp ...
INFO  LogCapture:health.py:99 Checking config:passwordscheme ...
INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ...
INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ...
INFO  LogCapture:health.py:99 Checking fschecks:file_perms ...
INFO  LogCapture:health.py:106 Healthcheck complete.
INFO  LogCapture:health.py:111 No issues found.
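
Editorial note: the 701-day offset follows from the certificate's assumed two-year lifetime; a worked check of that arithmetic (the 730-day lifetime is the assumption stated in the test's own comment):

from datetime import datetime, timedelta

cert_expiry = datetime.now() + timedelta(days=730)  # assumed 2-year cert
fake_now = datetime.now() + timedelta(days=701)     # date used by the test
remaining = cert_expiry - fake_now                  # 29 days
assert timedelta(0) < remaining < timedelta(days=30)
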
Failed suites/healthcheck/health_security_test.py::test_healthcheck_certif_expired 10.70
topology_st = <lib389.topologies.TopologyMain object at 0x7f61d06d7640>

@pytest.mark.ds50873
@pytest.mark.bz1685160
@pytest.mark.xfail(ds_is_older("1.4.1"), reason="Not implemented")
def test_healthcheck_certif_expired(topology_st):
"""Check if HealthCheck returns DSCERTLE0002 code

:id: ceff2c22-62c0-4fd9-b737-930a88458d68
:setup: Standalone instance
:steps:
1. Create DS instance
2. Use libfaketime to tell the process the date is after certificate expiration
3. Use HealthCheck without --json option
4. Use HealthCheck with --json option
:expectedresults:
1. Success
2. Success
3. Healthcheck reports DSCERTLE0002 code and related details
4. Healthcheck reports DSCERTLE0002 code and related details
"""

RET_CODE = 'DSCERTLE0002'

standalone = topology_st.standalone

standalone.enable_tls()

# Cert is valid for two years from today, so we pick a date after expiration
date_future = datetime.now() + timedelta(days=731)

with libfaketime.fake_time(date_future):
time.sleep(1)
> run_healthcheck_and_flush_log(topology_st, standalone, RET_CODE, json=False)

suites/healthcheck/health_security_test.py:343:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology = <lib389.topologies.TopologyMain object at 0x7f61d06d7640>
instance = <lib389.DirSrv object at 0x7f61d07cd700>
searched_code = 'DSCERTLE0002', json = False, searched_code2 = None

def run_healthcheck_and_flush_log(topology, instance, searched_code, json, searched_code2=None):
args = FakeArgs()
args.instance = instance.serverid
args.verbose = instance.verbose
args.list_errors = False
args.list_checks = False
args.check = ['config', 'encryption', 'tls', 'fschecks']
args.dry_run = False

if json:
log.info('Use healthcheck with --json option')
args.json = json
health_check_run(instance, topology.logcap.log, args)
assert topology.logcap.contains(searched_code)
log.info('Healthcheck returned searched code: %s' % searched_code)

if searched_code2 is not None:
assert topology.logcap.contains(searched_code2)
log.info('Healthcheck returned searched code: %s' % searched_code2)
else:
log.info('Use healthcheck without --json option')
args.json = json
health_check_run(instance, topology.logcap.log, args)
> assert topology.logcap.contains(searched_code)
E AssertionError: assert False
E + where False = <bound method LogCapture.contains of <LogCapture (NOTSET)>>('DSCERTLE0002')
E + where <bound method LogCapture.contains of <LogCapture (NOTSET)>> = <LogCapture (NOTSET)>.contains
E + where <LogCapture (NOTSET)> = <lib389.topologies.TopologyMain object at 0x7f61d06d7640>.logcap

suites/healthcheck/health_security_test.py:67: AssertionError
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ...
INFO  LogCapture:health.py:99 Checking config:hr_timestamp ...
INFO  LogCapture:health.py:99 Checking config:passwordscheme ...
INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ...
INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ...
INFO  LogCapture:health.py:99 Checking fschecks:file_perms ...
INFO  LogCapture:health.py:106 Healthcheck complete.
INFO  LogCapture:health.py:119 2 Issues found! Generating report ...
INFO  LogCapture:health.py:45 [1] DS Lint Error: DSCERTLE0001
INFO  LogCapture:health.py:46 --------------------------------------------------------------------------------
INFO  LogCapture:health.py:47 Severity: MEDIUM
INFO  LogCapture:health.py:49 Check: tls:certificate_expiration
INFO  LogCapture:health.py:50 Affects:
INFO  LogCapture:health.py:52 -- Expiring Certificate
INFO  LogCapture:health.py:53 Details:
INFO  LogCapture:health.py:54 -----------
INFO  LogCapture:health.py:55 The certificate (Self-Signed-CA) will expire in less than 30 days
INFO  LogCapture:health.py:56 Resolution:
INFO  LogCapture:health.py:57 -----------
INFO  LogCapture:health.py:58 Renew the certificate before it expires to prevent disruptions with TLS connections.
INFO  LogCapture:health.py:45 [2] DS Lint Error: DSCERTLE0001
INFO  LogCapture:health.py:46 --------------------------------------------------------------------------------
INFO  LogCapture:health.py:47 Severity: MEDIUM
INFO  LogCapture:health.py:49 Check: tls:certificate_expiration
INFO  LogCapture:health.py:50 Affects:
INFO  LogCapture:health.py:52 -- Expiring Certificate
INFO  LogCapture:health.py:53 Details:
INFO  LogCapture:health.py:54 -----------
INFO  LogCapture:health.py:55 The certificate (Server-Cert) will expire in less than 30 days
INFO  LogCapture:health.py:56 Resolution:
INFO  LogCapture:health.py:57 -----------
INFO  LogCapture:health.py:58 Renew the certificate before it expires to prevent disruptions with TLS connections.
INFO  LogCapture:health.py:124 ===== End Of Report (2 Issues found) =====
Failed suites/import/import_test.py::test_fast_slow_import 10.47
topo = <lib389.topologies.TopologyMain object at 0x7f61c3b82e20>
_toggle_private_import_mem = None, _import_clean = None

def test_fast_slow_import(topo, _toggle_private_import_mem, _import_clean):
"""With nsslapd-db-private-import-mem: on is faster import.

:id: 3044331c-9c0e-11ea-ac9f-8c16451d917b
:setup: Standalone Instance
:steps:
1. Let's set nsslapd-db-private-import-mem:on, nsslapd-import-cache-autosize: 0
2. Measure offline import time duration total_time1
3. Now nsslapd-db-private-import-mem:off
4. Measure offline import time duration total_time2
5. total_time1 < total_time2
6. Set nsslapd-db-private-import-mem:on, nsslapd-import-cache-autosize: -1
7. Measure offline import time duration total_time1
8. Now nsslapd-db-private-import-mem:off
9. Measure offline import time duration total_time2
10. total_time1 < total_time2
:expectedresults:
1. Operation successful
2. Operation successful
3. Operation successful
4. Operation successful
5. Operation successful
6. Operation successful
7. Operation successful
8. Operation successful
9. Operation successful
10. Operation successful
"""
# Let's set nsslapd-db-private-import-mem:on, nsslapd-import-cache-autosize: 0
config = LDBMConfig(topo.standalone)
# Measure offline import time duration total_time1
total_time1 = _import_offline(topo, 20)
# Now nsslapd-db-private-import-mem:off
config.replace('nsslapd-db-private-import-mem', 'off')
accounts = Accounts(topo.standalone, DEFAULT_SUFFIX)
for i in accounts.filter('(uid=*)'):
UserAccount(topo.standalone, i.dn).delete()
# Measure offline import time duration total_time2
total_time2 = _import_offline(topo, 20)
# total_time1 < total_time2
> assert total_time1 < total_time2
E assert 2.094937801361084 < 2.0365939140319824

suites/import/import_test.py:307: AssertionError
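
Editorial note: the assertion above failed by roughly 60 ms on an import of about 2 seconds, well inside run-to-run noise. A minimal sketch of a steadier comparison that re-runs each measurement and compares medians (_import_offline and the entry count come from the test itself; the median wrapper is an editorial suggestion, not suite code):

from statistics import median

def median_import_time(topo, entries, runs=3):
    """Median of several offline import runs, to damp scheduling noise."""
    return median(_import_offline(topo, entries) for _ in range(runs))

total_time1 = median_import_time(topo, 20)  # nsslapd-db-private-import-mem: on
# ... flip nsslapd-db-private-import-mem to 'off' and delete the imported users ...
total_time2 = median_import_time(topo, 20)  # nsslapd-db-private-import-mem: off
assert total_time1 < total_time2
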
Failed suites/paged_results/paged_results_test.py::test_search_paged_limits[conf_attr_values1-PASS] 5.57
topology_st = <lib389.topologies.TopologyMain object at 0x7f61d1361310>
create_user = <lib389.idm.user.UserAccount object at 0x7f61d0ef5970>
conf_attr_values = ('5000', '120', '122'), expected_rs = 'PASS'

@pytest.mark.parametrize('conf_attr_values,expected_rs',
((('5000', '100', '100'), ldap.ADMINLIMIT_EXCEEDED),
(('5000', '120', '122'), 'PASS')))
def test_search_paged_limits(topology_st, create_user, conf_attr_values, expected_rs):
"""Verify that nsslapd-idlistscanlimit and
nsslapd-lookthroughlimit can limit the administrator
search abilities.

:id: e0f8b916-7276-4bd3-9e73-8696a4468811
:parametrized: yes
:setup: Standalone instance, test user for binding,
10 users for the search base
:steps:
1. Set nsslapd-sizelimit and nsslapd-pagedsizelimit to 5000
2. Set nsslapd-idlistscanlimit: 120
3. Set nsslapd-lookthroughlimit: 122
4. Bind as test user
5. Search through added users with a simple paged control
using page_size = 10
6. Bind as Directory Manager
7. Set nsslapd-idlistscanlimit: 100
8. Set nsslapd-lookthroughlimit: 100
9. Bind as test user
10. Search through added users with a simple paged control
using page_size = 10
:expectedresults:
1. nsslapd-sizelimit and nsslapd-pagedsizelimit
should be successfully set
2. nsslapd-idlistscanlimit should be successfully set
3. nsslapd-lookthroughlimit should be successfully set
4. Bind should be successful
5. No error happens, all users should be found
6. Bind should be successful
7. nsslapd-idlistscanlimit should be successfully set
8. nsslapd-lookthroughlimit should be successfully set
9. Bind should be successful
10. It should throw ADMINLIMIT_EXCEEDED exception
"""

users_num = 101
page_size = 10
users_list = add_users(topology_st, users_num, DEFAULT_SUFFIX)
search_flt = r'(uid=test*)'
searchreq_attrlist = ['dn', 'sn']
size_attr_bck = change_conf_attr(topology_st, DN_CONFIG, 'nsslapd-sizelimit', conf_attr_values[0])
pagedsize_attr_bck = change_conf_attr(topology_st, DN_CONFIG, 'nsslapd-pagedsizelimit', conf_attr_values[0])
idlistscan_attr_bck = change_conf_attr(topology_st, 'cn=config,%s' % DN_LDBM, 'nsslapd-idlistscanlimit', conf_attr_values[1])
lookthrough_attr_bck = change_conf_attr(topology_st, 'cn=config,%s' % DN_LDBM, 'nsslapd-lookthroughlimit', conf_attr_values[2])

try:
log.info('Set user bind')
conn = create_user.bind(TEST_USER_PWD)

req_ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
controls = [req_ctrl]

if expected_rs == ldap.ADMINLIMIT_EXCEEDED:
log.info('Expect to fail with ADMINLIMIT_EXCEEDED')
with pytest.raises(expected_rs):
all_results = paged_search(conn, DEFAULT_SUFFIX, controls, search_flt, searchreq_attrlist)
elif expected_rs == 'PASS':
log.info('Expect to pass')
> all_results = paged_search(conn, DEFAULT_SUFFIX, controls, search_flt, searchreq_attrlist)

suites/paged_results/paged_results_test.py:901:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
suites/paged_results/paged_results_test.py:200: in paged_search
rtype, rdata, rmsgid, rctrls = conn.result3(msgid)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c36a66a0>
func = <built-in method result4 of LDAP object at 0x7f61c3725990>
args = (12, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.ADMINLIMIT_EXCEEDED: {'msgtype': 100, 'msgid': 12, 'result': 11, 'desc': 'Administrative limit exceeded', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: ADMINLIMIT_EXCEEDED
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to 5000. Previous value - b'2000'. Modified suffix - cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5000. Previous value - b'0'. Modified suffix - cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 120. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 122. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:889 Set user bind
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:900 Expect to pass
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3725820>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to b'2000'. Previous value - b'5000'. Modified suffix - cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5000'. Modified suffix - cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'122'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'120'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
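
Editorial note: for reference, paged_search() drives the standard python-ldap simple-paged-results loop; a minimal self-contained sketch of that pattern (the base, filter and page size mirror the test, the rest is generic python-ldap):

import ldap
from ldap.controls import SimplePagedResultsControl

def paged_search_all(conn, base='dc=example,dc=com', flt='(uid=test*)', page_size=10):
    """Fetch every entry matching flt, one page at a time."""
    ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
    entries = []
    while True:
        msgid = conn.search_ext(base, ldap.SCOPE_SUBTREE, flt, serverctrls=[ctrl])
        rtype, rdata, rmsgid, rctrls = conn.result3(msgid)
        entries.extend(rdata)
        # The server returns a non-empty cookie while more pages remain.
        pctrls = [c for c in rctrls
                  if c.controlType == SimplePagedResultsControl.controlType]
        if not pctrls or not pctrls[0].cookie:
            break
        ctrl.cookie = pctrls[0].cookie
    return entries
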
Failed suites/paged_results/paged_results_test.py::test_search_paged_user_limits[conf_attr_values1-PASS] 4.83
topology_st = <lib389.topologies.TopologyMain object at 0x7f61d1361310>
create_user = <lib389.idm.user.UserAccount object at 0x7f61d0ef5970>
conf_attr_values = ('1000', '120', '122'), expected_rs = 'PASS'

@pytest.mark.parametrize('conf_attr_values,expected_rs',
((('1000', '100', '100'), ldap.ADMINLIMIT_EXCEEDED),
(('1000', '120', '122'), 'PASS')))
def test_search_paged_user_limits(topology_st, create_user, conf_attr_values, expected_rs):
"""Verify that nsPagedIDListScanLimit and nsPagedLookthroughLimit
override nsslapd-idlistscanlimit and nsslapd-lookthroughlimit
while performing search with the simple paged results control.

:id: 69e393e9-1ab8-4f4e-b4a1-06ca63dc7b1b
:parametrized: yes
:setup: Standalone instance, test user for binding,
10 users for the search base
:steps:
1. Set nsslapd-idlistscanlimit: 1000
2. Set nsslapd-lookthroughlimit: 1000
3. Set nsPagedIDListScanLimit: 120
4. Set nsPagedLookthroughLimit: 122
5. Bind as test user
6. Search through added users with a simple paged control
using page_size = 10
7. Bind as Directory Manager
8. Set nsPagedIDListScanLimit: 100
9. Set nsPagedLookthroughLimit: 100
10. Bind as test user
11. Search through added users with a simple paged control
using page_size = 10
:expectedresults:
1. nsslapd-idlistscanlimit should be successfully set
2. nsslapd-lookthroughlimit should be successfully set
3. nsPagedIDListScanLimit should be successfully set
4. nsPagedLookthroughLimit should be successfully set
5. Bind should be successful
6. No error happens, all users should be found
7. Bind should be successful
8. nsPagedIDListScanLimit should be successfully set
9. nsPagedLookthroughLimit should be successfully set
10. Bind should be successful
11. It should throw ADMINLIMIT_EXCEEDED exception
"""

users_num = 101
page_size = 10
users_list = add_users(topology_st, users_num, DEFAULT_SUFFIX)
search_flt = r'(uid=test*)'
searchreq_attrlist = ['dn', 'sn']
lookthrough_attr_bck = change_conf_attr(topology_st, 'cn=config,%s' % DN_LDBM, 'nsslapd-lookthroughlimit', conf_attr_values[0])
idlistscan_attr_bck = change_conf_attr(topology_st, 'cn=config,%s' % DN_LDBM, 'nsslapd-idlistscanlimit', conf_attr_values[0])
user_idlistscan_attr_bck = change_conf_attr(topology_st, create_user.dn, 'nsPagedIDListScanLimit', conf_attr_values[1])
user_lookthrough_attr_bck = change_conf_attr(topology_st, create_user.dn, 'nsPagedLookthroughLimit', conf_attr_values[2])

try:
log.info('Set user bind')
conn = create_user.bind(TEST_USER_PWD)

req_ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
controls = [req_ctrl]

if expected_rs == ldap.ADMINLIMIT_EXCEEDED:
log.info('Expect to fail with ADMINLIMIT_EXCEEDED')
with pytest.raises(expected_rs):
all_results = paged_search(conn, DEFAULT_SUFFIX, controls, search_flt, searchreq_attrlist)
elif expected_rs == 'PASS':
log.info('Expect to pass')
> all_results = paged_search(conn, DEFAULT_SUFFIX, controls, search_flt, searchreq_attrlist)

suites/paged_results/paged_results_test.py:975:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
suites/paged_results/paged_results_test.py:200: in paged_search
rtype, rdata, rmsgid, rctrls = conn.result3(msgid)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61d04f9490>
func = <built-in method result4 of LDAP object at 0x7f61c3717090>
args = (12, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.ADMINLIMIT_EXCEEDED: {'msgtype': 100, 'msgid': 12, 'result': 11, 'desc': 'Administrative limit exceeded', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: ADMINLIMIT_EXCEEDED
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 1000. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 1000. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to 120. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to 122. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:963 Set user bind
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:974 Expect to pass
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3880370>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to None. Previous value - b'120'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to None. Previous value - b'122'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
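For reference, the paged_search helper named in this traceback drives a loop like the one below. This is a minimal sketch built directly on python-ldap's SimplePagedResultsControl; the suite's actual helper adds the logging seen above:

import ldap
from ldap.controls import SimplePagedResultsControl

def paged_search(conn, basedn, filt, attrlist, page_size=10):
    """Collect all entries matching 'filt' using simple paged results."""
    req_ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
    results = []
    while True:
        msgid = conn.search_ext(basedn, ldap.SCOPE_SUBTREE, filt,
                                attrlist, serverctrls=[req_ctrl])
        rtype, rdata, rmsgid, rctrls = conn.result3(msgid)
        results.extend(rdata)
        # The server returns the next-page cookie in the response control;
        # an empty cookie means the result set is exhausted.
        pctrls = [c for c in rctrls
                  if c.controlType == SimplePagedResultsControl.controlType]
        if pctrls and pctrls[0].cookie:
            req_ctrl.cookie = pctrls[0].cookie
        else:
            return results

Note that conn.result3() raises ldap.ADMINLIMIT_EXCEEDED mid-loop when a lookthrough or ID-list limit is hit, which is exactly where this test failed.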
Failed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_complex_add_modify_modrdn_delete 88.36
self = <tests.suites.replication.conflict_resolve_test.TestTwoMasters object at 0x7f61c33aea00>
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2f65f40>
base_m2 = <lib389.idm.nscontainer.nsContainer object at 0x7f61c33ae760>

def test_complex_add_modify_modrdn_delete(self, topology_m2, base_m2):
"""Check that conflict properly resolved for complex operations
which involve add, modify, modrdn and delete

:id: 77f09b18-03d1-45da-940b-1ad2c2908eb1
:setup: Two master replication, test container for entries, enable plugin logging,
audit log, error log for replica and access log for internal
:steps:
1. Add ten users to m1 and wait for replication to happen
2. Pause replication
3. Test add-del on m1 and add on m2
4. Test add-mod on m1 and add on m2
5. Test add-modrdn on m1 and add on m2
6. Test multiple add, modrdn
7. Test Add-del on both masters
8. Test modrdn-modrdn
9. Test modrdn-del
10. Resume replication
11. Check that the entries on both masters are the same and replication is working
:expectedresults:
1. It should pass
2. It should pass
3. It should pass
4. It should pass
5. It should pass
6. It should pass
7. It should pass
8. It should pass
9. It should pass
10. It should pass
11. It should pass
"""

M1 = topology_m2.ms["master1"]
M2 = topology_m2.ms["master2"]

test_users_m1 = UserAccounts(M1, base_m2.dn, rdn=None)
test_users_m2 = UserAccounts(M2, base_m2.dn, rdn=None)
repl = ReplicationManager(SUFFIX)

for user_num in range(1100, 1110):
_create_user(test_users_m1, user_num)

repl.test_replication(M1, M2)
topology_m2.pause_all_replicas()

log.info("Test add-del on M1 and add on M2")
user_num += 1
_create_user(test_users_m1, user_num)
_delete_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)

user_num += 1
_create_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)
_delete_user(test_users_m1, user_num, sleep=True)

user_num += 1
_create_user(test_users_m2, user_num, sleep=True)
_create_user(test_users_m1, user_num)
_delete_user(test_users_m1, user_num)

log.info("Test add-mod on M1 and add on M2")
user_num += 1
_create_user(test_users_m1, user_num)
_modify_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)

user_num += 1
_create_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)
_modify_user(test_users_m1, user_num, sleep=True)

user_num += 1
_create_user(test_users_m2, user_num, sleep=True)
_create_user(test_users_m1, user_num)
_modify_user(test_users_m1, user_num)

log.info("Test add-modrdn on M1 and add on M2")
user_num += 1
_create_user(test_users_m1, user_num)
_rename_user(test_users_m1, user_num, user_num+20, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)

user_num += 1
_create_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)
_rename_user(test_users_m1, user_num, user_num+20, sleep=True)

user_num += 1
_create_user(test_users_m2, user_num, sleep=True)
_create_user(test_users_m1, user_num)
_rename_user(test_users_m1, user_num, user_num+20)

log.info("Test multiple add, modrdn")
user_num += 1
_create_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num, sleep=True)
_rename_user(test_users_m1, user_num, user_num+20)
_create_user(test_users_m1, user_num, sleep=True)
_modify_user(test_users_m2, user_num, sleep=True)

log.info("Add - del on both masters")
user_num += 1
_create_user(test_users_m1, user_num)
_delete_user(test_users_m1, user_num, sleep=True)
_create_user(test_users_m2, user_num)
_delete_user(test_users_m2, user_num, sleep=True)

log.info("Test modrdn - modrdn")
user_num += 1
_rename_user(test_users_m1, 1109, 1129, sleep=True)
_rename_user(test_users_m2, 1109, 1129, sleep=True)

log.info("Test modrdn - del")
user_num += 1
_rename_user(test_users_m1, 1100, 1120, sleep=True)
_delete_user(test_users_m2, 1100)

user_num += 1
_delete_user(test_users_m2, 1101, sleep=True)
_rename_user(test_users_m1, 1101, 1121)

topology_m2.resume_all_replicas()

repl.test_replication_topology(topology_m2)
time.sleep(30)

user_dns_m1 = [user.dn for user in test_users_m1.list()]
user_dns_m2 = [user.dn for user in test_users_m2.list()]
> assert set(user_dns_m1) == set(user_dns_m2)
E AssertionError: assert {'uid=test_us...,dc=com', ...} == {'uid=test_us...,dc=com', ...}
E Extra items in the left set:
E 'uid=test_user_1112,cn=test_container,dc=example,dc=com'
E 'uid=test_user_1111,cn=test_container,dc=example,dc=com'
E 'uid=test_user_1117,cn=test_container,dc=example,dc=com'
E Full diff:
E {
E 'uid=test_user_1102,cn=test_container,dc=example,dc=com',...
E
E ...Full output truncated (24 lines hidden), use '-vv' to show

suites/replication/conflict_resolve_test.py:369: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18329f5-f189-414f-9b66-9e7affe2a14b / got description=d01c440b-0b62-4d1c-96b4-cfee0878540b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18329f5-f189-414f-9b66-9e7affe2a14b / got description=d01c440b-0b62-4d1c-96b4-cfee0878540b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18329f5-f189-414f-9b66-9e7affe2a14b / got description=d01c440b-0b62-4d1c-96b4-cfee0878540b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18329f5-f189-414f-9b66-9e7affe2a14b / got description=d01c440b-0b62-4d1c-96b4-cfee0878540b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e18329f5-f189-414f-9b66-9e7affe2a14b / got description=d01c440b-0b62-4d1c-96b4-cfee0878540b)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:285 Test add-del on M1 and add on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:301 Test add-mod on M1 and add on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:317 Test add-modrdn on M1 and add on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:333 Test multiple add, modrdn
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:341 Add - del on both masters
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:348 Test modrdn - modrdn
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:353 Test modrdn - del
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 0c268b80-f2e8-41ad-9de5-522fb814eb39 / got description=e18329f5-f189-414f-9b66-9e7affe2a14b)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect f2e221f8-3b1d-4d9d-b66c-a3843bebe85a / got description=0c268b80-f2e8-41ad-9de5-522fb814eb39)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
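The _create_user/_modify_user/_rename_user/_delete_user helpers used above are local to conflict_resolve_test.py. A rough sketch of their shape, assuming lib389's UserAccounts API (the property set and sleep handling here are illustrative, not the suite's exact code):

import time
from lib389.idm.user import UserAccounts

def _create_user(users, num, sleep=False):
    # 'users' is a UserAccounts collection bound to the test container
    users.create(properties={
        'uid': 'test_user_%s' % num,
        'cn': 'test_user_%s' % num,
        'sn': 'user',
        'uidNumber': str(num),
        'gidNumber': str(num),
        'homeDirectory': '/home/test_user_%s' % num,
    })
    if sleep:
        time.sleep(1)   # keep CSNs apart so conflict ordering is deterministic

def _rename_user(users, old_num, new_num, sleep=False):
    user = users.get('test_user_%s' % old_num)
    user.rename('uid=test_user_%s' % new_num)
    if sleep:
        time.sleep(1)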
Failed suites/schema/schema_reload_test.py::test_schema_operation 2.19
topo = <lib389.topologies.TopologyMain object at 0x7f61c35d4460>

def test_schema_operation(topo):
"""Test that the cases in original schema are preserved.
Test that duplicated schema except cases are not loaded
Test to use a custom schema

:id: e7448863-ac62-4b49-b013-4efa412c0455
:setup: Standalone instance
:steps:
1. Create a test schema with cases
2. Run a schema_reload task
3. Check the attribute is present
4. Case 2: Check duplicated schema except cases are not loaded
5. Case 2-1: Use the custom schema

:expectedresults:
1. Operation should be successful
2. Operation should be successful
3. Operation should be successful
4. Operation should be successful
5. Operation should be successful
"""

log.info('case 1: Test the cases in the original schema are preserved.')

schema_filename = topo.standalone.schemadir + '/98test.ldif'
try:
with open(schema_filename, "w") as schema_file:
schema_file.write("dn: cn=schema\n")
schema_file.write("attributetypes: ( 8.9.10.11.12.13.14 NAME " +
"'MoZiLLaaTTRiBuTe' SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 " +
" X-ORIGIN 'Mozilla Dummy Schema' )\n")
schema_file.write("objectclasses: ( 1.2.3.4.5.6.7 NAME 'MozillaObject' " +
"SUP top MUST ( objectclass $ cn ) MAY ( MoZiLLaaTTRiBuTe )" +
" X-ORIGIN 'user defined' )')\n")

except OSError as e:
log.fatal("Failed to create schema file: " +
"{} Error: {}".format(schema_filename, str(e)))


# run the schema reload task with the default schemadir
schema = Schema(topo.standalone)
task = schema.reload(schema_dir=topo.standalone.schemadir)
task.wait()

subschema = topo.standalone.schema.get_subschema()
at_obj = subschema.get_obj(ldap.schema.AttributeType, 'MoZiLLaaTTRiBuTe')

> assert at_obj is not None, "The attribute was not found on server"
E AssertionError: The attribute was not found on server
E assert None is not None

suites/schema/schema_reload_test.py:120: AssertionError
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:94 case 1: Test the cases in the original schema are preserved.
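The assertion above inspects the client-side subschema immediately after the reload task returns. Because the reload runs server-side, a short retry loop is a more forgiving check. A minimal sketch, using the same schema accessor shown in the traceback:

import time
import ldap.schema

def wait_for_attribute(inst, attr_name, timeout=30):
    """Poll the server's subschema until attr_name appears or we time out."""
    for _ in range(timeout):
        subschema = inst.schema.get_subschema()
        if subschema.get_obj(ldap.schema.AttributeType, attr_name) is not None:
            return True
        time.sleep(1)
    return False

# e.g. assert wait_for_attribute(topo.standalone, 'MoZiLLaaTTRiBuTe')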
Failed suites/schema/schema_reload_test.py::test_valid_schema 2.02
topo = <lib389.topologies.TopologyMain object at 0x7f61c35d4460>

def test_valid_schema(topo):
"""Test schema-reload task with valid schema

:id: 2ab304c0-3e58-4d34-b23b-a14b5997c7a8
:setup: Standalone instance
:steps:
1. Create schema file with valid schema
2. Run schema-reload.pl script
3. Run ldapsearch and check if schema was added
:expectedresults:
1. File creation should work
2. The schema reload task should be successful
3. Searching the server should return the new schema
"""

log.info("Test schema-reload task with valid schema")

# Step 1 - Create schema file
log.info("Create valid schema file (99user.ldif)...")
schema_filename = (topo.standalone.schemadir + "/99user.ldif")
try:
with open(schema_filename, 'w') as schema_file:
schema_file.write("dn: cn=schema\n")
schema_file.write("attributetypes: ( 8.9.10.11.12.13.13 NAME " +
"'ValidAttribute' SYNTAX 1.3.6.1.4.1.1466.115.121.1.15" +
" X-ORIGIN 'Mozilla Dummy Schema' )\n")
schema_file.write("objectclasses: ( 1.2.3.4.5.6.7.8 NAME 'TestObject' " +
"SUP top MUST ( objectclass $ cn ) MAY ( givenName $ " +
"sn $ ValidAttribute ) X-ORIGIN 'user defined' )')\n")
except OSError as e:
log.fatal("Failed to create schema file: " +
"{} Error: {}".format(schema_filename, str(e)))

# Step 2 - Run the schema-reload task
log.info("Run the schema-reload task...")
schema = Schema(topo.standalone)
task = schema.reload(schema_dir=topo.standalone.schemadir)
task.wait()
> assert task.get_exit_code() == 0, "The schema reload task failed"
E AssertionError: The schema reload task failed
E assert 65 == 0
E +65
E -0

suites/schema/schema_reload_test.py:207: AssertionError
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:184 Test schema-reload task with valid schema
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:187 Create valid schema file (99user.ldif)...
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:203 Run the schema-reload task...
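When the reload task exits non-zero (65 here), the server's error log normally names the schema line it rejected. A small debugging aid, assuming lib389's ds_error_log accessor (a sketch, not part of the suite):

def report_schema_reload_failure(inst, task):
    """Surface error-log lines related to a failed schema reload task."""
    exit_code = task.get_exit_code()
    if exit_code != 0:
        # Pull any schema-parsing complaints out of the instance error log
        errors = inst.ds_error_log.match('.*schema.*')
        raise AssertionError("Schema reload failed (exit code %s): %s"
                             % (exit_code, errors))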
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_cookie 0.00
topology = <lib389.topologies.TopologyMain object at 0x7f61c31ae730>
request = <FixtureRequest for <Function test_sync_repl_cookie>>

def test_sync_repl_cookie(topology, request):
"""Test sync_repl cookie are progressing is an increasing order
when there are nested updates

:id: d7fbde25-5702-46ac-b38e-169d7a68e97c
:setup: Standalone Instance
:steps:
1.: enable retroCL
2.: configure retroCL to log nsuniqueid as targetUniqueId
3.: enable content_sync plugin
4.: enable automember
5.: create (2) groups. A few groups help to reproduce the concurrent updates problem.
6.: configure automember to provision those groups with 'member'
7.: enable and configure memberof plugin
8.: enable plugin log level
9.: restart the server
10.: create a thread dedicated to run a sync repl client
11.: Create (9) users that will generate nested updates (automember/memberof)
12.: stop sync repl client and collect the list of cookie.change_no
13.: check that cookies.change_no are in increasing order
:expectedresults:
1.: succeeds
2.: succeeds
3.: succeeds
4.: succeeds
5.: succeeds
6.: succeeds
7.: succeeds
8.: succeeds
9.: succeeds
10.: succeeds
11.: succeeds
12.: succeeds
13.: succeeds
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
> plugin.disable()

suites/syncrepl_plugin/basic_test.py:275:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:63: in disable
self.set('nsslapd-pluginEnabled', 'off')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.plugins.RetroChangelogPlugin object at 0x7f61c3137d90>
key = 'nsslapd-pluginEnabled', value = 'off', action = 2

def set(self, key, value, action=ldap.MOD_REPLACE):
"""Perform a specified action on a key with value

:param key: an attribute name
:type key: str
:param value: an attribute value
:type value: str
:param action: - ldap.MOD_REPLACE - by default
- ldap.MOD_ADD
- ldap.MOD_DELETE
:type action: int

:returns: result of modify_s operation
:raises: ValueError - if instance is not online
"""

if action == ldap.MOD_ADD:
action_txt = "ADD"
elif action == ldap.MOD_REPLACE:
action_txt = "REPLACE"
elif action == ldap.MOD_DELETE:
action_txt = "DELETE"
else:
# This should never happen (bug!)
action_txt = "UNKNOWN"

if value is None or len(value) < 512:
self._log.debug("%s set %s: (%r, %r)" % (self._dn, action_txt, key, display_log_value(key, value)))
else:
self._log.debug("%s set %s: (%r, value too large)" % (self._dn, action_txt, key))
if self._instance.state != DIRSRV_STATE_ONLINE:
> raise ValueError("Invalid state. Cannot set properties on instance that is not ONLINE.")
E ValueError: Invalid state. Cannot set properties on instance that is not ONLINE.

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:438: ValueError
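This failure, and the two identical ones that follow, share a single root cause: the fixture reaches plugin.disable() while the instance is not in the ONLINE state, so lib389's set() refuses the modify. A minimal guard, assuming lib389's standard DirSrv lifecycle API (a sketch of the pattern, not the suite's fix):

from lib389.plugins import RetroChangelogPlugin

def configure_retrocl(inst):
    # lib389 raises ValueError on any set() unless the instance state
    # is DIRSRV_STATE_ONLINE, so make sure the server is up first.
    if not inst.status():
        inst.start()
    plugin = RetroChangelogPlugin(inst)
    plugin.disable()
    plugin.enable()
    inst.restart()   # plugin enable/disable takes effect on restart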
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_cookie_add_del 0.00
topology = <lib389.topologies.TopologyMain object at 0x7f61c31ae730>
request = <FixtureRequest for <Function test_sync_repl_cookie_add_del>>

def test_sync_repl_cookie_add_del(topology, request):
"""Test sync_repl cookie are progressing is an increasing order
when there add and del

:id: 83e11038-6ed0-4a5b-ac77-e44887ab11e3
:setup: Standalone Instance
:steps:
1.: enable retroCL
2.: configure retroCL to log nsuniqueid as targetUniqueId
3.: enable content_sync plugin
4.: enable automember
5.: create (2) groups. A few groups help to reproduce the concurrent updates problem.
6.: configure automember to provision those groups with 'member'
7.: enable and configure memberof plugin
8.: enable plugin log level
9.: restart the server
10.: create a thread dedicated to run a sync repl client
11.: Create (3) users that will generate nested updates (automember/memberof)
12.: Delete (3) users
13.: stop sync repl client and collect the list of cookie.change_no
14.: check that cookies.change_no are in increasing order
:expectedresults:
1.: succeeds
2.: succeeds
3.: succeeds
4.: succeeds
5.: succeeds
6.: succeeds
7.: succeeds
8.: succeeds
9.: succeeds
10.: succeeds
11.: succeeds
12.: succeeds
13.: succeeds
14.: succeeds
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
> plugin.disable()

suites/syncrepl_plugin/basic_test.py:407:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:63: in disable
self.set('nsslapd-pluginEnabled', 'off')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.plugins.RetroChangelogPlugin object at 0x7f61c315f3d0>
key = 'nsslapd-pluginEnabled', value = 'off', action = 2

def set(self, key, value, action=ldap.MOD_REPLACE):
"""Perform a specified action on a key with value

:param key: an attribute name
:type key: str
:param value: an attribute value
:type value: str
:param action: - ldap.MOD_REPLACE - by default
- ldap.MOD_ADD
- ldap.MOD_DELETE
:type action: int

:returns: result of modify_s operation
:raises: ValueError - if instance is not online
"""

if action == ldap.MOD_ADD:
action_txt = "ADD"
elif action == ldap.MOD_REPLACE:
action_txt = "REPLACE"
elif action == ldap.MOD_DELETE:
action_txt = "DELETE"
else:
# This should never happen (bug!)
action_txt = "UNKNOWN"

if value is None or len(value) < 512:
self._log.debug("%s set %s: (%r, %r)" % (self._dn, action_txt, key, display_log_value(key, value)))
else:
self._log.debug("%s set %s: (%r, value too large)" % (self._dn, action_txt, key))
if self._instance.state != DIRSRV_STATE_ONLINE:
> raise ValueError("Invalid state. Cannot set properties on instance that is not ONLINE.")
E ValueError: Invalid state. Cannot set properties on instance that is not ONLINE.

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:438: ValueError
Failed suites/syncrepl_plugin/basic_test.py::test_sync_repl_cookie_with_failure 0.00
topology = <lib389.topologies.TopologyMain object at 0x7f61c31ae730>
request = <FixtureRequest for <Function test_sync_repl_cookie_with_failure>>

def test_sync_repl_cookie_with_failure(topology, request):
"""Test sync_repl cookie are progressing is the right order
when there is a failure in nested updates

:id: e0103448-170e-4080-8f22-c34606447ce2
:setup: Standalone Instance
:steps:
1.: enable retroCL
2.: configure retroCL to log nsuniqueid as targetUniqueId
3.: enable content_sync plugin
4.: enable automember
5.: create (4) groups.
make group2 a groupOfUniqueNames so the automember plugin
will fail to add 'member' (uniqueMember expected)
6.: configure automember to provision those groups with 'member'
7.: enable and configure memberof plugin
8.: enable plugin log level
9.: restart the server
10.: create a thread dedicated to run a sync repl client
11.: Create a group that will be the only update received by sync repl client
12.: Create (9) users that will generate nested updates (automember/memberof)
13.: stop sync repl client and collect the list of cookie.change_no
14.: check that the list of cookie.change_no contains only the group 'step 11'
:expectedresults:
1.: succeeds
2.: succeeds
3.: succeeds
4.: succeeds
5.: succeeds
6.: succeeds
7.: succeeds
8.: succeeds
9.: succeeds
10.: succeeds
11.: succeeds
12.: Fails (expected)
13.: succeeds
14.: succeeds
"""
inst = topology[0]

# Enable/configure retroCL
plugin = RetroChangelogPlugin(inst)
> plugin.disable()

suites/syncrepl_plugin/basic_test.py:539:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:63: in disable
self.set('nsslapd-pluginEnabled', 'off')
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.plugins.RetroChangelogPlugin object at 0x7f61c321c370>
key = 'nsslapd-pluginEnabled', value = 'off', action = 2

def set(self, key, value, action=ldap.MOD_REPLACE):
"""Perform a specified action on a key with value

:param key: an attribute name
:type key: str
:param value: an attribute value
:type value: str
:param action: - ldap.MOD_REPLACE - by default
- ldap.MOD_ADD
- ldap.MOD_DELETE
:type action: int

:returns: result of modify_s operation
:raises: ValueError - if instance is not online
"""

if action == ldap.MOD_ADD:
action_txt = "ADD"
elif action == ldap.MOD_REPLACE:
action_txt = "REPLACE"
elif action == ldap.MOD_DELETE:
action_txt = "DELETE"
else:
# This should never happen (bug!)
action_txt = "UNKNOWN"

if value is None or len(value) < 512:
self._log.debug("%s set %s: (%r, %r)" % (self._dn, action_txt, key, display_log_value(key, value)))
else:
self._log.debug("%s set %s: (%r, value too large)" % (self._dn, action_txt, key))
if self._instance.state != DIRSRV_STATE_ONLINE:
> raise ValueError("Invalid state. Cannot set properties on instance that is not ONLINE.")
E ValueError: Invalid state. Cannot set properties on instance that is not ONLINE.

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:438: ValueError
Failed suites/vlv/regression_test.py::test_bulk_import_when_the_backend_with_vlv_was_recreated 0.42
self = <lib389.mappingTree.MappingTreeLegacy object at 0x7f61c33aad00>
suffix = 'dc=example,dc=com', bename = 'userRoot', parent = None

def create(self, suffix=None, bename=None, parent=None):
'''
Create a mapping tree entry (under "cn=mapping tree,cn=config")
for the 'suffix' that is stored in the 'bename' backend.
The 'bename' backend must exist before creating the mapping tree entry.

If a 'parent' is provided that means that we are creating a
sub-suffix mapping tree.

@param suffix - suffix mapped by this mapping tree entry. It will
be the common name ('cn') of the entry
@param bename - backend common name (e.g. 'userRoot')
@param parent - if provided, the parent suffix of 'suffix'

@return DN of the mapping tree entry

@raise ldap.NO_SUCH_OBJECT - if the backend entry or parent mapping
tree does not exist
ValueError - if a parameter is missing

'''
# Check suffix is provided
if not suffix:
raise ValueError("suffix is mandatory")
else:
nsuffix = normalizeDN(suffix)

# Check backend name is provided
if not bename:
raise ValueError("backend name is mandatory")

# If a parent suffix is provided, check that a mapping
# tree exists for it
if parent:
nparent = normalizeDN(parent)
filt = suffixfilt(parent)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
pass
except NoSuchEntryError:
raise ValueError("parent suffix has no mapping tree")
else:
nparent = ""

# Check if suffix exists, return
filt = suffixfilt(suffix)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
return entry
except ldap.NO_SUCH_OBJECT:
entry = None

#
# Now start the real work
#

# FIXME: update this when we can actually use escaped DNs
dn = ','.join(('cn="%s"' % nsuffix, DN_MAPPING_TREE))
entry = Entry(dn)
entry.update({
'objectclass': ['top', 'extensibleObject', MT_OBJECTCLASS_VALUE],
'nsslapd-state': 'backend',
# the value in the dn has to be DN escaped
# internal code will add the quoted value - unquoted value is
# useful for searching.
MT_PROPNAME_TO_ATTRNAME[MT_SUFFIX]: nsuffix,
MT_PROPNAME_TO_ATTRNAME[MT_BACKEND]: bename
})

# possibly add the parent
if parent:
entry.setValues(MT_PROPNAME_TO_ATTRNAME[MT_PARENT_SUFFIX], nparent)

try:
self.log.debug("Creating entry: %s", entry.dn)
self.log.info("Entry %r", entry)
> self.conn.add_s(entry)

/usr/local/lib/python3.8/site-packages/lib389/mappingTree.py:155:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (dn: cn="dc=example,dc=com",cn=mapping tree,cn=config
cn: dc=example,dc=com
nsslapd-backend: userRoot
nsslapd-state: backend
objectclass: top
objectclass: extensibleObject
objectclass: nsMappingTree

,)
kwargs = {}
c_stack = [FrameInfo(frame=<frame at 0x7f61c33c6040, file '/usr/local/lib/python3.8/site-packages/lib389/__init__.py', line 176,...mbda>', code_context=[' self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(\n'], index=0), ...]
frame = FrameInfo(frame=<frame at 0x5576b77742c0, file '/usr/local/lib/python3.8/site-packages/lib389/mappingTree.py', line 15.../lib389/mappingTree.py', lineno=155, function='create', code_context=[' self.conn.add_s(entry)\n'], index=0)
ent = dn: cn="dc=example,dc=com",cn=mapping tree,cn=config
cn: dc=example,dc=com
nsslapd-backend: userRoot
nsslapd-state: backend
objectclass: top
objectclass: extensibleObject
objectclass: nsMappingTree



def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
> return f(ent.dn, ent.toTupleList(), *args[2:])

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c3376580>
dn = 'cn="dc=example,dc=com",cn=mapping tree,cn=config'
modlist = [('objectclass', [b'top', b'extensibleObject', b'nsMappingTree']), ('nsslapd-state', [b'backend']), ('cn', [b'dc=example,dc=com']), ('nsslapd-backend', [b'userRoot'])]

def add_s(self,dn,modlist):
> return self.add_ext_s(dn,modlist,None,None)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('cn="dc=example,dc=com",cn=mapping tree,cn=config', [('objectclass', [b'top', b'extensibleObject', b'nsMappingTree']), ('nsslapd-state', [b'backend']), ('cn', [b'dc=example,dc=com']), ('nsslapd-backend', [b'userRoot'])], None, None)
kwargs = {}, ent = 'cn="dc=example,dc=com",cn=mapping tree,cn=config'

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:178:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c3376580>
dn = 'cn="dc=example,dc=com",cn=mapping tree,cn=config'
modlist = [('objectclass', [b'top', b'extensibleObject', b'nsMappingTree']), ('nsslapd-state', [b'backend']), ('cn', [b'dc=example,dc=com']), ('nsslapd-backend', [b'userRoot'])]
serverctrls = None, clientctrls = None

def add_ext_s(self,dn,modlist,serverctrls=None,clientctrls=None):
msgid = self.add_ext(dn,modlist,serverctrls,clientctrls)
> resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (76,), kwargs = {'all': 1, 'timeout': -1}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c3376580>, msgid = 76, all = 1
timeout = -1, resp_ctrl_classes = None

def result3(self,msgid=ldap.RES_ANY,all=1,timeout=None,resp_ctrl_classes=None):
> resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
msgid,all,timeout,
add_ctrls=0,add_intermediates=0,add_extop=0,
resp_ctrl_classes=resp_ctrl_classes
)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (76, 1, -1)
kwargs = {'add_ctrls': 0, 'add_extop': 0, 'add_intermediates': 0, 'resp_ctrl_classes': None}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c3376580>, msgid = 76, all = 1
timeout = -1, add_ctrls = 0, add_intermediates = 0, add_extop = 0
resp_ctrl_classes = None

def result4(self,msgid=ldap.RES_ANY,all=1,timeout=None,add_ctrls=0,add_intermediates=0,add_extop=0,resp_ctrl_classes=None):
if timeout is None:
timeout = self.timeout
> ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<built-in method result4 of LDAP object at 0x7f61c35288d0>, 76, 1, -1, 0, 0, ...)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c3376580>
func = <built-in method result4 of LDAP object at 0x7f61c35288d0>
args = (76, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
result = func(*args,**kwargs)
if __debug__ and self._trace_level>=2:
if func.__name__!="unbind_ext":
diagnostic_message_success = self._l.get_option(ldap.OPT_DIAGNOSTIC_MESSAGE)
finally:
self._ldap_object_lock.release()
except LDAPError as e:
exc_type,exc_value,exc_traceback = sys.exc_info()
try:
if 'info' not in e.args[0] and 'errno' in e.args[0]:
e.args[0]['info'] = strerror(e.args[0]['errno'])
except IndexError:
pass
if __debug__ and self._trace_level>=2:
self._trace_file.write('=> LDAPError - %s: %s\n' % (e.__class__.__name__,str(e)))
try:
> reraise(exc_type, exc_value, exc_traceback)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

exc_type = <class 'ldap.UNWILLING_TO_PERFORM'>
exc_value = UNWILLING_TO_PERFORM({'msgtype': 105, 'msgid': 76, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []})
exc_traceback = <traceback object at 0x7f61c3272080>

def reraise(exc_type, exc_value, exc_traceback):
"""Re-raise an exception given information from sys.exc_info()

Note that unlike six.reraise, this does not support replacing the
traceback. All arguments must come from a single sys.exc_info() call.
"""
# In Python 3, all exception info is contained in one object.
> raise exc_value

/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c3376580>
func = <built-in method result4 of LDAP object at 0x7f61c35288d0>
args = (76, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.UNWILLING_TO_PERFORM: {'msgtype': 105, 'msgid': 76, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: UNWILLING_TO_PERFORM

During handling of the above exception, another exception occurred:

topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2f88340>

@pytest.mark.DS47966
def test_bulk_import_when_the_backend_with_vlv_was_recreated(topology_m2):
"""
Test bulk import when the backend with VLV was recreated.
If the test passes without a server crash, ticket 47966 is verified.

:id: 512963fa-fe02-11e8-b1d3-8c16451d917b
:setup: Replication with two masters.
:steps:
1. Generate vlvSearch entry
2. Generate vlvIndex entry
3. Delete the backend instance on Master 2
4. Delete the agreement, replica, and mapping tree, too.
5. Recreate the backend and the VLV index on Master 2.
6. Recreate vlvSrchDn and vlvIndexDn on Master 2.
:expectedresults:
1. Should succeed.
2. Should succeed.
3. Should succeed.
4. Should succeed.
5. Should succeed.
6. Should succeed.
"""
M1 = topology_m2.ms["master1"]
M2 = topology_m2.ms["master2"]
# generate vlvSearch entry
properties_for_search = {
"objectclass": ["top", "vlvSearch"],
"cn": "vlvSrch",
"vlvbase": DEFAULT_SUFFIX,
"vlvfilter": "(|(objectclass=*)(objectclass=ldapsubentry))",
"vlvscope": "2",
}
vlv_searches = VLVSearch(M2)
userroot_vlvsearch = vlv_searches.create(
basedn="cn=userRoot,cn=ldbm database,cn=plugins,cn=config",
properties=properties_for_search,
)
assert "cn=vlvSrch,cn=userRoot,cn=ldbm database,cn=plugins,cn=config" in M2.getEntry(
"cn=vlvSrch,cn=userRoot,cn=ldbm database,cn=plugins,cn=config").dn
# generate vlvIndex entry
properties_for_index = {
"objectclass": ["top", "vlvIndex"],
"cn": "vlvIdx",
"vlvsort": "cn ou sn",
}
vlv_index = VLVIndex(M2)
userroot_index = vlv_index.create(
basedn="cn=vlvSrch,cn=userRoot,cn=ldbm database,cn=plugins,cn=config",
properties=properties_for_index,
)
assert "cn=vlvIdx,cn=vlvSrch,cn=userRoot,cn=ldbm database,cn=plugins,cn=config" in M2.getEntry(
"cn=vlvIdx,cn=vlvSrch,cn=userRoot,cn=ldbm database,cn=plugins,cn=config").dn
# Delete the backend instance on Master 2.
userroot_index.delete()
userroot_vlvsearch.delete_all()
# delete the agreement, replica, and mapping tree, too.
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.remove_master(M2)
MappingTrees(M2).list()[0].delete()
Backends(M2).list()[0].delete()
# Recreate the backend and the VLV index on Master 2.
> M2.mappingtree.create(DEFAULT_SUFFIX, "userRoot")

suites/vlv/regression_test.py:87:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.mappingTree.MappingTreeLegacy object at 0x7f61c33aad00>
suffix = 'dc=example,dc=com', bename = 'userRoot', parent = None

def create(self, suffix=None, bename=None, parent=None):
'''
Create a mapping tree entry (under "cn=mapping tree,cn=config")
for the 'suffix' that is stored in the 'bename' backend.
The 'bename' backend must exist before creating the mapping tree entry.

If a 'parent' is provided that means that we are creating a
sub-suffix mapping tree.

@param suffix - suffix mapped by this mapping tree entry. It will
be the common name ('cn') of the entry
@param bename - backend common name (e.g. 'userRoot')
@param parent - if provided, the parent suffix of 'suffix'

@return DN of the mapping tree entry

@raise ldap.NO_SUCH_OBJECT - if the backend entry or parent mapping
tree does not exist
ValueError - if a parameter is missing

'''
# Check suffix is provided
if not suffix:
raise ValueError("suffix is mandatory")
else:
nsuffix = normalizeDN(suffix)

# Check backend name is provided
if not bename:
raise ValueError("backend name is mandatory")

# If a parent suffix is provided, check that a mapping
# tree exists for it
if parent:
nparent = normalizeDN(parent)
filt = suffixfilt(parent)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
pass
except NoSuchEntryError:
raise ValueError("parent suffix has no mapping tree")
else:
nparent = ""

# Check if suffix exists, return
filt = suffixfilt(suffix)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
return entry
except ldap.NO_SUCH_OBJECT:
entry = None

#
# Now start the real work
#

# FIXME: update this when we can actually use escaped DNs
dn = ','.join(('cn="%s"' % nsuffix, DN_MAPPING_TREE))
entry = Entry(dn)
entry.update({
'objectclass': ['top', 'extensibleObject', MT_OBJECTCLASS_VALUE],
'nsslapd-state': 'backend',
# the value in the dn has to be DN escaped
# internal code will add the quoted value - unquoted value is
# useful for searching.
MT_PROPNAME_TO_ATTRNAME[MT_SUFFIX]: nsuffix,
MT_PROPNAME_TO_ATTRNAME[MT_BACKEND]: bename
})

# possibly add the parent
if parent:
entry.setValues(MT_PROPNAME_TO_ATTRNAME[MT_PARENT_SUFFIX], nparent)

try:
self.log.debug("Creating entry: %s", entry.dn)
self.log.info("Entry %r", entry)
self.conn.add_s(entry)
except ldap.LDAPError as e:
> raise ldap.LDAPError("Error adding suffix entry " + dn, e)
E ldap.LDAPError: ('Error adding suffix entry cn="dc=example,dc=com",cn=mapping tree,cn=config', UNWILLING_TO_PERFORM({'msgtype': 105, 'msgid': 76, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []}))

/usr/local/lib/python3.8/site-packages/lib389/mappingTree.py:157: LDAPError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 78c65298-0dbf-4d53-984b-524f7cca4636 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect e75dc12f-6352-4b80-be1e-fc0db95634b6 / got description=78c65298-0dbf-4d53-984b-524f7cca4636)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
------------------------------Captured stdout call------------------------------
deleting vlv search: cn=vlvSrch,cn=userRoot,cn=ldbm database,cn=plugins,cn=config
deleting vlv search entry...
-------------------------------Captured log call--------------------------------
INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=example,dc=com",cn=mapping tree,cn=config
cn: dc=example,dc=com
nsslapd-backend: userRoot
nsslapd-state: backend
objectclass: top
objectclass: extensibleObject
objectclass: nsMappingTree
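The legacy M2.mappingtree.create() call rejected here duplicates work that the modern lib389 API does in one step: creating a backend via Backends also creates its mapping tree. A minimal sketch using the Backends interface this test already exercises (an assumed equivalent, not the ticket's verified fix):

from lib389.backend import Backends

def recreate_userroot(inst, suffix):
    """Recreate the userRoot backend; lib389 adds the mapping tree itself."""
    Backends(inst).create(properties={
        'cn': 'userRoot',
        'nsslapd-suffix': suffix,
    })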
Failed tickets/ticket47781_test.py::test_ticket47781 3.66
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c3179ac0>

def test_ticket47781(topology_st):
"""
Testing for a deadlock after doing an online import of an LDIF with
replication data. The replication agreement should be invalid.
"""

log.info('Testing Ticket 47781 - Testing for deadlock after importing LDIF with replication data')

master = topology_st.standalone
repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)

properties = {RA_NAME: r'meTo_$host:$port',
RA_BINDDN: defaultProperties[REPLICATION_BIND_DN],
RA_BINDPW: defaultProperties[REPLICATION_BIND_PW],
RA_METHOD: defaultProperties[REPLICATION_BIND_METHOD],
RA_TRANSPORT_PROT: defaultProperties[REPLICATION_TRANSPORT]}
# The agreement should point to a server that does NOT exist (invalid port)
repl_agreement = master.agreement.create(suffix=DEFAULT_SUFFIX,
host=master.host,
port=5555,
properties=properties)

#
# add two entries
#
log.info('Adding two entries...')

master.add_s(Entry(('cn=entry1,dc=example,dc=com', {
'objectclass': 'top person'.split(),
'sn': 'user',
'cn': 'entry1'})))

master.add_s(Entry(('cn=entry2,dc=example,dc=com', {
'objectclass': 'top person'.split(),
'sn': 'user',
'cn': 'entry2'})))

#
# export the replication ldif
#
log.info('Exporting replication ldif...')
args = {EXPORT_REPL_INFO: True}
exportTask = Tasks(master)
exportTask.exportLDIF(DEFAULT_SUFFIX, None, "/tmp/export.ldif", args)

#
# Restart the server
#
log.info('Restarting server...')
master.stop()
master.start()

#
# Import the ldif
#
log.info('Import replication LDIF file...')
importTask = Tasks(master)
args = {TASK_WAIT: True}
> importTask.importLDIF(DEFAULT_SUFFIX, None, "/tmp/export.ldif", args)

tickets/ticket47781_test.py:85:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.tasks.Tasks object at 0x7f61c2e50a60>
suffix = 'dc=example,dc=com', benamebase = None, input_file = '/tmp/export.ldif'
args = {'wait': True}

def importLDIF(self, suffix=None, benamebase=None, input_file=None,
args=None):
'''
Import, in LDIF format, a given 'suffix' (or 'benamebase' that stores
that suffix). It uses an internal task to achieve this request.

If 'suffix' and 'benamebase' are specified, it uses 'benamebase' first,
else 'suffix'.
If both 'suffix' and 'benamebase' are missing it raises ValueError.

'input_file' is the ldif input file

@param suffix - suffix of the backend
@param benamebase - 'commonname'/'cn' of the backend (e.g. 'userRoot')
@param input_file - file that contains the entries in LDIF format
to import
@param args - a dictionary that contains modifiers of the import task
wait: True/[False] - If True, the import waits for the completion
of the task before returning

@return None

@raise ValueError

'''
if self.conn.state != DIRSRV_STATE_ONLINE:
raise ValueError("Invalid Server State %s! Must be online" % self.conn.state)

# Checking the parameters
if not benamebase and not suffix:
raise ValueError("Specify either bename or suffix")

if not input_file:
raise ValueError("input_file is mandatory")

if not os.path.exists(input_file):
> raise ValueError("Import file (%s) does not exist" % input_file)
E ValueError: Import file (/tmp/export.ldif) does not exist

/usr/local/lib/python3.8/site-packages/lib389/tasks.py:473: ValueError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:567 Export task export_10302020_234840 for file /tmp/export.ldif completed successfully
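Note on this failure: the export task reported success, yet after the restart the import could not find /tmp/export.ldif. A likely (unconfirmed from this report alone) cause is systemd's PrivateTmp: the dirsrv unit sees a private /tmp that the test process does not share and that is discarded when the unit restarts. A minimal sketch of a more robust variant, writing to the instance's own LDIF directory via get_ldif_dir() (as ticket48005_test below does); the constants' import path is an assumption:

import os
from lib389.tasks import Tasks
from lib389.properties import EXPORT_REPL_INFO, TASK_WAIT  # import path assumed

def export_import_with_restart(master, suffix):
    # The instance's LDIF directory is visible to both the server and the
    # test, and survives a service restart (unlike a PrivateTmp /tmp).
    ldif_file = os.path.join(master.get_ldif_dir(), "export.ldif")
    Tasks(master).exportLDIF(suffix, None, ldif_file,
                             {EXPORT_REPL_INFO: True, TASK_WAIT: True})
    assert os.path.exists(ldif_file)  # fail early if the export never landed
    master.restart()
    Tasks(master).importLDIF(suffix, None, ldif_file, {TASK_WAIT: True})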
Failed tickets/ticket47988_test.py::test_ticket47988_init 6.85
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_init(topology_m2):
"""
It adds
- an objectclass with MAY 'member'
- an entry ('bind_entry') with which we bind to test the 'SELFDN' operation
It deletes the anonymous aci

"""

_header(topology_m2, 'test_ticket47988_init')

# enable acl error logging
mod = [(ldap.MOD_REPLACE, 'nsslapd-errorlog-level', ensure_bytes(str(8192)))] # REPL
topology_m2.ms["master1"].modify_s(DN_CONFIG, mod)
topology_m2.ms["master2"].modify_s(DN_CONFIG, mod)

mod = [(ldap.MOD_REPLACE, 'nsslapd-accesslog-level', ensure_bytes(str(260)))] # Internal op
topology_m2.ms["master1"].modify_s(DN_CONFIG, mod)
topology_m2.ms["master2"].modify_s(DN_CONFIG, mod)

# add dummy entries
for cpt in range(MAX_OTHERS):
name = "%s%d" % (OTHER_NAME, cpt)
topology_m2.ms["master1"].add_s(Entry(("cn=%s,%s" % (name, SUFFIX), {
'objectclass': "top person".split(),
'sn': name,
'cn': name})))

# check that entry 0 is replicated before
loop = 0
entryDN = "cn=%s0,%s" % (OTHER_NAME, SUFFIX)
while loop <= 10:
try:
ent = topology_m2.ms["master2"].getEntry(entryDN, ldap.SCOPE_BASE, "(objectclass=*)", ['telephonenumber'])
break
except ldap.NO_SUCH_OBJECT:
time.sleep(1)
loop += 1
assert (loop <= 10)

topology_m2.ms["master1"].stop(timeout=10)
topology_m2.ms["master2"].stop(timeout=10)

# install the specific schema M1: ipa3.3, M2: ipa4.1
schema_file = os.path.join(topology_m2.ms["master1"].getDir(__file__, DATA_DIR), "ticket47988/schema_ipa3.3.tar.gz")
_install_schema(topology_m2.ms["master1"], schema_file)
schema_file = os.path.join(topology_m2.ms["master1"].getDir(__file__, DATA_DIR), "ticket47988/schema_ipa4.1.tar.gz")
_install_schema(topology_m2.ms["master2"], schema_file)

> topology_m2.ms["master1"].start(timeout=10)

/export/tests/tickets/ticket47988_test.py:157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:1079: in start
subprocess.check_output(["systemctl", "start", "dirsrv@%s" % self.serverid], stderr=subprocess.STDOUT)
/usr/lib64/python3.8/subprocess.py:411: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = None, check = True
popenargs = (['systemctl', 'start', 'dirsrv@master1'],)
kwargs = {'stderr': -2, 'stdout': -1}
process = <subprocess.Popen object at 0x7f61c2936520>
stdout = b'Job for dirsrv@master1.service failed because the control process exited with error code.\nSee "systemctl status dirsrv@master1.service" and "journalctl -xe" for details.\n'
stderr = None, retcode = 1

def run(*popenargs,
input=None, capture_output=False, timeout=None, check=False, **kwargs):
"""Run command with arguments and return a CompletedProcess instance.

The returned instance will have attributes args, returncode, stdout and
stderr. By default, stdout and stderr are not captured, and those attributes
will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

If check is True and the exit code was non-zero, it raises a
CalledProcessError. The CalledProcessError object will have the return code
in the returncode attribute, and output & stderr attributes if those streams
were captured.

If timeout is given, and the process takes too long, a TimeoutExpired
exception will be raised.

There is an optional argument "input", allowing you to
pass bytes or a string to the subprocess's stdin. If you use this argument
you may not also use the Popen constructor's "stdin" argument, as
it will be used internally.

By default, all communication is in bytes, and therefore any "input" should
be bytes, and the stdout and stderr will be bytes. If in text mode, any
"input" should be a string, and stdout and stderr will be strings decoded
according to locale encoding, or by "encoding" if set. Text mode is
triggered by setting any of text, encoding, errors or universal_newlines.

The other arguments are the same as for the Popen constructor.
"""
if input is not None:
if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE

if capture_output:
if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
kwargs['stderr'] = PIPE

with Popen(*popenargs, **kwargs) as process:
try:
stdout, stderr = process.communicate(input, timeout=timeout)
except TimeoutExpired as exc:
process.kill()
if _mswindows:
# Windows accumulates the output in a single blocking
# read() call run on child threads, with the timeout
# being done in a join() on those threads. communicate()
# _after_ kill() is required to collect that and add it
# to the exception.
exc.stdout, exc.stderr = process.communicate()
else:
# POSIX _communicate already populated the output so
# far into the TimeoutExpired exception.
process.wait()
raise
except: # Including KeyboardInterrupt, communicate handled that.
process.kill()
# We don't call process.wait() as .__exit__ does that for us.
raise
retcode = process.poll()
if check and retcode:
> raise CalledProcessError(retcode, process.args,
output=stdout, stderr=stderr)
E subprocess.CalledProcessError: Command '['systemctl', 'start', 'dirsrv@master1']' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:512: CalledProcessError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e48ab3df-de91-4756-bac3-704e8058a247 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 70a666e1-1277-440f-9a6c-eeaf77a1f458 / got description=e48ab3df-de91-4756-bac3-704e8058a247) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_init INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ################################################### INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/02common.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/50ns-admin.ldif INFO  lib389:ticket47988_test.py:98 replace /etc/dirsrv/slapd-master1/schema/99user.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60nss-ldap.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60autofs.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/50ns-web.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60samba.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/10dna-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/05rfc4523.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60basev2.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/10automember-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/05rfc2927.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/10mep-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60ipadns.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/10rfc2307.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/50ns-mail.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/05rfc4524.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60trust.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60ipaconfig.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/50ns-directory.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60eduperson.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60mozilla.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/65ipasudo.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60rfc3712.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60rfc2739.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/50ns-value.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60acctpolicy.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/01core389.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60sabayon.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60pam-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/00core.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/25java-object.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60sudo.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/70ipaotp.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60pureftpd.ldif INFO  lib389:ticket47988_test.py:102 add 
/etc/dirsrv/slapd-master1/schema/61kerberos-ipav3.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60kerberos.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60basev3.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/06inetorgperson.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/30ns-common.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/28pilot.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/20subscriber.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/50ns-certificate.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master1/schema/60posix-winsync-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/02common.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/50ns-admin.ldif INFO  lib389:ticket47988_test.py:98 replace /etc/dirsrv/slapd-master2/schema/99user.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60nss-ldap.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60autofs.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/50ns-web.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60samba.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/10dna-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/05rfc4523.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60basev2.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/10automember-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/05rfc2927.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/10mep-plugin.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60ipadns.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/10rfc2307.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/50ns-mail.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/05rfc4524.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60trust.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60ipaconfig.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/50ns-directory.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60eduperson.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60mozilla.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/65ipasudo.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60rfc3712.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60rfc2739.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/50ns-value.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60acctpolicy.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/01core389.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60sabayon.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60pam-plugin.ldif INFO  lib389:ticket47988_test.py:102 add 
/etc/dirsrv/slapd-master2/schema/00core.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/25java-object.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60sudo.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/70ipaotp.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60pureftpd.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/61kerberos-ipav3.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60kerberos.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60basev3.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/06inetorgperson.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/30ns-common.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/28pilot.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/20subscriber.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/50ns-certificate.ldif INFO  lib389:ticket47988_test.py:102 add /etc/dirsrv/slapd-master2/schema/60posix-winsync-plugin.ldif
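Note: master1 failed to come back up after the IPA schema tarballs were installed, and check_output() buried the systemd hint inside the exception. A small sketch (hypothetical helper, not part of lib389) that surfaces the captured output and the journal when a start fails:

import subprocess

def start_instance_verbose(serverid):
    try:
        subprocess.check_output(["systemctl", "start", "dirsrv@%s" % serverid],
                                stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as e:
        # check_output captured stdout+stderr, so the "Job for dirsrv@...
        # failed" text is available as bytes in e.output.
        print(e.output.decode("utf-8", errors="replace"))
        # journalctl usually holds the real reason, e.g. a schema file the
        # server refuses to parse.
        print(subprocess.run(["journalctl", "-u", "dirsrv@%s" % serverid,
                              "-n", "50", "--no-pager"],
                             capture_output=True, text=True).stdout)
        raise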
Failed tickets/ticket47988_test.py::test_ticket47988_1 0.00
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_1(topology_m2):
'''
Check that replication is working and pause replication M2->M1
'''
_header(topology_m2, 'test_ticket47988_1')

topology_m2.ms["master1"].log.debug("\n\nCheck that replication is working and pause replication M2->M1\n")
> _do_update_entry(supplier=topology_m2.ms["master2"], consumer=topology_m2.ms["master1"], attempts=5)

/export/tests/tickets/ticket47988_test.py:234:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket47988_test.py:184: in _do_update_entry
supplier.modify_s(entryDN, mod)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640: in modify_s
return self.modify_ext_s(dn,modlist,None,None)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2939460>
func = <built-in method result4 of LDAP object at 0x7f61c28d8f30>
args = (26, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_1 INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ###################################################
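Note: test_ticket47988_1 through test_ticket47988_6 all fail with ldap.SERVER_DOWN because master1 never restarted in test_ticket47988_init; these are cascading failures, not independent ones. A hedged sketch of an autouse guard fixture that would skip the dependent tests instead, assuming lib389's DirSrv.status() reports whether the instance is running:

import pytest

@pytest.fixture(autouse=True)
def require_masters_up(topology_m2):
    # Skip rather than fail when a previous test left an instance down.
    for name, inst in topology_m2.ms.items():
        if not inst.status():
            pytest.skip("instance %s is down; skipping dependent test" % name)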
Failed tickets/ticket47988_test.py::test_ticket47988_2 0.00
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_2(topology_m2):
'''
Update M1 schema and trigger update M1->M2
So M1 should learn new/extended definitions that are in M2 schema
'''
_header(topology_m2, 'test_ticket47988_2')

topology_m2.ms["master1"].log.debug("\n\nUpdate M1 schema and an entry on M1\n")
> master1_schema_csn = topology_m2.ms["master1"].schema.get_schema_csn()

/export/tests/tickets/ticket47988_test.py:246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/schema.py:604: in get_schema_csn
ents = self.conn.search_s(DN_SCHEMA, ldap.SCOPE_BASE,
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:870: in search_s
return self.search_ext_s(base,scope,filterstr,attrlist,attrsonly,None,None,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:864: in search_ext_s
return self.result(msgid,all=1,timeout=timeout)[1]
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:148: in inner
objtype, data = f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:756: in result
resp_type, resp_data, resp_msgid = self.result2(msgid,all,timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:760: in result2
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all,timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c297d4f0>
func = <built-in method result4 of LDAP object at 0x7f61c290f1b0>
args = (62, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_2 INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ###################################################
Failed tickets/ticket47988_test.py::test_ticket47988_3 0.01
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_3(topology_m2):
'''
Resume replication M2->M1 and check replication is still working
'''
_header(topology_m2, 'test_ticket47988_3')

> _resume_M2_to_M1(topology_m2)

/export/tests/tickets/ticket47988_test.py:283:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket47988_test.py:222: in _resume_M2_to_M1
ents = topology_m2.ms["master2"].agreement.list(suffix=SUFFIX)
/usr/local/lib/python3.8/site-packages/lib389/agreement.py:905: in list
replica_entries = self.conn.replica.list(suffix)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:178: in list
ents = self.conn.search_s(base, ldap.SCOPE_SUBTREE, filtr)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:870: in search_s
return self.search_ext_s(base,scope,filterstr,attrlist,attrsonly,None,None,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:863: in search_ext_s
msgid = self.search_ext(base,scope,filterstr,attrlist,attrsonly,serverctrls,clientctrls,timeout,sizelimit)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:853: in search_ext
return self._ldap_call(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2939460>
func = <built-in method search_ext of LDAP object at 0x7f61c28d8f30>
args = ('cn=mapping tree,cn=config', 2, '(&(objectclass=nsds5Replica)(nsDS5ReplicaRoot=dc=example,dc=com))', None, 0, None, ...)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_3 INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ################################################### INFO  lib389:ticket47988_test.py:221 ######################### resume RA M2->M1 ######################
Failed tickets/ticket47988_test.py::test_ticket47988_4 0.01
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_4(topology_m2):
'''
Check schemaCSN is identical on both servers
and save the nsSchemaCSN to later check it does not change unexpectedly
'''
_header(topology_m2, 'test_ticket47988_4')

> master1_schema_csn = topology_m2.ms["master1"].schema.get_schema_csn()

/export/tests/tickets/ticket47988_test.py:295:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/schema.py:604: in get_schema_csn
ents = self.conn.search_s(DN_SCHEMA, ldap.SCOPE_BASE,
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:870: in search_s
return self.search_ext_s(base,scope,filterstr,attrlist,attrsonly,None,None,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:863: in search_ext_s
msgid = self.search_ext(base,scope,filterstr,attrlist,attrsonly,serverctrls,clientctrls,timeout,sizelimit)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:853: in search_ext
return self._ldap_call(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c297d4f0>
func = <built-in method search_ext of LDAP object at 0x7f61c290f1b0>
args = ('cn=schema', 0, 'objectclass=*', ['nsSchemaCSN'], 0, None, ...)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_4 INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ###################################################
Failed tickets/ticket47988_test.py::test_ticket47988_5 0.00
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_5(topology_m2):
'''
Check schemaCSN does not change unexpectedly
'''
_header(topology_m2, 'test_ticket47988_5')

> _do_update_entry(supplier=topology_m2.ms["master1"], consumer=topology_m2.ms["master2"], attempts=5)

/export/tests/tickets/ticket47988_test.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket47988_test.py:184: in _do_update_entry
supplier.modify_s(entryDN, mod)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640: in modify_s
return self.modify_ext_s(dn,modlist,None,None)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:612: in modify_ext_s
msgid = self.modify_ext(dn,modlist,serverctrls,clientctrls)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:609: in modify_ext
return self._ldap_call(self._l.modify_ext,dn,modlist,RequestControlTuples(serverctrls),RequestControlTuples(clientctrls))
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c297d4f0>
func = <built-in method modify_ext of LDAP object at 0x7f61c290f1b0>
args = ('cn=other_entry0,dc=example,dc=com', [(2, 'telephonenumber', b'178')], None, None)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_5 INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ###################################################
Failed tickets/ticket47988_test.py::test_ticket47988_6 0.00
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c297d550>

def test_ticket47988_6(topology_m2):
'''
Update M1 schema and trigger update M2->M1
So M2 should learn new/extended definitions that are in M1 schema
'''

_header(topology_m2, 'test_ticket47988_6')

topology_m2.ms["master1"].log.debug("\n\nUpdate M1 schema and an entry on M1\n")
> master1_schema_csn = topology_m2.ms["master1"].schema.get_schema_csn()

/export/tests/tickets/ticket47988_test.py:336:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/schema.py:604: in get_schema_csn
ents = self.conn.search_s(DN_SCHEMA, ldap.SCOPE_BASE,
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:870: in search_s
return self.search_ext_s(base,scope,filterstr,attrlist,attrsonly,None,None,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:863: in search_ext_s
msgid = self.search_ext(base,scope,filterstr,attrlist,attrsonly,serverctrls,clientctrls,timeout,sizelimit)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:853: in search_ext
return self._ldap_call(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c297d4f0>
func = <built-in method search_ext of LDAP object at 0x7f61c290f1b0>
args = ('cn=schema', 0, 'objectclass=*', ['nsSchemaCSN'], 0, None, ...)
kwargs = {}, diagnostic_message_success = None, exc_type = None
exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47988_test.py:64 ############################################### INFO  lib389:ticket47988_test.py:65 ####### INFO  lib389:ticket47988_test.py:66 ####### test_ticket47988_6 INFO  lib389:ticket47988_test.py:67 ####### INFO  lib389:ticket47988_test.py:68 ###################################################
Failed tickets/ticket48005_test.py::test_ticket48005_setup 4.71
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c285ed60>

def test_ticket48005_setup(topology_st):
'''
allow core dumps
generate a test LDIF file using dbgen.pl
import the LDIF
'''
log.info("Ticket 48005 setup...")
if hasattr(topology_st.standalone, 'prefix'):
prefix = topology_st.standalone.prefix
else:
prefix = None
sysconfig_dirsrv = os.path.join(topology_st.standalone.get_initconfig_dir(), 'dirsrv')
cmdline = 'egrep "ulimit -c unlimited" %s' % sysconfig_dirsrv
p = os.popen(cmdline, "r")
ulimitc = p.readline()
if ulimitc == "":
log.info('No ulimit -c in %s' % sysconfig_dirsrv)
log.info('Adding it')
cmdline = 'echo "ulimit -c unlimited" >> %s' % sysconfig_dirsrv

sysconfig_dirsrv_systemd = sysconfig_dirsrv + ".systemd"
cmdline = 'egrep LimitCORE=infinity %s' % sysconfig_dirsrv_systemd
p = os.popen(cmdline, "r")
lcore = p.readline()
if lcore == "":
log.info('No LimitCORE in %s' % sysconfig_dirsrv_systemd)
log.info('Adding it')
cmdline = 'echo LimitCORE=infinity >> %s' % sysconfig_dirsrv_systemd

topology_st.standalone.restart(timeout=10)

ldif_file = topology_st.standalone.get_ldif_dir() + "/ticket48005.ldif"
os.system('ls %s' % ldif_file)
os.system('rm -f %s' % ldif_file)
if hasattr(topology_st.standalone, 'prefix'):
prefix = topology_st.standalone.prefix
else:
prefix = ""
dbgen_prog = prefix + '/bin/dbgen.pl'
log.info('dbgen_prog: %s' % dbgen_prog)
os.system('%s -s %s -o %s -u -n 10000' % (dbgen_prog, SUFFIX, ldif_file))
cmdline = 'egrep dn: %s | wc -l' % ldif_file
p = os.popen(cmdline, "r")
dnnumstr = p.readline()
num = int(dnnumstr)
log.info("We have %d entries.\n", num)

importTask = Tasks(topology_st.standalone)
args = {TASK_WAIT: True}
> importTask.importLDIF(SUFFIX, None, ldif_file, args)

/export/tests/tickets/ticket48005_test.py:74:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.tasks.Tasks object at 0x7f61c28617c0>
suffix = 'dc=example,dc=com', benamebase = None
input_file = '/var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif'
args = {'wait': True}

def importLDIF(self, suffix=None, benamebase=None, input_file=None,
args=None):
'''
Import, in LDIF format, a given 'suffix' (or 'benamebase' that stores
that suffix). It uses an internal task to achieve this request.

If 'suffix' and 'benamebase' are specified, it uses 'benamebase' first,
else 'suffix'.
If both 'suffix' and 'benamebase' are missing it raises ValueError.

'input_file' is the ldif input file

@param suffix - suffix of the backend
@param benamebase - 'commonname'/'cn' of the backend (e.g. 'userRoot')
@param input_file - file that contains the entries in LDIF format
to import
@param args - a dictionary that contains modifiers of the import task
wait: True/[False] - If True, the import waits for the completion
of the task before returning

@return None

@raise ValueError

'''
if self.conn.state != DIRSRV_STATE_ONLINE:
raise ValueError("Invalid Server State %s! Must be online" % self.conn.state)

# Checking the parameters
if not benamebase and not suffix:
raise ValueError("Specify either bename or suffix")

if not input_file:
raise ValueError("input_file is mandatory")

if not os.path.exists(input_file):
> raise ValueError("Import file (%s) does not exist" % input_file)
E ValueError: Import file (/var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif) does not exist

/usr/local/lib/python3.8/site-packages/lib389/tasks.py:473: ValueError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
grep: /etc/sysconfig/dirsrv: No such file or directory grep: /etc/sysconfig/dirsrv.systemd: No such file or directory ls: cannot access '/var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif': No such file or directory sh: /bin/dbgen.pl: No such file or directory grep: /var/lib/dirsrv/slapd-standalone1/ldif/ticket48005.ldif: No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:31 Ticket 48005 setup... INFO  tests.tickets.ticket48005_test:ticket48005_test.py:41 No ulimit -c in /etc/sysconfig/dirsrv INFO  tests.tickets.ticket48005_test:ticket48005_test.py:42 Adding it INFO  tests.tickets.ticket48005_test:ticket48005_test.py:50 No LimitCORE in /etc/sysconfig/dirsrv.systemd INFO  tests.tickets.ticket48005_test:ticket48005_test.py:51 Adding it INFO  tests.tickets.ticket48005_test:ticket48005_test.py:64 dbgen_prog: /bin/dbgen.pl INFO  tests.tickets.ticket48005_test:ticket48005_test.py:70 We have 0 entries.
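Note: the captured stderr shows /bin/dbgen.pl does not exist, so no LDIF was ever written; the test still counted "0 entries" and went on to the import, which then failed on the missing file. A sketch (hypothetical helper) that fails fast instead of piping everything through a shell:

import os
import pytest

def generate_ldif_or_skip(dbgen_prog, suffix, ldif_file, entries=10000):
    if not os.path.exists(dbgen_prog):
        pytest.skip("%s is not installed" % dbgen_prog)
    os.system('%s -s %s -o %s -u -n %d' % (dbgen_prog, suffix, ldif_file, entries))
    # Count 'dn:' lines directly rather than shelling out to egrep | wc -l.
    with open(ldif_file) as f:
        num = sum(1 for line in f if line.startswith('dn:'))
    assert num > 0, "dbgen produced an empty LDIF"
    return num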
Failed tickets/ticket48013_test.py::test_ticket48013 1.75
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c28d2760>

def test_ticket48013(topology_st):
'''
Content Synchronization: Test that invalid cookies are caught
'''

cookies = ('#', '##', 'a#a#a', 'a#a#1')

# Enable dynamic plugins
try:
topology_st.standalone.modify_s(DN_CONFIG, [(ldap.MOD_REPLACE, 'nsslapd-dynamic-plugins', b'on')])
except ldap.LDAPError as e:
log.error('Failed to enable dynamic plugin! {}'.format(e.args[0]['desc']))
assert False

# Enable retro changelog
topology_st.standalone.plugins.enable(name=PLUGIN_RETRO_CHANGELOG)

# Enable the content sync plugin
> topology_st.standalone.plugins.enable(name=PLUGIN_REPL_SYNC)

/export/tests/tickets/ticket48013_test.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:2105: in enable
plugin.enable()
/usr/local/lib/python3.8/site-packages/lib389/plugins.py:58: in enable
self.set('nsslapd-pluginEnabled', 'on')
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c28d2730>
func = <built-in method result4 of LDAP object at 0x7f61c28c3300>
args = (7, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.SERVER_DOWN: {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: SERVER_DOWN
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
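Note: the test never reached its cookie checks; it died enabling the content sync plugin because the instance was already down. For reference, a hedged sketch of what "invalid cookies are caught" would exercise, assuming python-ldap's ldap.syncrepl.SyncRequestControl is available with this signature:

import ldap
from ldap.syncrepl import SyncRequestControl  # availability assumed

def probe_invalid_cookie(conn, basedn, cookie):
    # Send a content-sync search with a malformed cookie; a clean LDAP
    # error response (rather than a crash) is the expected outcome.
    ctrl = SyncRequestControl(cookie=cookie, mode='refreshOnly')
    try:
        conn.search_ext_s(basedn, ldap.SCOPE_SUBTREE, '(objectclass=*)',
                          serverctrls=[ctrl])
    except ldap.LDAPError as e:
        return e
    return None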
Failed tickets/ticket48194_test.py::test_run_1 7.36
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>

def test_run_1(topology_st):
"""
Check nsSSL3Ciphers: +all
All ciphers are enabled except null.
Note: default allowWeakCipher (i.e., off) for +all
"""
_header(topology_st, 'Test Case 2 - Check the ciphers availability for "+all" with default allowWeakCiphers')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(CONFIG_DN, [(ldap.MOD_REPLACE, 'nsslapd-errorlog-level', b'64')])
# Make sure allowWeakCipher is not set.
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_DELETE, 'allowWeakCipher', None)])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_0' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

/export/tests/tickets/ticket48194_test.py:158:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

/export/tests/tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 2 - Check the ciphers availability for "+all" with default allowWeakCiphers INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:151 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
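Note: the "New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256" line explains this assertion: with OpenSSL 1.1.1+, s_client's -cipher option restricts only TLS 1.2-and-older suites, while TLS 1.3 suites are governed by the separate -ciphersuites option, so the handshake succeeded regardless of the 3DES restriction. A minimal sketch of a command line that makes the cipher restriction effective:

def openssl_cmdline(host_port, cipher):
    # -tls1_2 pins the protocol to one where -cipher is actually honoured;
    # otherwise a TLS 1.3 suite can be negotiated no matter what -cipher says.
    return ['/usr/bin/openssl', 's_client', '-connect', host_port,
            '-tls1_2', '-cipher', cipher]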
Failed tickets/ticket48194_test.py::test_run_2 6.82
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>

def test_run_2(topology_st):
"""
Check nsSSL3Ciphers: +rsa_aes_128_sha,+rsa_aes_256_sha
rsa_aes_128_sha, tls_rsa_aes_128_sha, rsa_aes_256_sha, tls_rsa_aes_256_sha are enabled.
default allowWeakCipher
"""
_header(topology_st,
'Test Case 3 - Check the ciphers availability for "+rsa_aes_128_sha,+rsa_aes_256_sha" with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN,
[(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'+rsa_aes_128_sha,+rsa_aes_256_sha')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_1' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)
connectWithOpenssl(topology_st, 'AES256-SHA256', False)
> connectWithOpenssl(topology_st, 'AES128-SHA', True)

/export/tests/tickets/ticket48194_test.py:184:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>
cipher = 'AES128-SHA', expect = True

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
> assert False
E assert False

/export/tests/tickets/ticket48194_test.py:108: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 3 - Check the ciphers availability for "+rsa_aes_128_sha,+rsa_aes_256_sha" with default allowWeakCipher INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:175 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect to handshake failed INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES128-SHA -- expect to handshake successfully INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES128-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
Failed tickets/ticket48194_test.py::test_run_4 7.25
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>

def test_run_4(topology_st):
"""
Check no nsSSL3Ciphers
Default ciphers are enabled.
default allowWeakCipher
"""
_header(topology_st, 'Test Case 5 - Check no nsSSL3Ciphers (-all) with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_DELETE, 'nsSSL3Ciphers', b'-all')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_3' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

/export/tests/tickets/ticket48194_test.py:228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

/export/tests/tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 5 - Check no nsSSL3Ciphers (-all) with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:221 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
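Note the negotiated suite in the log above: TLS_AES_128_GCM_SHA256 over TLSv1.3. Since OpenSSL 1.1.1, s_client's -cipher option only constrains the TLSv1.2-and-below cipher list, while TLSv1.3 ciphersuites are negotiated regardless, so an "expect refusal" probe can never fail the handshake once the server speaks TLS 1.3. The identical failures in test_run_5, test_run_6 and test_run_8 below fit the same pattern. If the intent is to probe exactly one legacy cipher, the handshake has to be pinned below TLS 1.3, e.g. (a sketch of one possible adjustment, not the test's actual fix):

    # Restrict the probe to TLSv1.2 so that -cipher really governs the
    # negotiation; passing an empty -ciphersuites list instead would disable
    # all TLSv1.3 suites while still allowing a TLS 1.3 attempt.
    cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl,
               '-tls1_2', '-cipher', cipher]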
Failed tickets/ticket48194_test.py::test_run_5 7.10
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>

def test_run_5(topology_st):
"""
Check nsSSL3Ciphers: default
Default ciphers are enabled.
default allowWeakCipher
"""
_header(topology_st, 'Test Case 6 - Check default nsSSL3Ciphers (default setting) with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'default')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_4' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

/export/tests/tickets/ticket48194_test.py:250:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

/export/tests/tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 6 - Check default nsSSL3Ciphers (default setting) with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:243 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48194_test.py::test_run_6 7.16
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>

def test_run_6(topology_st):
"""
Check nsSSL3Ciphers: +all,-TLS_RSA_WITH_AES_256_CBC_SHA256
All ciphers are disabled.
default allowWeakCipher
"""
_header(topology_st,
'Test Case 7 - Check nsSSL3Ciphers: +all,-TLS_RSA_WITH_AES_256_CBC_SHA256 with default allowWeakCipher')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN,
[(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'+all,-TLS_RSA_WITH_AES_256_CBC_SHA256')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_5' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

/export/tests/tickets/ticket48194_test.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

/export/tests/tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 7 - Check nsSSL3Ciphers: +all,-TLS_RSA_WITH_AES_256_CBC_SHA256 with default allowWeakCipher
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:267 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48194_test.py::test_run_8 7.50
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>

def test_run_8(topology_st):
"""
Check nsSSL3Ciphers: default + allowWeakCipher: off
Strong Default ciphers are enabled.
"""
_header(topology_st, 'Test Case 9 - Check default nsSSL3Ciphers (default setting + allowWeakCipher: off)')

topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(ENCRYPTION_DN, [(ldap.MOD_REPLACE, 'nsSSL3Ciphers', b'default'),
(ldap.MOD_REPLACE, 'allowWeakCipher', b'off')])

log.info("\n######################### Restarting the server ######################\n")
topology_st.standalone.stop(timeout=10)
os.system('mv %s %s.48194_7' % (topology_st.standalone.errlog, topology_st.standalone.errlog))
os.system('touch %s' % (topology_st.standalone.errlog))
time.sleep(2)
topology_st.standalone.start(timeout=120)

> connectWithOpenssl(topology_st, 'DES-CBC3-SHA', False)

/export/tests/tickets/ticket48194_test.py:297:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2953a30>
cipher = 'DES-CBC3-SHA', expect = False

def connectWithOpenssl(topology_st, cipher, expect):
"""
Connect with the given cipher
Condition:
If expect is True, the handshake should be successful.
If expect is False, the handshake should be refused with
access log: "Cannot communicate securely with peer:
no common encryption algorithm(s)."
"""
log.info("Testing %s -- expect to handshake %s", cipher, "successfully" if expect else "failed")

myurl = 'localhost:%s' % LDAPSPORT
cmdline = ['/usr/bin/openssl', 's_client', '-connect', myurl, '-cipher', cipher]

strcmdline = " ".join(cmdline)
log.info("Running cmdline: %s", strcmdline)

try:
proc = subprocess.Popen(cmdline, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
except ValueError:
log.info("%s failed: %s", cmdline, ValueError)
proc.kill()

while True:
l = proc.stdout.readline()
if l == b"":
break
if b'Cipher is' in l:
log.info("Found: %s", l)
if expect:
if b'(NONE)' in l:
assert False
else:
proc.stdin.close()
assert True
else:
if b'(NONE)' in l:
assert True
else:
proc.stdin.close()
> assert False
E assert False

/export/tests/tickets/ticket48194_test.py:117: AssertionError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ###############################################
INFO  lib389:ticket48194_test.py:41 ####### Test Case 9 - Check default nsSSL3Ciphers (default setting + allowWeakCipher: off)
INFO  lib389:ticket48194_test.py:42 ###############################################
INFO  lib389.utils:ticket48194_test.py:290 ######################### Restarting the server ######################
INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake failed
INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA
INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Failed tickets/ticket48228_test.py::test_ticket48228_test_global_policy 1.40
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2651040>
user = 'uid=user1,dc=example,dc=com', passwd = 'password', times = 6

def update_passwd(topology_st, user, passwd, times):
# Set the default value
cpw = passwd
for i in range(times):
log.info(" Bind as {%s,%s}" % (user, cpw))
topology_st.standalone.simple_bind_s(user, cpw)
# Now update the value for this iter.
cpw = 'password%d' % i
try:
> topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', cpw.encode())])

/export/tests/tickets/ticket48228_test.py:136:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=user1,dc=example,dc=com', [(2, 'userpassword', b'password0')])
kwargs = {}
c_stack = [FrameInfo(frame=<frame at 0x7f61c2d2dc40, file '/usr/local/lib/python3.8/site-packages/lib389/__init__.py', line 180,...mbda>', code_context=[' self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(\n'], index=0), ...]
frame = FrameInfo(frame=<frame at 0x5576b8b186a0, file '/export/tests/tickets/ticket48228_test.py', line 141, code update_pass...t=[" topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', cpw.encode())])\n"], index=0)

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c264afd0>
dn = 'uid=user1,dc=example,dc=com'
modlist = [(2, 'userpassword', b'password0')]

def modify_s(self,dn,modlist):
> return self.modify_ext_s(dn,modlist,None,None)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=user1,dc=example,dc=com', [(2, 'userpassword', b'password0')], None, None)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c264afd0>
dn = 'uid=user1,dc=example,dc=com'
modlist = [(2, 'userpassword', b'password0')], serverctrls = None
clientctrls = None

def modify_ext_s(self,dn,modlist,serverctrls=None,clientctrls=None):
msgid = self.modify_ext(dn,modlist,serverctrls,clientctrls)
> resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (10,), kwargs = {'all': 1, 'timeout': -1}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c264afd0>, msgid = 10, all = 1
timeout = -1, resp_ctrl_classes = None

def result3(self,msgid=ldap.RES_ANY,all=1,timeout=None,resp_ctrl_classes=None):
> resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
msgid,all,timeout,
add_ctrls=0,add_intermediates=0,add_extop=0,
resp_ctrl_classes=resp_ctrl_classes
)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (10, 1, -1)
kwargs = {'add_ctrls': 0, 'add_extop': 0, 'add_intermediates': 0, 'resp_ctrl_classes': None}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c264afd0>, msgid = 10, all = 1
timeout = -1, add_ctrls = 0, add_intermediates = 0, add_extop = 0
resp_ctrl_classes = None

def result4(self,msgid=ldap.RES_ANY,all=1,timeout=None,add_ctrls=0,add_intermediates=0,add_extop=0,resp_ctrl_classes=None):
if timeout is None:
timeout = self.timeout
> ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<built-in method result4 of LDAP object at 0x7f61c24e9bd0>, 10, 1, -1, 0, 0, ...)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c264afd0>
func = <built-in method result4 of LDAP object at 0x7f61c24e9bd0>
args = (10, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
result = func(*args,**kwargs)
if __debug__ and self._trace_level>=2:
if func.__name__!="unbind_ext":
diagnostic_message_success = self._l.get_option(ldap.OPT_DIAGNOSTIC_MESSAGE)
finally:
self._ldap_object_lock.release()
except LDAPError as e:
exc_type,exc_value,exc_traceback = sys.exc_info()
try:
if 'info' not in e.args[0] and 'errno' in e.args[0]:
e.args[0]['info'] = strerror(e.args[0]['errno'])
except IndexError:
pass
if __debug__ and self._trace_level>=2:
self._trace_file.write('=> LDAPError - %s: %s\n' % (e.__class__.__name__,str(e)))
try:
> reraise(exc_type, exc_value, exc_traceback)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

exc_type = <class 'ldap.INSUFFICIENT_ACCESS'>
exc_value = INSUFFICIENT_ACCESS({'msgtype': 103, 'msgid': 10, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=user1,dc=example,dc=com'.\n"})
exc_traceback = <traceback object at 0x7f61c267cc40>

def reraise(exc_type, exc_value, exc_traceback):
"""Re-raise an exception given information from sys.exc_info()

Note that unlike six.reraise, this does not support replacing the
traceback. All arguments must come from a single sys.exc_info() call.
"""
# In Python 3, all exception info is contained in one object.
> raise exc_value

/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c264afd0>
func = <built-in method result4 of LDAP object at 0x7f61c24e9bd0>
args = (10, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 10, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=user1,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS

During handling of the above exception, another exception occurred:

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2651040>

def test_ticket48228_test_global_policy(topology_st):
"""
Check global password policy
"""
log.info(' Set inhistory = 6')
set_global_pwpolicy(topology_st, 6)

log.info(' Bind as directory manager')
log.info("Bind as %s" % DN_DM)
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)

log.info(' Add an entry' + USER1_DN)
try:
topology_st.standalone.add_s(
Entry((USER1_DN, {'objectclass': "top person organizationalPerson inetOrgPerson".split(),
'sn': '1',
'cn': 'user 1',
'uid': 'user1',
'givenname': 'user',
'mail': 'user1@example.com',
'userpassword': 'password'})))
except ldap.LDAPError as e:
log.fatal('test_ticket48228: Failed to add user' + USER1_DN + ': error ' + e.message['desc'])
assert False

log.info(' Update the password of ' + USER1_DN + ' 6 times')
> update_passwd(topology_st, USER1_DN, 'password', 6)

/export/tests/tickets/ticket48228_test.py:174:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2651040>
user = 'uid=user1,dc=example,dc=com', passwd = 'password', times = 6

def update_passwd(topology_st, user, passwd, times):
# Set the default value
cpw = passwd
for i in range(times):
log.info(" Bind as {%s,%s}" % (user, cpw))
topology_st.standalone.simple_bind_s(user, cpw)
# Now update the value for this iter.
cpw = 'password%d' % i
try:
topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', cpw.encode())])
except ldap.LDAPError as e:
log.fatal(
> 'test_ticket48228: Failed to update the password ' + cpw + ' of user ' + user + ': error ' + e.message[
'desc'])
E AttributeError: 'INSUFFICIENT_ACCESS' object has no attribute 'message'

/export/tests/tickets/ticket48228_test.py:139: AttributeError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
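Note: the underlying refusal is the INSUFFICIENT_ACCESS result (the password policy denies the self-write), but the test then dies in its own error handler because python-ldap on Python 3 no longer exposes LDAPError.message; the result dict lives in e.args[0], as the wrapper frames above already show. A sketch of the handler rewritten for Python 3:

    try:
        topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', cpw.encode())])
    except ldap.LDAPError as e:
        # Python 3 python-ldap keeps the result dict in args[0]; 'desc' may be absent.
        desc = e.args[0].get('desc', str(e)) if e.args else str(e)
        log.fatal('test_ticket48228: Failed to update the password %s of user %s: error %s'
                  % (cpw, user, desc))
        raise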
Failed tickets/ticket48234_test.py::test_ticket48234 0.61
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2825ca0>

def test_ticket48234(topology_st):
"""
Test aci which contains an extensible filter.
shutdown
"""

log.info('Bind as root DN')
try:
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
except ldap.LDAPError as e:
topology_st.standalone.log.error('Root DN failed to authenticate: ' + e.args[0]['desc'])
assert False

ouname = 'outest'
username = 'admin'
passwd = 'Password'
deniedattr = 'telephonenumber'
log.info('Add aci which contains extensible filter.')
aci_text = ('(targetattr = "%s")' % (deniedattr) +
'(target = "ldap:///%s")' % (DEFAULT_SUFFIX) +
'(version 3.0;acl "admin-tel-matching-rule-outest";deny (all)' +
'(userdn = "ldap:///%s??sub?(&(cn=%s)(ou:dn:=%s))");)' % (DEFAULT_SUFFIX, username, ouname))

try:
topology_st.standalone.modify_s(DEFAULT_SUFFIX, [(ldap.MOD_ADD, 'aci', ensure_bytes(aci_text))])
except ldap.LDAPError as e:
log.error('Failed to add aci: (%s) error %s' % (aci_text, e.args[0]['desc']))
assert False

log.info('Add entries ...')
for idx in range(0, 2):
ou0 = 'OU%d' % idx
log.info('adding %s under %s...' % (ou0, DEFAULT_SUFFIX))
add_ou_entry(topology_st.standalone, ou0, DEFAULT_SUFFIX)
parent = 'ou=%s,%s' % (ou0, DEFAULT_SUFFIX)
log.info('adding %s under %s...' % (ouname, parent))
add_ou_entry(topology_st.standalone, ouname, parent)

for idx in range(0, 2):
parent = 'ou=%s,ou=OU%d,%s' % (ouname, idx, DEFAULT_SUFFIX)
log.info('adding %s under %s...' % (username, parent))
add_user_entry(topology_st.standalone, username, passwd, parent)

binddn = 'cn=%s,%s' % (username, parent)
log.info('Bind as user %s' % binddn)
try:
topology_st.standalone.simple_bind_s(binddn, passwd)
except ldap.LDAPError as e:
topology_st.standalone.log.error(binddn + ' failed to authenticate: ' + e.args[0]['desc'])
assert False

filter = '(cn=%s)' % username
try:
entries = topology_st.standalone.search_s(DEFAULT_SUFFIX, ldap.SCOPE_SUBTREE, filter, [deniedattr, 'dn'])
> assert 2 == len(entries)
E assert 2 == 0
E +2
E -0

/export/tests/tickets/ticket48234_test.py:83: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:35 Bind as root DN
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:46 Add aci which contains extensible filter.
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:58 Add entries ...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:61 adding OU0 under dc=example,dc=com...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:64 adding outest under ou=OU0,dc=example,dc=com...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:61 adding OU1 under dc=example,dc=com...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:64 adding outest under ou=OU1,dc=example,dc=com...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:69 adding admin under ou=outest,ou=OU0,dc=example,dc=com...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:69 adding admin under ou=outest,ou=OU1,dc=example,dc=com...
INFO  tests.tickets.ticket48234_test:ticket48234_test.py:73 Bind as user cn=admin,ou=outest,ou=OU1,dc=example,dc=com
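Note: the bound user's subtree search returned zero entries where two were expected. With a deny ACI using an extensible matching rule in play, an empty result is ambiguous: either the admin entries are missing, or read access to them is being (over-)denied. One quick way to separate the two cases is to repeat the search as the root DN, to which ACIs do not apply (a diagnostic sketch, not part of the test):

    # A non-empty result here means the entries exist and the deny ACI is
    # matching more broadly than the (ou:dn:=outest) filter intends.
    topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
    entries = topology_st.standalone.search_s(DEFAULT_SUFFIX, ldap.SCOPE_SUBTREE,
                                              '(cn=%s)' % username, ['dn'])
    log.info('Root DN sees %d matching entries' % len(entries))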
Failed tickets/ticket48266_test.py::test_ticket48266_count_csn_evaluation 0.20
self = <dateutil.parser._parser.parser object at 0x7f61d7f09d90>
timestr = '2020-9-31 00-08-31 -0400'
default = datetime.datetime(2020, 10, 31, 0, 0), ignoretz = False
tzinfos = None, kwargs = {}
res = _result(year=2020, month=9, day=31, hour=0, tzoffset=-14400)
skipped_tokens = None

def parse(self, timestr, default=None,
ignoretz=False, tzinfos=None, **kwargs):
"""
Parse the date/time string into a :class:`datetime.datetime` object.

:param timestr:
Any date/time string using the supported formats.

:param default:
The default datetime object, if this is a datetime object and not
``None``, elements specified in ``timestr`` replace elements in the
default object.

:param ignoretz:
If set ``True``, time zones in parsed strings are ignored and a
naive :class:`datetime.datetime` object is returned.

:param tzinfos:
Additional time zone names / aliases which may be present in the
string. This argument maps time zone names (and optionally offsets
from those time zones) to time zones. This parameter can be a
dictionary with timezone aliases mapping time zone names to time
zones or a function taking two parameters (``tzname`` and
``tzoffset``) and returning a time zone.

The timezones to which the names are mapped can be an integer
offset from UTC in seconds or a :class:`tzinfo` object.

.. doctest::
:options: +NORMALIZE_WHITESPACE

>>> from dateutil.parser import parse
>>> from dateutil.tz import gettz
>>> tzinfos = {"BRST": -7200, "CST": gettz("America/Chicago")}
>>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -7200))
>>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
datetime.datetime(2012, 1, 19, 17, 21,
tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

This parameter is ignored if ``ignoretz`` is set.

:param \\*\\*kwargs:
Keyword arguments as passed to ``_parse()``.

:return:
Returns a :class:`datetime.datetime` object or, if the
``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
first element being a :class:`datetime.datetime` object, the second
a tuple containing the fuzzy tokens.

:raises ParserError:
Raised for invalid or unknown string format, if the provided
:class:`tzinfo` is not in a valid format, or if an invalid date
would be created.

:raises TypeError:
Raised for non-string or character stream input.

:raises OverflowError:
Raised if the parsed date exceeds the largest valid C integer on
your system.
"""

if default is None:
default = datetime.datetime.now().replace(hour=0, minute=0,
second=0, microsecond=0)

res, skipped_tokens = self._parse(timestr, **kwargs)

if res is None:
raise ParserError("Unknown string format: %s", timestr)

if len(res) == 0:
raise ParserError("String does not contain a date: %s", timestr)

try:
> ret = self._build_naive(res, default)

/usr/local/lib/python3.8/site-packages/dateutil/parser/_parser.py:655:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <dateutil.parser._parser.parser object at 0x7f61d7f09d90>
res = _result(year=2020, month=9, day=31, hour=0, tzoffset=-14400)
default = datetime.datetime(2020, 10, 31, 0, 0)

def _build_naive(self, res, default):
repl = {}
for attr in ("year", "month", "day", "hour",
"minute", "second", "microsecond"):
value = getattr(res, attr)
if value is not None:
repl[attr] = value

if 'day' not in repl:
# If the default day exceeds the last day of the month, fall back
# to the end of the month.
cyear = default.year if res.year is None else res.year
cmonth = default.month if res.month is None else res.month
cday = default.day if res.day is None else res.day

if cday > monthrange(cyear, cmonth)[1]:
repl['day'] = monthrange(cyear, cmonth)[1]

> naive = default.replace(**repl)
E ValueError: day is out of range for month

/usr/local/lib/python3.8/site-packages/dateutil/parser/_parser.py:1241: ValueError

The above exception was the direct cause of the following exception:

topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2192790>
entries = None

def test_ticket48266_count_csn_evaluation(topology_m2, entries):
ents = topology_m2.ms["master1"].agreement.list(suffix=SUFFIX)
assert len(ents) == 1
> first_csn = _get_first_not_replicated_csn(topology_m2)

/export/tests/tickets/ticket48266_test.py:176:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket48266_test.py:139: in _get_first_not_replicated_csn
found_op = topology_m2.ms['master1'].ds_access_log.parse_line(found_ops[-1])
/usr/local/lib/python3.8/site-packages/lib389/dirsrv_log.py:293: in parse_line
action['datetime'] = self.parse_timestamp(action['timestamp'])
/usr/local/lib/python3.8/site-packages/lib389/dirsrv_log.py:150: in parse_timestamp
dt = dt_parse(dt_str)
/usr/local/lib/python3.8/site-packages/dateutil/parser/_parser.py:1374: in parse
return DEFAULTPARSER.parse(timestr, **kwargs)
/usr/local/lib/python3.8/site-packages/dateutil/parser/_parser.py:657: in parse
six.raise_from(ParserError(e.args[0] + ": %s", timestr), e)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

value = None, from_value = ValueError('day is out of range for month')

> ???
E dateutil.parser._parser.ParserError: day is out of range for month: 2020-9-31 00-08-31 -0400

<string>:3: ParserError
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48266_test.py:125 dn: cn=new_account2,dc=example,dc=com
INFO  tests.tickets.ticket48266_test:ticket48266_test.py:134 ############# cn=new_account2,dc=example,dc=com
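Note: the parser is handed the literal timestamp 2020-9-31, i.e. September 31st, which does not exist, and dateutil correctly refuses it. The bad date almost certainly comes from the access-log line the test scraped (this run uses libfaketime, which can produce skewed stamps), not from the parser itself. The behaviour is easy to reproduce in isolation:

    from dateutil import parser

    parser.parse('2020-9-30')  # datetime.datetime(2020, 9, 30, 0, 0)
    parser.parse('2020-9-31')  # raises ParserError: day is out of range for month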
Failed tickets/ticket48325_test.py::test_ticket48325 0.03
topology_m1h1c1 = <lib389.topologies.TopologyMain object at 0x7f61c2401a90>

def test_ticket48325(topology_m1h1c1):
"""
Test that the RUV element order is correctly maintained when promoting
a hub or consumer.
"""

#
# Promote consumer to master
#
C1 = topology_m1h1c1.cs["consumer1"]
M1 = topology_m1h1c1.ms["master1"]
H1 = topology_m1h1c1.hs["hub1"]
repl = ReplicationManager(DEFAULT_SUFFIX)
> repl._ensure_changelog(C1)

/export/tests/tickets/ticket48325_test.py:53:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:1928: in _ensure_changelog
cl.create(properties={
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:971: in create
return self._create(rdn, properties, basedn, ensure=False)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:946: in _create
self._instance.add_ext_s(e, serverctrls=self._server_controls, clientctrls=self._client_controls, escapehatch='i am sure')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176: in inner
return f(ent.dn, ent.toTupleList(), *args[2:])
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425: in add_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c255d8b0>
func = <built-in method result4 of LDAP object at 0x7f61c2401570>
args = (15, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.UNWILLING_TO_PERFORM: {'msgtype': 105, 'msgid': 15, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': [], 'info': 'Changelog configuration is part of the backend configuration'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: UNWILLING_TO_PERFORM
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for hub1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:524 Creating replication topology.
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect ff2eddb1-7bc5-440b-9f34-db54d5bc1b21 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is working
INFO  lib389.replica:replica.py:2211 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is was created
INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 89a0a4ed-28ed-407e-8191-7484357ac6ce / got description=ff2eddb1-7bc5-440b-9f34-db54d5bc1b21)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working
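Note: the server's refusal text is the real signal here. In 389-ds-base 2.0 the replication changelog moved out of the standalone cn=changelog5,cn=config entry and into each backend's own configuration, so creating the old-style entry (which lib389's _ensure_changelog attempts) is rejected with UNWILLING_TO_PERFORM. A sketch of tolerating the new layout from the test side (an assumption based on the error text, mirroring how newer lib389 releases handle it):

    try:
        repl._ensure_changelog(C1)
    except ldap.UNWILLING_TO_PERFORM:
        # On 389-ds-base >= 2.0 the changelog is part of the backend
        # configuration; no separate cn=changelog5 entry is needed or allowed.
        pass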
Failed tickets/ticket48342_test.py::test_ticket4026 93.85
topology_m3 = <lib389.topologies.TopologyMain object at 0x7f61c2274af0>

def test_ticket4026(topology_m3):
"""Write your replication testcase here.

To access each DirSrv instance use: topology_m3.ms["master1"], topology_m3.ms["master2"],
..., topology_m3.hub1, ..., topology_m3.consumer1, ...

Also, if you need any testcase initialization,
please, write additional fixture for that(include finalizer).
"""

try:
topology_m3.ms["master1"].add_s(Entry((PEOPLE_DN, {
'objectclass': "top extensibleObject".split(),
'ou': 'people'})))
except ldap.ALREADY_EXISTS:
pass

topology_m3.ms["master1"].add_s(Entry(('ou=ranges,' + SUFFIX, {
'objectclass': 'top organizationalunit'.split(),
'ou': 'ranges'
})))
for cpt in range(MAX_ACCOUNTS):
name = "user%d" % (cpt)
topology_m3.ms["master1"].add_s(Entry(("uid=%s,%s" % (name, PEOPLE_DN), {
'objectclass': 'top posixAccount extensibleObject'.split(),
'uid': name,
'cn': name,
'uidNumber': '1',
'gidNumber': '1',
'homeDirectory': '/home/%s' % name
})))

# make master3 have more free slots than master2
# so master1 will contact master3
_dna_config(topology_m3.ms["master1"], nextValue=100, maxValue=10)
_dna_config(topology_m3.ms["master2"], nextValue=200, maxValue=10)
_dna_config(topology_m3.ms["master3"], nextValue=300, maxValue=3000)

# Turn on lots of error logging now.

mod = [(ldap.MOD_REPLACE, 'nsslapd-errorlog-level', b'16384')]
# mod = [(ldap.MOD_REPLACE, 'nsslapd-errorlog-level', '1')]
topology_m3.ms["master1"].modify_s('cn=config', mod)
topology_m3.ms["master2"].modify_s('cn=config', mod)
topology_m3.ms["master3"].modify_s('cn=config', mod)

# We need to wait for the event in dna.c to fire to start the servers
# see dna.c line 899
time.sleep(60)

# add on master1 users with description DNA
for cpt in range(10):
name = "user_with_desc1_%d" % (cpt)
topology_m3.ms["master1"].add_s(Entry(("uid=%s,%s" % (name, PEOPLE_DN), {
'objectclass': 'top posixAccount extensibleObject'.split(),
'uid': name,
'cn': name,
'description': '-1',
'uidNumber': '1',
'gidNumber': '1',
'homeDirectory': '/home/%s' % name
})))
# give time to negotiate master1 <--> master3
time.sleep(10)
# add on master1 users with description DNA
for cpt in range(11, 20):
name = "user_with_desc1_%d" % (cpt)
> topology_m3.ms["master1"].add_s(Entry(("uid=%s,%s" % (name, PEOPLE_DN), {
'objectclass': 'top posixAccount extensibleObject'.split(),
'uid': name,
'cn': name,
'description': '-1',
'uidNumber': '1',
'gidNumber': '1',
'homeDirectory': '/home/%s' % name
})))

/export/tests/tickets/ticket48342_test.py:118:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176: in inner
return f(ent.dn, ent.toTupleList(), *args[2:])
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:439: in add_s
return self.add_ext_s(dn,modlist,None,None)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:178: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425: in add_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c22740d0>
func = <built-in method result4 of LDAP object at 0x7f61d5759270>
args = (15, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.OPERATIONS_ERROR: {'msgtype': 105, 'msgid': 15, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Allocation of a new value for range cn=dna config,cn=distributed numeric assignment plugin,cn=plugins,cn=config failed! Unable to proceed.'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: OPERATIONS_ERROR
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master3
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 266c1f5b-b757-42ea-adf4-785f6eff810b / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 550d249d-5cc1-4643-bcb1-e50a69fde336 / got description=266c1f5b-b757-42ea-adf4-785f6eff810b)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7a6652a3-a163-4550-9c3e-09085ebd1c53 / got description=58a2af9d-db2a-4014-a014-34df9045f59d)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7a6652a3-a163-4550-9c3e-09085ebd1c53 / got description=58a2af9d-db2a-4014-a014-34df9045f59d)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7a6652a3-a163-4550-9c3e-09085ebd1c53 / got description=58a2af9d-db2a-4014-a014-34df9045f59d)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7a6652a3-a163-4550-9c3e-09085ebd1c53 / got description=58a2af9d-db2a-4014-a014-34df9045f59d)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:19 Add dna plugin config entry...ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:37 Enable the DNA plugin...
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:44 Restarting the server...
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:19 Add dna plugin config entry...ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:37 Enable the DNA plugin...
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:44 Restarting the server...
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:19 Add dna plugin config entry...ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:37 Enable the DNA plugin...
INFO  tests.tickets.ticket48342_test:ticket48342_test.py:44 Restarting the server...
Failed tickets/ticket48637_test.py::test_ticket48637 4.90
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c27fd1f0>

def test_ticket48637(topology_st):
"""Test for entry cache corruption

This requires automember and managed entry plugins to be configured.

Then remove the group that automember would use to trigger a failure when
adding a new entry. Automember fails, and then managed entry also fails.

Make sure a base search on the entry returns error 32
"""

if DEBUGGING:
# Add debugging steps (if any)...
pass

#
# Add our setup entries
#
try:
topology_st.standalone.add_s(Entry((PEOPLE_OU, {
'objectclass': 'top organizationalunit'.split(),
'ou': 'people'})))
except ldap.ALREADY_EXISTS:
pass
except ldap.LDAPError as e:
log.fatal('Failed to add people ou: ' + str(e))
assert False

try:
topology_st.standalone.add_s(Entry((GROUP_OU, {
'objectclass': 'top organizationalunit'.split(),
'ou': 'groups'})))
except ldap.ALREADY_EXISTS:
pass
except ldap.LDAPError as e:
log.fatal('Failed to add groups ou: ' + str(e))
assert False

try:
topology_st.standalone.add_s(Entry((MEP_OU, {
'objectclass': 'top extensibleObject'.split(),
'ou': 'mep'})))
except ldap.LDAPError as e:
log.fatal('Failed to add MEP ou: ' + str(e))
assert False

try:
topology_st.standalone.add_s(Entry((MEP_TEMPLATE, {
'objectclass': 'top mepTemplateEntry'.split(),
'cn': 'mep template',
'mepRDNAttr': 'cn',
'mepStaticAttr': 'objectclass: groupofuniquenames',
'mepMappedAttr': 'cn: $uid'})))
except ldap.LDAPError as e:
log.fatal('Failed to add MEP template: ' + str(e))
assert False

#
# Configure automember
#
try:
topology_st.standalone.add_s(Entry((AUTO_DN, {
'cn': 'All Users',
'objectclass': ['top', 'autoMemberDefinition'],
'autoMemberScope': 'dc=example,dc=com',
'autoMemberFilter': 'objectclass=person',
'autoMemberDefaultGroup': GROUP_DN,
'autoMemberGroupingAttr': 'uniquemember:dn'})))
except ldap.LDAPError as e:
log.fatal('Failed to configure automember plugin : ' + str(e))
assert False

#
# Configure managed entry plugin
#
try:
topology_st.standalone.add_s(Entry((MEP_DN, {
'cn': 'MEP Definition',
'objectclass': ['top', 'extensibleObject'],
'originScope': 'ou=people,dc=example,dc=com',
'originFilter': 'objectclass=person',
'managedBase': 'ou=groups,dc=example,dc=com',
'managedTemplate': MEP_TEMPLATE})))
except ldap.LDAPError as e:
log.fatal('Failed to configure managed entry plugin : ' + str(e))
assert False

#
# Restart DS
#
topology_st.standalone.restart(timeout=30)

#
# Add entry that should fail since the automember group does not exist
#
try:
topology_st.standalone.add_s(Entry((USER_DN, {
'uid': 'test',
'objectclass': ['top', 'person', 'extensibleObject'],
'sn': 'test',
'cn': 'test'})))
except ldap.LDAPError as e:
pass

#
# Search for the entry - it should not be returned
#
try:
entry = topology_st.standalone.search_s(USER_DN, ldap.SCOPE_SUBTREE,
'objectclass=*')
if entry:
log.fatal('Entry was incorrectly returned')
> assert False
E assert False

/export/tests/tickets/ticket48637_test.py:139: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket48637_test:ticket48637_test.py:138 Entry was incorrectly returned
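The failing assertion above hides the actual expectation: per the docstring, once automember and managed entry reject the add, a search on USER_DN should fail with error 32 (NO_SUCH_OBJECT) rather than return the entry. A minimal sketch of the same check written with pytest.raises (a hypothetical rewrite, not the shipped test code):

import ldap
import pytest

def assert_entry_absent(inst, dn):
    # The entry must not exist, so a base search on it should raise
    # NO_SUCH_OBJECT (LDAP error 32) instead of returning data.
    with pytest.raises(ldap.NO_SUCH_OBJECT):
        inst.search_s(dn, ldap.SCOPE_BASE, 'objectclass=*')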
Failed tickets/ticket48784_test.py::test_ticket48784 33.95
Fixture "add_entry" called directly. Fixtures are not meant to be called directly,
but are created automatically when test functions request them as parameters.
See https://docs.pytest.org/en/latest/fixture.html for more information about fixtures, and
https://docs.pytest.org/en/latest/deprecations.html#calling-fixtures-directly about how to update your code.
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d5660df7-4c7a-4503-9690-a337acc1156a / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b363c5a2-8588-4890-a511-89800b1d3596 / got description=d5660df7-4c7a-4503-9690-a337acc1156a)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48784_test:ticket48784_test.py:90 Ticket 48784 - Allow usage of OpenLDAP libraries that don't use NSS for crypto
INFO  tests.tickets.ticket48784_test:ticket48784_test.py:50 ######################### Configure SSL/TLS agreements ######################
INFO  tests.tickets.ticket48784_test:ticket48784_test.py:51 ######################## master1 <-- startTLS -> master2 #####################
INFO  tests.tickets.ticket48784_test:ticket48784_test.py:53 ##### Update the agreement of master1
INFO  tests.tickets.ticket48784_test:ticket48784_test.py:58 ##### Update the agreement of master2
INFO  tests.tickets.ticket48784_test:ticket48784_test.py:68 ######################### Configure SSL/TLS agreements Done ######################
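The 'Fixture "add_entry" called directly' failure above is raised by pytest itself: in modern pytest, fixture functions may only be requested as test parameters, never invoked as plain functions from test code. A minimal sketch of the two patterns (names hypothetical):

import pytest

@pytest.fixture
def add_entry():
    return 'entry added'  # stand-in for the real setup work

def test_right(add_entry):
    # Correct: pytest resolves the fixture and injects its result.
    assert add_entry == 'entry added'

# Incorrect (what the failing test does): calling add_entry() inside a
# test body triggers exactly the 'called directly' error shown above.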
Failed tickets/ticket48798_test.py::test_ticket48798 8.91
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c22b1be0>

def test_ticket48798(topology_st):
"""
Test DH param sizes offered by DS.

"""
topology_st.standalone.enable_tls()

# Confirm that we have a connection, and that it has DH

# Open a socket to the port.
# Check the security settings.
> size = check_socket_dh_param_size(topology_st.standalone.host, topology_st.standalone.sslport)

/export/tests/tickets/ticket48798_test.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket48798_test.py:23: in check_socket_dh_param_size
output = check_output(cmd, shell=True)
/usr/lib64/python3.8/subprocess.py:411: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = None, check = True
popenargs = ('echo quit | openssl s_client -connect ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63601 -msg -cipher DH | grep -A 1 ServerKeyExchange',)
kwargs = {'shell': True, 'stdout': -1}
process = <subprocess.Popen object at 0x7f61c22bcac0>, stdout = b''
stderr = None, retcode = 1

def run(*popenargs,
input=None, capture_output=False, timeout=None, check=False, **kwargs):
"""Run command with arguments and return a CompletedProcess instance.

The returned instance will have attributes args, returncode, stdout and
stderr. By default, stdout and stderr are not captured, and those attributes
will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them.

If check is True and the exit code was non-zero, it raises a
CalledProcessError. The CalledProcessError object will have the return code
in the returncode attribute, and output & stderr attributes if those streams
were captured.

If timeout is given, and the process takes too long, a TimeoutExpired
exception will be raised.

There is an optional argument "input", allowing you to
pass bytes or a string to the subprocess's stdin. If you use this argument
you may not also use the Popen constructor's "stdin" argument, as
it will be used internally.

By default, all communication is in bytes, and therefore any "input" should
be bytes, and the stdout and stderr will be bytes. If in text mode, any
"input" should be a string, and stdout and stderr will be strings decoded
according to locale encoding, or by "encoding" if set. Text mode is
triggered by setting any of text, encoding, errors or universal_newlines.

The other arguments are the same as for the Popen constructor.
"""
if input is not None:
if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE

if capture_output:
if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
kwargs['stderr'] = PIPE

with Popen(*popenargs, **kwargs) as process:
try:
stdout, stderr = process.communicate(input, timeout=timeout)
except TimeoutExpired as exc:
process.kill()
if _mswindows:
# Windows accumulates the output in a single blocking
# read() call run on child threads, with the timeout
# being done in a join() on those threads. communicate()
# _after_ kill() is required to collect that and add it
# to the exception.
exc.stdout, exc.stderr = process.communicate()
else:
# POSIX _communicate already populated the output so
# far into the TimeoutExpired exception.
process.wait()
raise
except: # Including KeyboardInterrupt, communicate handled that.
process.kill()
# We don't call process.wait() as .__exit__ does that for us.
raise
retcode = process.poll()
if check and retcode:
> raise CalledProcessError(retcode, process.args,
output=stdout, stderr=stderr)
E subprocess.CalledProcessError: Command 'echo quit | openssl s_client -connect ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63601 -msg -cipher DH | grep -A 1 ServerKeyExchange' returned non-zero exit status 1.

/usr/lib64/python3.8/subprocess.py:512: CalledProcessError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
depth=1 C = AU, ST = Queensland, L = 389ds, O = testing, CN = ssca.389ds.example.com
verify return:1
depth=0 C = AU, ST = Queensland, L = 389ds, O = testing, GN = 407c1ca6-1ff7-4ba7-87c7-979c63638741, CN = ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com
verify return:1
DONE
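check_socket_dh_param_size builds the openssl pipeline shown in popenargs with check=True, so an empty grep result (no DH ServerKeyExchange in the handshake at all) surfaces as CalledProcessError rather than a readable assertion. A sketch that tolerates the empty match and lets the caller decide what it means (helper name hypothetical; assumes openssl is in PATH):

import subprocess

def dh_server_key_exchange(host, port):
    # Run the same probe, but without check=True: an empty string back
    # means the server never sent a DH ServerKeyExchange message,
    # e.g. because a non-DH cipher was negotiated.
    cmd = ('echo quit | openssl s_client -connect {}:{} -msg -cipher DH '
           '| grep -A 1 ServerKeyExchange').format(host, port)
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return proc.stdout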
Failed tickets/ticket48808_test.py::test_ticket48808 6.81
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2441370>
create_user = None

def test_ticket48808(topology_st, create_user):
log.info('Run multiple paging controls on a single connection')
users_num = 100
page_size = 30
users_list = add_users(topology_st, users_num)
search_flt = r'(uid=test*)'
searchreq_attrlist = ['dn', 'sn']

log.info('Set user bind')
topology_st.standalone.simple_bind_s(TEST_USER_DN, TEST_USER_PWD)

log.info('Create simple paged results control instance')
req_ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
controls = [req_ctrl]

for ii in range(3):
log.info('Iteration %d' % ii)
msgid = topology_st.standalone.search_ext(DEFAULT_SUFFIX,
ldap.SCOPE_SUBTREE,
search_flt,
searchreq_attrlist,
serverctrls=controls)
rtype, rdata, rmsgid, rctrls = topology_st.standalone.result3(msgid)
pctrls = [
c
for c in rctrls
if c.controlType == SimplePagedResultsControl.controlType
]

req_ctrl.cookie = pctrls[0].cookie
msgid = topology_st.standalone.search_ext(DEFAULT_SUFFIX,
ldap.SCOPE_SUBTREE,
search_flt,
searchreq_attrlist,
serverctrls=controls)
log.info('Set Directory Manager bind back')
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
del_users(topology_st, users_list)

log.info('Abandon the search')
users_num = 10
page_size = 0
users_list = add_users(topology_st, users_num)
search_flt = r'(uid=test*)'
searchreq_attrlist = ['dn', 'sn']

log.info('Set user bind')
topology_st.standalone.simple_bind_s(TEST_USER_DN, TEST_USER_PWD)

log.info('Create simple paged results control instance')
req_ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
controls = [req_ctrl]

msgid = topology_st.standalone.search_ext(DEFAULT_SUFFIX,
ldap.SCOPE_SUBTREE,
search_flt,
searchreq_attrlist,
serverctrls=controls)
rtype, rdata, rmsgid, rctrls = topology_st.standalone.result3(msgid)
pctrls = [
c
for c in rctrls
if c.controlType == SimplePagedResultsControl.controlType
]
assert not pctrls[0].cookie

log.info('Set Directory Manager bind back')
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
del_users(topology_st, users_list)

log.info("Search should fail with 'nsPagedSizeLimit = 5'"
"and 'nsslapd-pagedsizelimit = 15' with 10 users")
conf_attr = b'15'
user_attr = b'5'
expected_rs = ldap.SIZELIMIT_EXCEEDED
users_num = 10
page_size = 10
users_list = add_users(topology_st, users_num)
search_flt = r'(uid=test*)'
searchreq_attrlist = ['dn', 'sn']
conf_attr_bck = change_conf_attr(topology_st, DN_CONFIG,
'nsslapd-pagedsizelimit', conf_attr)
user_attr_bck = change_conf_attr(topology_st, TEST_USER_DN,
'nsPagedSizeLimit', user_attr)

log.info('Set user bind')
topology_st.standalone.simple_bind_s(TEST_USER_DN, TEST_USER_PWD)

log.info('Create simple paged results control instance')
req_ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
controls = [req_ctrl]

log.info('Expect to fail with SIZELIMIT_EXCEEDED')
with pytest.raises(expected_rs):
> all_results = paged_search(topology_st, controls,
search_flt, searchreq_attrlist)
E Failed: DID NOT RAISE <class 'ldap.SIZELIMIT_EXCEEDED'>

/export/tests/tickets/ticket48808_test.py:252: Failed
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:159 Run multiple paging controls on a single connection
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 100 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:166 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:169 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:174 Iteration 0
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:174 Iteration 1
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:174 Iteration 2
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:193 Set Directory Manager bind back
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:75 Deleting 100 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:197 Abandon the search
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:204 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:207 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:224 Set Directory Manager bind back
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:75 Deleting 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:228 Search should fail with 'nsPagedSizeLimit = 5' and 'nsslapd-pagedsizelimit = 15' with 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:48 Adding 10 users
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsslapd-pagedsizelimit to b'15'. Previous value - [b'0']. Modified suffix - cn=config.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:95 Set nsPagedSizeLimit to b'5'. Previous value - None. Modified suffix - uid=simplepaged_test,dc=example,dc=com.
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:243 Set user bind
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:246 Create simple paged results control instance
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:250 Expect to fail with SIZELIMIT_EXCEEDED
INFO  tests.tickets.ticket48808_test:ticket48808_test.py:130 Getting page 0
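For the failing block, the effective limit is the user's nsPagedSizeLimit = 5 (it overrides the cn=config nsslapd-pagedsizelimit = 15), which is below the 10 matching entries, so the paging loop is expected to raise SIZELIMIT_EXCEEDED. A sketch of the loop a helper like paged_search typically runs (a hypothetical stand-in, not the ticket's code):

import ldap
from ldap.controls import SimplePagedResultsControl

def paged_search_all(conn, base, flt, attrs, page_size):
    # Page through the result set; when the paged limit is exceeded the
    # server returns sizeLimitExceeded, which result3() raises as
    # ldap.SIZELIMIT_EXCEEDED.
    ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
    entries = []
    while True:
        msgid = conn.search_ext(base, ldap.SCOPE_SUBTREE, flt, attrs,
                                serverctrls=[ctrl])
        rtype, rdata, rmsgid, rctrls = conn.result3(msgid)
        entries.extend(rdata)
        pctrls = [c for c in rctrls
                  if c.controlType == SimplePagedResultsControl.controlType]
        if not pctrls or not pctrls[0].cookie:
            return entries
        ctrl.cookie = pctrls[0].cookie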
Failed tickets/ticket48896_test.py::test_ticket48896 0.14
server = <lib389.DirSrv object at 0x7f61c27db070>, curpw = 'password'
newpw = 'Abcd012+', expstr = 'be ok', rc = 0

def replace_pw(server, curpw, newpw, expstr, rc):
log.info('Binding as {%s, %s}' % (TESTDN, curpw))
server.simple_bind_s(TESTDN, curpw)

hit = 0
log.info('Replacing password: %s -> %s, which should %s' % (curpw, newpw, expstr))
try:
> server.modify_s(TESTDN, [(ldap.MOD_REPLACE, 'userPassword', ensure_bytes(newpw))])

/export/tests/tickets/ticket48896_test.py:53:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=buser123,dc=example,dc=com', [(2, 'userPassword', b'Abcd012+')])
kwargs = {}
c_stack = [FrameInfo(frame=<frame at 0x5576b8f405f0, file '/usr/local/lib/python3.8/site-packages/lib389/__init__.py', line 180,...mbda>', code_context=[' self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(\n'], index=0), ...]
frame = FrameInfo(frame=<frame at 0x5576b8c26f40, file '/export/tests/tickets/ticket48896_test.py', line 57, code replace_pw>,...code_context=[" server.modify_s(TESTDN, [(ldap.MOD_REPLACE, 'userPassword', ensure_bytes(newpw))])\n"], index=0)

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c27db070>
dn = 'uid=buser123,dc=example,dc=com'
modlist = [(2, 'userPassword', b'Abcd012+')]

def modify_s(self,dn,modlist):
> return self.modify_ext_s(dn,modlist,None,None)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=buser123,dc=example,dc=com', [(2, 'userPassword', b'Abcd012+')], None, None)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c27db070>
dn = 'uid=buser123,dc=example,dc=com'
modlist = [(2, 'userPassword', b'Abcd012+')], serverctrls = None
clientctrls = None

def modify_ext_s(self,dn,modlist,serverctrls=None,clientctrls=None):
msgid = self.modify_ext(dn,modlist,serverctrls,clientctrls)
> resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (8,), kwargs = {'all': 1, 'timeout': -1}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c27db070>, msgid = 8, all = 1
timeout = -1, resp_ctrl_classes = None

def result3(self,msgid=ldap.RES_ANY,all=1,timeout=None,resp_ctrl_classes=None):
> resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
msgid,all,timeout,
add_ctrls=0,add_intermediates=0,add_extop=0,
resp_ctrl_classes=resp_ctrl_classes
)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (8, 1, -1)
kwargs = {'add_ctrls': 0, 'add_extop': 0, 'add_intermediates': 0, 'resp_ctrl_classes': None}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c27db070>, msgid = 8, all = 1
timeout = -1, add_ctrls = 0, add_intermediates = 0, add_extop = 0
resp_ctrl_classes = None

def result4(self,msgid=ldap.RES_ANY,all=1,timeout=None,add_ctrls=0,add_intermediates=0,add_extop=0,resp_ctrl_classes=None):
if timeout is None:
timeout = self.timeout
> ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<built-in method result4 of LDAP object at 0x7f61c24e84b0>, 8, 1, -1, 0, 0, ...)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c27db070>
func = <built-in method result4 of LDAP object at 0x7f61c24e84b0>
args = (8, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
result = func(*args,**kwargs)
if __debug__ and self._trace_level>=2:
if func.__name__!="unbind_ext":
diagnostic_message_success = self._l.get_option(ldap.OPT_DIAGNOSTIC_MESSAGE)
finally:
self._ldap_object_lock.release()
except LDAPError as e:
exc_type,exc_value,exc_traceback = sys.exc_info()
try:
if 'info' not in e.args[0] and 'errno' in e.args[0]:
e.args[0]['info'] = strerror(e.args[0]['errno'])
except IndexError:
pass
if __debug__ and self._trace_level>=2:
self._trace_file.write('=> LDAPError - %s: %s\n' % (e.__class__.__name__,str(e)))
try:
> reraise(exc_type, exc_value, exc_traceback)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

exc_type = <class 'ldap.INSUFFICIENT_ACCESS'>
exc_value = INSUFFICIENT_ACCESS({'msgtype': 103, 'msgid': 8, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=buser123,dc=example,dc=com'.\n"})
exc_traceback = <traceback object at 0x7f61c2cb8f00>

def reraise(exc_type, exc_value, exc_traceback):
"""Re-raise an exception given information from sys.exc_info()

Note that unlike six.reraise, this does not support replacing the
traceback. All arguments must come from a single sys.exc_info() call.
"""
# In Python 3, all exception info is contained in one object.
> raise exc_value

/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c27db070>
func = <built-in method result4 of LDAP object at 0x7f61c24e84b0>
args = (8, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 8, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=buser123,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS

During handling of the above exception, another exception occurred:

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c27db340>

def test_ticket48896(topology_st):
"""
"""
log.info('Testing Ticket 48896 - Default Setting for passwordMinTokenLength does not work')

log.info("Setting global password policy with password syntax.")
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
topology_st.standalone.modify_s(CONFIG_DN, [(ldap.MOD_REPLACE, 'passwordCheckSyntax', b'on'),
(ldap.MOD_REPLACE, 'nsslapd-pwpolicy-local', b'on')])

config = topology_st.standalone.search_s(CONFIG_DN, ldap.SCOPE_BASE, 'cn=*')
mintokenlen = config[0].getValue('passwordMinTokenLength')
history = config[0].getValue('passwordInHistory')

log.info('Default passwordMinTokenLength == %s' % mintokenlen)
log.info('Default passwordInHistory == %s' % history)

log.info('Adding a user.')
curpw = 'password'
topology_st.standalone.add_s(Entry((TESTDN,
{'objectclass': "top person organizationalPerson inetOrgPerson".split(),
'cn': 'test user',
'sn': 'user',
'userPassword': curpw})))

newpw = 'Abcd012+'
exp = 'be ok'
rc = 0
> replace_pw(topology_st.standalone, curpw, newpw, exp, rc)

/export/tests/tickets/ticket48896_test.py:94:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

server = <lib389.DirSrv object at 0x7f61c27db070>, curpw = 'password'
newpw = 'Abcd012+', expstr = 'be ok', rc = 0

def replace_pw(server, curpw, newpw, expstr, rc):
log.info('Binding as {%s, %s}' % (TESTDN, curpw))
server.simple_bind_s(TESTDN, curpw)

hit = 0
log.info('Replacing password: %s -> %s, which should %s' % (curpw, newpw, expstr))
try:
server.modify_s(TESTDN, [(ldap.MOD_REPLACE, 'userPassword', ensure_bytes(newpw))])
except Exception as e:
log.info("Exception (expected): %s" % type(e).__name__)
hit = 1
> assert isinstance(e, rc)
E TypeError: isinstance() arg 2 must be a type or tuple of types

/export/tests/tickets/ticket48896_test.py:57: TypeError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:69 Testing Ticket 48896 - Default Setting for passwordMinTokenLength does not work
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:71 Setting global password policy with password syntax.
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:80 Default passwordMinTokenLength == b'3'
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:81 Default passwordInHistory == b'6'
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:83 Adding a user.
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:47 Binding as {uid=buser123,dc=example,dc=com, password}
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:51 Replacing password: password -> Abcd012+, which should be ok
INFO  tests.tickets.ticket48896_test:ticket48896_test.py:55 Exception (expected): INSUFFICIENT_ACCESS
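The TypeError here is the test's own bug rather than the server's: the caller passes rc = 0 and replace_pw later evaluates isinstance(e, rc), but isinstance() needs an exception class. The underlying INSUFFICIENT_ACCESS (the user cannot write its own userPassword) is what actually needs investigating. A sketch of the signature the helper appears to intend (a hypothetical fix; reuses TESTDN and ensure_bytes from the test module):

def replace_pw(server, curpw, newpw, expected_exc=None):
    # expected_exc is an exception class such as ldap.CONSTRAINT_VIOLATION,
    # or None when the password change is expected to succeed.
    server.simple_bind_s(TESTDN, curpw)
    try:
        server.modify_s(TESTDN, [(ldap.MOD_REPLACE, 'userPassword',
                                  ensure_bytes(newpw))])
    except Exception as e:
        assert expected_exc is not None and isinstance(e, expected_exc)
    else:
        assert expected_exc is None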
Failed tickets/ticket48916_test.py::test_ticket48916 51.30
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c21deca0>

def test_ticket48916(topology_m2):
"""
https://bugzilla.redhat.com/show_bug.cgi?id=1353629

This is an issue with ID exhaustion in DNA causing a crash.

To access each DirSrv instance use: topology_m2.ms["master1"], topology_m2.ms["master2"],
..., topology_m2.hub1, ..., topology_m2.consumer1,...


"""

if DEBUGGING:
# Add debugging steps (if any)...
pass

# Enable the plugin on both servers

dna_m1 = topology_m2.ms["master1"].plugins.get('Distributed Numeric Assignment Plugin')
dna_m2 = topology_m2.ms["master2"].plugins.get('Distributed Numeric Assignment Plugin')

# Configure it
# Create the container for the ranges to go into.

topology_m2.ms["master1"].add_s(Entry(
('ou=Ranges,%s' % DEFAULT_SUFFIX, {
'objectClass': 'top organizationalUnit'.split(' '),
'ou': 'Ranges',
})
))

# Create the dnaAdmin?

# For now we just pinch the dn from the dna_m* types, and add the relevant child config
# but in the future, this could be a better plugin template type from lib389

config_dn = dna_m1.dn

topology_m2.ms["master1"].add_s(Entry(
('cn=uids,%s' % config_dn, {
'objectClass': 'top dnaPluginConfig'.split(' '),
'cn': 'uids',
'dnatype': 'uidNumber gidNumber'.split(' '),
'dnafilter': '(objectclass=posixAccount)',
'dnascope': '%s' % DEFAULT_SUFFIX,
'dnaNextValue': '1',
'dnaMaxValue': '50',
'dnasharedcfgdn': 'ou=Ranges,%s' % DEFAULT_SUFFIX,
'dnaThreshold': '0',
'dnaRangeRequestTimeout': '60',
'dnaMagicRegen': '-1',
'dnaRemoteBindDN': 'uid=dnaAdmin,ou=People,%s' % DEFAULT_SUFFIX,
'dnaRemoteBindCred': 'secret123',
'dnaNextRange': '80-90'
})
))

topology_m2.ms["master2"].add_s(Entry(
('cn=uids,%s' % config_dn, {
'objectClass': 'top dnaPluginConfig'.split(' '),
'cn': 'uids',
'dnatype': 'uidNumber gidNumber'.split(' '),
'dnafilter': '(objectclass=posixAccount)',
'dnascope': '%s' % DEFAULT_SUFFIX,
'dnaNextValue': '61',
'dnaMaxValue': '70',
'dnasharedcfgdn': 'ou=Ranges,%s' % DEFAULT_SUFFIX,
'dnaThreshold': '2',
'dnaRangeRequestTimeout': '60',
'dnaMagicRegen': '-1',
'dnaRemoteBindDN': 'uid=dnaAdmin,ou=People,%s' % DEFAULT_SUFFIX,
'dnaRemoteBindCred': 'secret123',
})
))

# Enable the plugins
dna_m1.enable()
dna_m2.enable()

# Restart the instances
topology_m2.ms["master1"].restart(60)
topology_m2.ms["master2"].restart(60)

# Wait for replication to settle ...
time.sleep(40)

# Allocate the 10 members to exhaust

for i in range(1, 11):
_create_user(topology_m2.ms["master2"], i)

# Allocate the 11th
> _create_user(topology_m2.ms["master2"], 11)

/export/tests/tickets/ticket48916_test.py:126:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket48916_test.py:21: in _create_user
inst.add_s(Entry(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176: in inner
return f(ent.dn, ent.toTupleList(), *args[2:])
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:439: in add_s
return self.add_ext_s(dn,modlist,None,None)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:178: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425: in add_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c21c9220>
func = <built-in method result4 of LDAP object at 0x7f61c2180990>
args = (13, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.OPERATIONS_ERROR: {'msgtype': 105, 'msgid': 13, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Allocation of a new value for range cn=uids,cn=distributed numeric assignment plugin,cn=plugins,cn=config failed! Unable to proceed.'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: OPERATIONS_ERROR
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4c91a235-664a-4b61-a5d3-97d6a7dab280 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0e8149bd-3a06-4e79-9a1b-88cc84959ada / got description=4c91a235-664a-4b61-a5d3-97d6a7dab280)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
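The arithmetic of the failure: master2's local range 61-70 holds exactly ten values, so users 1-10 exhaust it and the eleventh add has to draw on the dnaNextRange (80-90) configured on master1; the OPERATIONS_ERROR shows that hand-off never completed. A small sketch for watching a range drain, using the Entry.getValue() accessor seen elsewhere in this report (helper name hypothetical):

import ldap

def dna_values_left(inst, config_dn):
    # Read dnaNextValue/dnaMaxValue from the DNA config entry and return
    # how many allocations remain in the instance's local range.
    entry = inst.search_s(config_dn, ldap.SCOPE_BASE, 'objectclass=*')[0]
    return (int(entry.getValue('dnaMaxValue'))
            - int(entry.getValue('dnaNextValue')) + 1)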
Failed tickets/ticket48956_test.py::test_ticket48956 6.54
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2417fa0>

def test_ticket48956(topology_st):
"""Write your testcase here...

Also, if you need any testcase initialization,
please, write additional fixture for that(include finalizer).

"""

topology_st.standalone.modify_s(ACCT_POLICY_PLUGIN_DN,
[(ldap.MOD_REPLACE, 'nsslapd-pluginarg0', ensure_bytes(ACCT_POLICY_CONFIG_DN))])

topology_st.standalone.modify_s(ACCT_POLICY_CONFIG_DN, [(ldap.MOD_REPLACE, 'alwaysrecordlogin', b'yes'),
(ldap.MOD_REPLACE, 'stateattrname', b'lastLoginTime'),
(ldap.MOD_REPLACE, 'altstateattrname', b'createTimestamp'),
(ldap.MOD_REPLACE, 'specattrname', b'acctPolicySubentry'),
(ldap.MOD_REPLACE, 'limitattrname',
b'accountInactivityLimit')])

# Enable the plugins
topology_st.standalone.plugins.enable(name=PLUGIN_ACCT_POLICY)
topology_st.standalone.restart(timeout=10)

# Check inactivity on standard suffix (short)
> _check_inactivity(topology_st, SUFFIX)

/export/tests/tickets/ticket48956_test.py:107:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket48956_test.py:78: in _check_inactivity
assert (_check_status(topology_st, TEST_USER_DN, b'- activated'))
/export/tests/tickets/ticket48956_test.py:39: in _check_status
output = subprocess.check_output([nsaccountstatus, '-Z', topology_st.standalone.serverid,
/usr/lib64/python3.8/subprocess.py:411: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
/usr/lib64/python3.8/subprocess.py:489: in run
with Popen(*popenargs, **kwargs) as process:
/usr/lib64/python3.8/subprocess.py:854: in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <subprocess.Popen object at 0x7f61c2457340>
args = ['/usr/sbin/ns-accountstatus.pl', '-Z', 'standalone1', '-D', 'cn=Directory Manager', '-w', ...]
executable = b'/usr/sbin/ns-accountstatus.pl', preexec_fn = None
close_fds = True, pass_fds = (), cwd = None, env = None, startupinfo = None
creationflags = 0, shell = False, p2cread = -1, p2cwrite = -1, c2pread = 44
c2pwrite = 48, errread = -1, errwrite = -1, restore_signals = True
start_new_session = False

def _execute_child(self, args, executable, preexec_fn, close_fds,
pass_fds, cwd, env,
startupinfo, creationflags, shell,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite,
restore_signals, start_new_session):
"""Execute program (POSIX version)"""

if isinstance(args, (str, bytes)):
args = [args]
elif isinstance(args, os.PathLike):
if shell:
raise TypeError('path-like args is not allowed when '
'shell is true')
args = [args]
else:
args = list(args)

if shell:
# On Android the default shell is at '/system/bin/sh'.
unix_shell = ('/system/bin/sh' if
hasattr(sys, 'getandroidapilevel') else '/bin/sh')
args = [unix_shell, "-c"] + args
if executable:
args[0] = executable

if executable is None:
executable = args[0]

sys.audit("subprocess.Popen", executable, args, cwd, env)

if (_USE_POSIX_SPAWN
and os.path.dirname(executable)
and preexec_fn is None
and not close_fds
and not pass_fds
and cwd is None
and (p2cread == -1 or p2cread > 2)
and (c2pwrite == -1 or c2pwrite > 2)
and (errwrite == -1 or errwrite > 2)
and not start_new_session):
self._posix_spawn(args, executable, env, restore_signals,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite)
return

orig_executable = executable

# For transferring possible exec failure from child to parent.
# Data format: "exception name:hex errno:description"
# Pickle is not used; it is complex and involves memory allocation.
errpipe_read, errpipe_write = os.pipe()
# errpipe_write must not be in the standard io 0, 1, or 2 fd range.
low_fds_to_close = []
while errpipe_write < 3:
low_fds_to_close.append(errpipe_write)
errpipe_write = os.dup(errpipe_write)
for low_fd in low_fds_to_close:
os.close(low_fd)
try:
try:
# We must avoid complex work that could involve
# malloc or free in the child process to avoid
# potential deadlocks, thus we do all this here.
# and pass it to fork_exec()

if env is not None:
env_list = []
for k, v in env.items():
k = os.fsencode(k)
if b'=' in k:
raise ValueError("illegal environment variable name")
env_list.append(k + b'=' + os.fsencode(v))
else:
env_list = None # Use execv instead of execve.
executable = os.fsencode(executable)
if os.path.dirname(executable):
executable_list = (executable,)
else:
# This matches the behavior of os._execvpe().
executable_list = tuple(
os.path.join(os.fsencode(dir), executable)
for dir in os.get_exec_path(env))
fds_to_keep = set(pass_fds)
fds_to_keep.add(errpipe_write)
self.pid = _posixsubprocess.fork_exec(
args, executable_list,
close_fds, tuple(sorted(map(int, fds_to_keep))),
cwd, env_list,
p2cread, p2cwrite, c2pread, c2pwrite,
errread, errwrite,
errpipe_read, errpipe_write,
restore_signals, start_new_session, preexec_fn)
self._child_created = True
finally:
# be sure the FD is closed no matter what
os.close(errpipe_write)

self._close_pipe_fds(p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite)

# Wait for exec to fail or succeed; possibly raising an
# exception (limited in size)
errpipe_data = bytearray()
while True:
part = os.read(errpipe_read, 50000)
errpipe_data += part
if not part or len(errpipe_data) > 50000:
break
finally:
# be sure the FD is closed no matter what
os.close(errpipe_read)

if errpipe_data:
try:
pid, sts = os.waitpid(self.pid, 0)
if pid == self.pid:
self._handle_exitstatus(sts)
else:
self.returncode = sys.maxsize
except ChildProcessError:
pass

try:
exception_name, hex_errno, err_msg = (
errpipe_data.split(b':', 2))
# The encoding here should match the encoding
# written in by the subprocess implementations
# like _posixsubprocess
err_msg = err_msg.decode()
except ValueError:
exception_name = b'SubprocessError'
hex_errno = b'0'
err_msg = 'Bad exception data from child: {!r}'.format(
bytes(errpipe_data))
child_exception_type = getattr(
builtins, exception_name.decode('ascii'),
SubprocessError)
if issubclass(child_exception_type, OSError) and hex_errno:
errno_num = int(hex_errno, 16)
child_exec_never_called = (err_msg == "noexec")
if child_exec_never_called:
err_msg = ""
# The error must be from chdir(cwd).
err_filename = cwd
else:
err_filename = orig_executable
if errno_num != 0:
err_msg = os.strerror(errno_num)
> raise child_exception_type(errno_num, err_msg, err_filename)
E FileNotFoundError: [Errno 2] No such file or directory: '/usr/sbin/ns-accountstatus.pl'

/usr/lib64/python3.8/subprocess.py:1702: FileNotFoundError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:54 ######################### Adding Account Policy entry: cn=Account Inactivation Policy,dc=example,dc=com ######################
INFO  tests.tickets.ticket48956_test:ticket48956_test.py:61 ######################### Adding Test User entry: uid=ticket48956user,dc=example,dc=com ######################
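This failure is environmental: /usr/sbin/ns-accountstatus.pl belongs to the legacy Perl tooling that this build does not ship, so Popen dies with FileNotFoundError before any account-status assertion runs. A sketch of a guard that skips rather than fails when the script is missing:

import os
import pytest

NSACCOUNTSTATUS = '/usr/sbin/ns-accountstatus.pl'

def require_legacy_tools():
    # Call at the top of the test: skip cleanly when the legacy Perl
    # script is absent instead of crashing inside subprocess.
    if not os.path.exists(NSACCOUNTSTATUS):
        pytest.skip('%s is not available in this build' % NSACCOUNTSTATUS)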
Failed tickets/ticket48961_test.py::test_ticket48961_storagescheme 0.02
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c213ebe0>

def test_ticket48961_storagescheme(topology_st):
"""
Test deleting of the storage scheme.
"""

default = topology_st.standalone.config.get_attr_val('passwordStorageScheme')
# Change it
topology_st.standalone.config.set('passwordStorageScheme', 'CLEAR')
# Now delete it
> topology_st.standalone.config.remove('passwordStorageScheme', None)

/export/tests/tickets/ticket48961_test.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:316: in remove
self.set(key, value, action=ldap.MOD_DELETE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c213eb80>
func = <built-in method result4 of LDAP object at 0x7f61c245fbd0>
args = (5, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.OPERATIONS_ERROR: {'msgtype': 103, 'msgid': 5, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'passwordStorageScheme: deleting the value is not allowed.'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: OPERATIONS_ERROR
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
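Editorial note: the server explicitly refuses to delete passwordStorageScheme (see the 'info' message in the OPERATIONS_ERROR above), so a rewrite of this test could assert the rejection rather than expect the delete to succeed. A minimal sketch, reusing the topology_st fixture and the Config.remove() call from the traceback:

import ldap
import pytest

# The server rejects the delete outright, so assert the rejection instead
# of expecting it to succeed (hypothetical rewrite of the failing call):
with pytest.raises(ldap.OPERATIONS_ERROR):
    topology_st.standalone.config.remove('passwordStorageScheme', None)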
Failed tickets/ticket48961_test.py::test_ticket48961_deleteall 0.00
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c213ebe0>

def test_ticket48961_deleteall(topology_st):
"""
Test that we can delete all valid attrs, and that a few are rejected.
"""
attr_to_test = {
'nsslapd-listenhost': 'localhost',
'nsslapd-securelistenhost': 'localhost',
'nsslapd-allowed-sasl-mechanisms': 'GSSAPI',
'nsslapd-svrtab': 'Some bogus data', # This one could reset?
}
attr_to_fail = {
# These are the values that should always be in dse.ldif too
'nsslapd-localuser': 'dirsrv',
'nsslapd-defaultnamingcontext': 'dc=example,dc=com', # Can't delete
'nsslapd-accesslog': '/opt/dirsrv/var/log/dirsrv/slapd-standalone/access',
'nsslapd-auditlog': '/opt/dirsrv/var/log/dirsrv/slapd-standalone/audit',
'nsslapd-errorlog': '/opt/dirsrv/var/log/dirsrv/slapd-standalone/errors',
'nsslapd-tmpdir': '/tmp',
'nsslapd-rundir': '/opt/dirsrv/var/run/dirsrv',
'nsslapd-bakdir': '/opt/dirsrv/var/lib/dirsrv/slapd-standalone/bak',
'nsslapd-certdir': '/opt/dirsrv/etc/dirsrv/slapd-standalone',
'nsslapd-instancedir': '/opt/dirsrv/lib/dirsrv/slapd-standalone',
'nsslapd-ldifdir': '/opt/dirsrv/var/lib/dirsrv/slapd-standalone/ldif',
'nsslapd-lockdir': '/opt/dirsrv/var/lock/dirsrv/slapd-standalone',
'nsslapd-schemadir': '/opt/dirsrv/etc/dirsrv/slapd-standalone/schema',
'nsslapd-workingdir': '/opt/dirsrv/var/log/dirsrv/slapd-standalone',
'nsslapd-localhost': 'localhost.localdomain',
# These can't be reset, but might be in dse.ldif. Probably in libglobs.
'nsslapd-certmap-basedn': 'cn=certmap,cn=config',
'nsslapd-port': '38931', # Can't delete
'nsslapd-secureport': '636', # Can't delete
'nsslapd-conntablesize': '1048576',
'nsslapd-rootpw': '{SSHA512}...',
# These are hardcoded server magic.
'nsslapd-hash-filters': 'off', # Can't delete
'nsslapd-requiresrestart': 'cn=config:nsslapd-port', # Can't change
'nsslapd-plugin': 'cn=case ignore string syntax,cn=plugins,cn=config', # Can't change
'nsslapd-privatenamespaces': 'cn=schema', # Can't change
'nsslapd-allowed-to-delete-attrs': 'None', # Can't delete
'nsslapd-accesslog-list': 'List!', # Can't delete
'nsslapd-auditfaillog-list': 'List!',
'nsslapd-auditlog-list': 'List!',
'nsslapd-errorlog-list': 'List!',
'nsslapd-config': 'cn=config',
'nsslapd-versionstring': '389-Directory/1.3.6.0',
'objectclass': '',
'cn': '',
# These are the odd values
'nsslapd-backendconfig': 'cn=config,cn=userRoot,cn=ldbm database,cn=plugins,cn=config', # Doesn't exist?
'nsslapd-betype': 'ldbm database', # Doesn't exist?
'nsslapd-connection-buffer': 1, # Has an ldap problem
'nsslapd-malloc-mmap-threshold': '-10', # Defunct anyway
'nsslapd-malloc-mxfast': '-10',
'nsslapd-malloc-trim-threshold': '-10',
'nsslapd-referralmode': '',
'nsslapd-saslpath': '',
'passwordadmindn': '',
}

> config_entry = topology_st.standalone.config.raw_entry()

/export/tests/tickets/ticket48961_test.py:101:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.config.Config object at 0x7f61c213aeb0>, name = 'raw_entry'

def __getattr__(self, name):
"""This enables a bit of magic to allow us to wrap any function ending with
_json to it's form without json, then transformed. It means your function
*must* return it's values as a dict of:

{ attr : [val, val, ...], attr : [], ... }
to be supported.
"""

if (name.endswith('_json')):
int_name = name.replace('_json', '')
pfunc = partial(self._jsonify, getattr(self, int_name))
return pfunc
else:
> raise AttributeError("'%s' object has no attribute '%s'" % (self.__class__.__name__, name))
E AttributeError: 'Config' object has no attribute 'raw_entry'

/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:199: AttributeError
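Editorial note: Config.raw_entry() no longer exists in this lib389. A hedged replacement is a plain base-scope search of cn=config; the escapehatch keyword used here is the same one the deprecation wrapper quoted throughout this report pops before calling through to python-ldap (sketch only, not the test author's fix):

import ldap

# A base-scope search returns the single cn=config entry with all of its
# attributes; escapehatch suppresses lib389's raw-ldap deprecation warning.
config_entry = topology_st.standalone.search_s(
    'cn=config', ldap.SCOPE_BASE, '(objectclass=*)',
    escapehatch='i am sure')[0]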
Failed tickets/ticket49039_test.py::test_ticket49039 12.33
topo = <lib389.topologies.TopologyMain object at 0x7f61c20f9070>

def test_ticket49039(topo):
"""Test "password must change" verses "password min age". Min age should not
block password update if the password was reset.
"""

# Setup SSL (for ldappasswd test)
topo.standalone.enable_tls()

# Configure password policy
try:
policy = PwPolicyManager(topo.standalone)
policy.set_global_policy(properties={'nsslapd-pwpolicy-local': 'on',
'passwordMustChange': 'on',
'passwordExp': 'on',
'passwordMaxAge': '86400000',
'passwordMinAge': '8640000',
'passwordChange': 'on'})
except ldap.LDAPError as e:
log.fatal('Failed to set password policy: ' + str(e))

# Add user, bind, and set password
try:
topo.standalone.add_s(Entry((USER_DN, {
'objectclass': 'top extensibleObject'.split(),
'uid': 'user1',
'userpassword': PASSWORD
})))
except ldap.LDAPError as e:
log.fatal('Failed to add user: error ' + e.args[0]['desc'])
assert False

# Reset password as RootDN
try:
topo.standalone.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userpassword', ensure_bytes(PASSWORD))])
except ldap.LDAPError as e:
log.fatal('Failed to bind: error ' + e.args[0]['desc'])
assert False

time.sleep(1)

# Reset password as user
try:
topo.standalone.simple_bind_s(USER_DN, PASSWORD)
except ldap.LDAPError as e:
log.fatal('Failed to bind: error ' + e.args[0]['desc'])
assert False

try:
> topo.standalone.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userpassword', ensure_bytes(PASSWORD))])

/export/tests/tickets/ticket49039_test.py:75:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=user,dc=example,dc=com', [(2, 'userpassword', b'password')])
kwargs = {}
c_stack = [FrameInfo(frame=<frame at 0x7f61c30cb640, file '/usr/local/lib/python3.8/site-packages/lib389/__init__.py', line 180,...93, function='_hookexec', code_context=[' return self._inner_hookexec(hook, methods, kwargs)\n'], index=0), ...]
frame = FrameInfo(frame=<frame at 0x5576b8f272d0, file '/export/tests/tickets/ticket49039_test.py', line 78, code test_ticket4...[" topo.standalone.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userpassword', ensure_bytes(PASSWORD))])\n"], index=0)

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c20f3fd0>
dn = 'uid=user,dc=example,dc=com', modlist = [(2, 'userpassword', b'password')]

def modify_s(self,dn,modlist):
> return self.modify_ext_s(dn,modlist,None,None)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=user,dc=example,dc=com', [(2, 'userpassword', b'password')], None, None)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c20f3fd0>
dn = 'uid=user,dc=example,dc=com', modlist = [(2, 'userpassword', b'password')]
serverctrls = None, clientctrls = None

def modify_ext_s(self,dn,modlist,serverctrls=None,clientctrls=None):
msgid = self.modify_ext(dn,modlist,serverctrls,clientctrls)
> resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (7,), kwargs = {'all': 1, 'timeout': -1}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c20f3fd0>, msgid = 7, all = 1
timeout = -1, resp_ctrl_classes = None

def result3(self,msgid=ldap.RES_ANY,all=1,timeout=None,resp_ctrl_classes=None):
> resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
msgid,all,timeout,
add_ctrls=0,add_intermediates=0,add_extop=0,
resp_ctrl_classes=resp_ctrl_classes
)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (7, 1, -1)
kwargs = {'add_ctrls': 0, 'add_extop': 0, 'add_intermediates': 0, 'resp_ctrl_classes': None}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c20f3fd0>, msgid = 7, all = 1
timeout = -1, add_ctrls = 0, add_intermediates = 0, add_extop = 0
resp_ctrl_classes = None

def result4(self,msgid=ldap.RES_ANY,all=1,timeout=None,add_ctrls=0,add_intermediates=0,add_extop=0,resp_ctrl_classes=None):
if timeout is None:
timeout = self.timeout
> ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<built-in method result4 of LDAP object at 0x7f61c20f9120>, 7, 1, -1, 0, 0, ...)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c20f3fd0>
func = <built-in method result4 of LDAP object at 0x7f61c20f9120>
args = (7, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
result = func(*args,**kwargs)
if __debug__ and self._trace_level>=2:
if func.__name__!="unbind_ext":
diagnostic_message_success = self._l.get_option(ldap.OPT_DIAGNOSTIC_MESSAGE)
finally:
self._ldap_object_lock.release()
except LDAPError as e:
exc_type,exc_value,exc_traceback = sys.exc_info()
try:
if 'info' not in e.args[0] and 'errno' in e.args[0]:
e.args[0]['info'] = strerror(e.args[0]['errno'])
except IndexError:
pass
if __debug__ and self._trace_level>=2:
self._trace_file.write('=> LDAPError - %s: %s\n' % (e.__class__.__name__,str(e)))
try:
> reraise(exc_type, exc_value, exc_traceback)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

exc_type = <class 'ldap.INSUFFICIENT_ACCESS'>
exc_value = INSUFFICIENT_ACCESS({'msgtype': 103, 'msgid': 7, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=user,dc=example,dc=com'.\n"})
exc_traceback = <traceback object at 0x7f61c2723180>

def reraise(exc_type, exc_value, exc_traceback):
"""Re-raise an exception given information from sys.exc_info()

Note that unlike six.reraise, this does not support replacing the
traceback. All arguments must come from a single sys.exc_info() call.
"""
# In Python 3, all exception info is contained in one object.
> raise exc_value

/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c20f3fd0>
func = <built-in method result4 of LDAP object at 0x7f61c20f9120>
args = (7, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 7, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=user,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS

During handling of the above exception, another exception occurred:

topo = <lib389.topologies.TopologyMain object at 0x7f61c20f9070>

def test_ticket49039(topo):
"""Test "password must change" verses "password min age". Min age should not
block password update if the password was reset.
"""

# Setup SSL (for ldappasswd test)
topo.standalone.enable_tls()

# Configure password policy
try:
policy = PwPolicyManager(topo.standalone)
policy.set_global_policy(properties={'nsslapd-pwpolicy-local': 'on',
'passwordMustChange': 'on',
'passwordExp': 'on',
'passwordMaxAge': '86400000',
'passwordMinAge': '8640000',
'passwordChange': 'on'})
except ldap.LDAPError as e:
log.fatal('Failed to set password policy: ' + str(e))

# Add user, bind, and set password
try:
topo.standalone.add_s(Entry((USER_DN, {
'objectclass': 'top extensibleObject'.split(),
'uid': 'user1',
'userpassword': PASSWORD
})))
except ldap.LDAPError as e:
log.fatal('Failed to add user: error ' + e.args[0]['desc'])
assert False

# Reset password as RootDN
try:
topo.standalone.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userpassword', ensure_bytes(PASSWORD))])
except ldap.LDAPError as e:
log.fatal('Failed to bind: error ' + e.args[0]['desc'])
assert False

time.sleep(1)

# Reset password as user
try:
topo.standalone.simple_bind_s(USER_DN, PASSWORD)
except ldap.LDAPError as e:
log.fatal('Failed to bind: error ' + e.args[0]['desc'])
assert False

try:
topo.standalone.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'userpassword', ensure_bytes(PASSWORD))])
except ldap.LDAPError as e:
log.fatal('Failed to change password: error ' + e.args[0]['desc'])
> assert False
E assert False

/export/tests/tickets/ticket49039_test.py:78: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49039_test:ticket49039_test.py:77 Failed to change password: error Insufficient access
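Editorial note: the INSUFFICIENT_ACCESS on the self-service password change suggests no ACI grants this user write access to its own userPassword under the suffix. One hedged fix is to install a self-write ACI while still bound as the Directory Manager, before the simple_bind_s as the user (sketch; the ACI string is standard 389-ds syntax, not taken from the test):

import ldap

from lib389.utils import ensure_bytes

# Hypothetical fix for the access failure above: allow any authenticated
# entry under the suffix to write its own userPassword.
SELF_PW_ACI = ('(targetattr="userPassword")(version 3.0; '
               'acl "self pw write"; allow (write) userdn="ldap:///self";)')
topo.standalone.modify_s('dc=example,dc=com',
                         [(ldap.MOD_ADD, 'aci', ensure_bytes(SELF_PW_ACI))])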
Failed tickets/ticket49072_test.py::test_ticket49072_basedn 4.67
topo = <lib389.topologies.TopologyMain object at 0x7f61c1fb1ac0>

def test_ticket49072_basedn(topo):
"""memberOf fixup task does not validate args

:id: dce9b898-119d-42b8-a236-1130e59bfe18
:feature: memberOf
:setup: Standalone instance, with memberOf plugin
:steps: 1. Run fixup-memberOf.pl with invalid DN entry
2. Check if error log reports "Failed to get be backend"
:expectedresults: Fixup-memberOf.pl task should complete, but errors should be logged.
"""

log.info("Ticket 49072 memberof fixup task with invalid basedn...")
topo.standalone.plugins.enable(name=PLUGIN_MEMBER_OF)
topo.standalone.restart(timeout=10)

if ds_is_older('1.3'):
inst_dir = topo.standalone.get_inst_dir()
memof_task = os.path.join(inst_dir, FIXUP_MEMOF)
try:
output = subprocess.check_output([memof_task, '-D', DN_DM, '-w', PASSWORD, '-b', TEST_BASEDN, '-f', FILTER])
except subprocess.CalledProcessError as err:
output = err.output
else:
sbin_dir = topo.standalone.get_sbin_dir()
memof_task = os.path.join(sbin_dir, FIXUP_MEMOF)
try:
> output = subprocess.check_output(
[memof_task, '-D', DN_DM, '-w', PASSWORD, '-b', TEST_BASEDN, '-Z', SERVERID_STANDALONE, '-f', FILTER])

/export/tests/tickets/ticket49072_test.py:55:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.8/subprocess.py:411: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
/usr/lib64/python3.8/subprocess.py:489: in run
with Popen(*popenargs, **kwargs) as process:
/usr/lib64/python3.8/subprocess.py:854: in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <subprocess.Popen object at 0x7f61c1f84cd0>
args = ['/usr/sbin/fixup-memberof.pl', '-D', 'cn=Directory Manager', '-w', 'password', '-b', ...]
executable = b'/usr/sbin/fixup-memberof.pl', preexec_fn = None, close_fds = True
pass_fds = (), cwd = None, env = None, startupinfo = None, creationflags = 0
shell = False, p2cread = -1, p2cwrite = -1, c2pread = 51, c2pwrite = 52
errread = -1, errwrite = -1, restore_signals = True, start_new_session = False

def _execute_child(self, args, executable, preexec_fn, close_fds,
pass_fds, cwd, env,
startupinfo, creationflags, shell,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite,
restore_signals, start_new_session):
"""Execute program (POSIX version)"""

if isinstance(args, (str, bytes)):
args = [args]
elif isinstance(args, os.PathLike):
if shell:
raise TypeError('path-like args is not allowed when '
'shell is true')
args = [args]
else:
args = list(args)

if shell:
# On Android the default shell is at '/system/bin/sh'.
unix_shell = ('/system/bin/sh' if
hasattr(sys, 'getandroidapilevel') else '/bin/sh')
args = [unix_shell, "-c"] + args
if executable:
args[0] = executable

if executable is None:
executable = args[0]

sys.audit("subprocess.Popen", executable, args, cwd, env)

if (_USE_POSIX_SPAWN
and os.path.dirname(executable)
and preexec_fn is None
and not close_fds
and not pass_fds
and cwd is None
and (p2cread == -1 or p2cread > 2)
and (c2pwrite == -1 or c2pwrite > 2)
and (errwrite == -1 or errwrite > 2)
and not start_new_session):
self._posix_spawn(args, executable, env, restore_signals,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite)
return

orig_executable = executable

# For transferring possible exec failure from child to parent.
# Data format: "exception name:hex errno:description"
# Pickle is not used; it is complex and involves memory allocation.
errpipe_read, errpipe_write = os.pipe()
# errpipe_write must not be in the standard io 0, 1, or 2 fd range.
low_fds_to_close = []
while errpipe_write < 3:
low_fds_to_close.append(errpipe_write)
errpipe_write = os.dup(errpipe_write)
for low_fd in low_fds_to_close:
os.close(low_fd)
try:
try:
# We must avoid complex work that could involve
# malloc or free in the child process to avoid
# potential deadlocks, thus we do all this here.
# and pass it to fork_exec()

if env is not None:
env_list = []
for k, v in env.items():
k = os.fsencode(k)
if b'=' in k:
raise ValueError("illegal environment variable name")
env_list.append(k + b'=' + os.fsencode(v))
else:
env_list = None # Use execv instead of execve.
executable = os.fsencode(executable)
if os.path.dirname(executable):
executable_list = (executable,)
else:
# This matches the behavior of os._execvpe().
executable_list = tuple(
os.path.join(os.fsencode(dir), executable)
for dir in os.get_exec_path(env))
fds_to_keep = set(pass_fds)
fds_to_keep.add(errpipe_write)
self.pid = _posixsubprocess.fork_exec(
args, executable_list,
close_fds, tuple(sorted(map(int, fds_to_keep))),
cwd, env_list,
p2cread, p2cwrite, c2pread, c2pwrite,
errread, errwrite,
errpipe_read, errpipe_write,
restore_signals, start_new_session, preexec_fn)
self._child_created = True
finally:
# be sure the FD is closed no matter what
os.close(errpipe_write)

self._close_pipe_fds(p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite)

# Wait for exec to fail or succeed; possibly raising an
# exception (limited in size)
errpipe_data = bytearray()
while True:
part = os.read(errpipe_read, 50000)
errpipe_data += part
if not part or len(errpipe_data) > 50000:
break
finally:
# be sure the FD is closed no matter what
os.close(errpipe_read)

if errpipe_data:
try:
pid, sts = os.waitpid(self.pid, 0)
if pid == self.pid:
self._handle_exitstatus(sts)
else:
self.returncode = sys.maxsize
except ChildProcessError:
pass

try:
exception_name, hex_errno, err_msg = (
errpipe_data.split(b':', 2))
# The encoding here should match the encoding
# written in by the subprocess implementations
# like _posixsubprocess
err_msg = err_msg.decode()
except ValueError:
exception_name = b'SubprocessError'
hex_errno = b'0'
err_msg = 'Bad exception data from child: {!r}'.format(
bytes(errpipe_data))
child_exception_type = getattr(
builtins, exception_name.decode('ascii'),
SubprocessError)
if issubclass(child_exception_type, OSError) and hex_errno:
errno_num = int(hex_errno, 16)
child_exec_never_called = (err_msg == "noexec")
if child_exec_never_called:
err_msg = ""
# The error must be from chdir(cwd).
err_filename = cwd
else:
err_filename = orig_executable
if errno_num != 0:
err_msg = os.strerror(errno_num)
> raise child_exception_type(errno_num, err_msg, err_filename)
E FileNotFoundError: [Errno 2] No such file or directory: '/usr/sbin/fixup-memberof.pl'

/usr/lib64/python3.8/subprocess.py:1702: FileNotFoundError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:40 Ticket 49072 memberof fixup task with invalid basedn...
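Editorial note: like the ns-accountstatus.pl failure earlier, this is the removed Perl tooling rather than the memberOf behavior under test. lib389 can drive the same fixup through cn=tasks; a hedged port, assuming this lib389 version still ships the fixupMemberOf task helper with these arguments, would be:

from lib389._constants import TASK_WAIT

# Run the memberOf fixup as a server task instead of the Perl wrapper; the
# invalid TEST_BASEDN from the test is passed through unchanged so the
# error-log assertion still applies (sketch, names from the test above).
topo.standalone.tasks.fixupMemberOf(suffix=TEST_BASEDN,
                                    args={TASK_WAIT: True})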
Failed tickets/ticket49072_test.py::test_ticket49072_filter 10.04
topo = <lib389.topologies.TopologyMain object at 0x7f61c1fb1ac0>

def test_ticket49072_filter(topo):
"""memberOf fixup task does not validate args

:id: dde9e893-119d-42c8-a236-1190e56bfe98
:feature: memberOf
:setup: Standalone instance, with memberOf plugin
:steps: 1. Run fixup-memberOf.pl with invalid filter
2. Check if error log reports "Bad search filter"
:expectedresults: Fixup-memberOf.pl task should complete, but errors should be logged.
"""
log.info("Ticket 49072 memberof fixup task with invalid filter...")
log.info('Wait for 10 secs and check if task is completed')
time.sleep(10)
task_memof = 'cn=memberOf task,cn=tasks,cn=config'
if topo.standalone.search_s(task_memof, ldap.SCOPE_SUBTREE, 'cn=memberOf_fixup*', ['dn:']):
log.info('memberof task is still running, wait for +10 secs')
time.sleep(10)

if ds_is_older('1.3'):
inst_dir = topo.standalone.get_inst_dir()
memof_task = os.path.join(inst_dir, FIXUP_MEMOF)
try:
output = subprocess.check_output([memof_task, '-D', DN_DM, '-w', PASSWORD, '-b', SUFFIX, '-f', TEST_FILTER])
except subprocess.CalledProcessError as err:
output = err.output
else:
sbin_dir = topo.standalone.get_sbin_dir()
memof_task = os.path.join(sbin_dir, FIXUP_MEMOF)
try:
> output = subprocess.check_output(
[memof_task, '-D', DN_DM, '-w', PASSWORD, '-b', SUFFIX, '-Z', SERVERID_STANDALONE, '-f', TEST_FILTER])

/export/tests/tickets/ticket49072_test.py:96:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.8/subprocess.py:411: in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
/usr/lib64/python3.8/subprocess.py:489: in run
with Popen(*popenargs, **kwargs) as process:
/usr/lib64/python3.8/subprocess.py:854: in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <subprocess.Popen object at 0x7f61c1fb1850>
args = ['/usr/sbin/fixup-memberof.pl', '-D', 'cn=Directory Manager', '-w', 'password', '-b', ...]
executable = b'/usr/sbin/fixup-memberof.pl', preexec_fn = None, close_fds = True
pass_fds = (), cwd = None, env = None, startupinfo = None, creationflags = 0
shell = False, p2cread = -1, p2cwrite = -1, c2pread = 48, c2pwrite = 51
errread = -1, errwrite = -1, restore_signals = True, start_new_session = False

def _execute_child(self, args, executable, preexec_fn, close_fds,
pass_fds, cwd, env,
startupinfo, creationflags, shell,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite,
restore_signals, start_new_session):
"""Execute program (POSIX version)"""

if isinstance(args, (str, bytes)):
args = [args]
elif isinstance(args, os.PathLike):
if shell:
raise TypeError('path-like args is not allowed when '
'shell is true')
args = [args]
else:
args = list(args)

if shell:
# On Android the default shell is at '/system/bin/sh'.
unix_shell = ('/system/bin/sh' if
hasattr(sys, 'getandroidapilevel') else '/bin/sh')
args = [unix_shell, "-c"] + args
if executable:
args[0] = executable

if executable is None:
executable = args[0]

sys.audit("subprocess.Popen", executable, args, cwd, env)

if (_USE_POSIX_SPAWN
and os.path.dirname(executable)
and preexec_fn is None
and not close_fds
and not pass_fds
and cwd is None
and (p2cread == -1 or p2cread > 2)
and (c2pwrite == -1 or c2pwrite > 2)
and (errwrite == -1 or errwrite > 2)
and not start_new_session):
self._posix_spawn(args, executable, env, restore_signals,
p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite)
return

orig_executable = executable

# For transferring possible exec failure from child to parent.
# Data format: "exception name:hex errno:description"
# Pickle is not used; it is complex and involves memory allocation.
errpipe_read, errpipe_write = os.pipe()
# errpipe_write must not be in the standard io 0, 1, or 2 fd range.
low_fds_to_close = []
while errpipe_write < 3:
low_fds_to_close.append(errpipe_write)
errpipe_write = os.dup(errpipe_write)
for low_fd in low_fds_to_close:
os.close(low_fd)
try:
try:
# We must avoid complex work that could involve
# malloc or free in the child process to avoid
# potential deadlocks, thus we do all this here.
# and pass it to fork_exec()

if env is not None:
env_list = []
for k, v in env.items():
k = os.fsencode(k)
if b'=' in k:
raise ValueError("illegal environment variable name")
env_list.append(k + b'=' + os.fsencode(v))
else:
env_list = None # Use execv instead of execve.
executable = os.fsencode(executable)
if os.path.dirname(executable):
executable_list = (executable,)
else:
# This matches the behavior of os._execvpe().
executable_list = tuple(
os.path.join(os.fsencode(dir), executable)
for dir in os.get_exec_path(env))
fds_to_keep = set(pass_fds)
fds_to_keep.add(errpipe_write)
self.pid = _posixsubprocess.fork_exec(
args, executable_list,
close_fds, tuple(sorted(map(int, fds_to_keep))),
cwd, env_list,
p2cread, p2cwrite, c2pread, c2pwrite,
errread, errwrite,
errpipe_read, errpipe_write,
restore_signals, start_new_session, preexec_fn)
self._child_created = True
finally:
# be sure the FD is closed no matter what
os.close(errpipe_write)

self._close_pipe_fds(p2cread, p2cwrite,
c2pread, c2pwrite,
errread, errwrite)

# Wait for exec to fail or succeed; possibly raising an
# exception (limited in size)
errpipe_data = bytearray()
while True:
part = os.read(errpipe_read, 50000)
errpipe_data += part
if not part or len(errpipe_data) > 50000:
break
finally:
# be sure the FD is closed no matter what
os.close(errpipe_read)

if errpipe_data:
try:
pid, sts = os.waitpid(self.pid, 0)
if pid == self.pid:
self._handle_exitstatus(sts)
else:
self.returncode = sys.maxsize
except ChildProcessError:
pass

try:
exception_name, hex_errno, err_msg = (
errpipe_data.split(b':', 2))
# The encoding here should match the encoding
# written in by the subprocess implementations
# like _posixsubprocess
err_msg = err_msg.decode()
except ValueError:
exception_name = b'SubprocessError'
hex_errno = b'0'
err_msg = 'Bad exception data from child: {!r}'.format(
bytes(errpipe_data))
child_exception_type = getattr(
builtins, exception_name.decode('ascii'),
SubprocessError)
if issubclass(child_exception_type, OSError) and hex_errno:
errno_num = int(hex_errno, 16)
child_exec_never_called = (err_msg == "noexec")
if child_exec_never_called:
err_msg = ""
# The error must be from chdir(cwd).
err_filename = cwd
else:
err_filename = orig_executable
if errno_num != 0:
err_msg = os.strerror(errno_num)
> raise child_exception_type(errno_num, err_msg, err_filename)
E FileNotFoundError: [Errno 2] No such file or directory: '/usr/sbin/fixup-memberof.pl'

/usr/lib64/python3.8/subprocess.py:1702: FileNotFoundError
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:77 Ticket 49072 memberof fixup task with invalid filter...
INFO  tests.tickets.ticket49072_test:ticket49072_test.py:78 Wait for 10 secs and check if task is completed
Failed tickets/ticket49073_test.py::test_ticket49073 8.35
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c1e03430>

def test_ticket49073(topology_m2):
"""Write your replication test here.

To access each DirSrv instance use: topology_m2.ms["master1"], topology_m2.ms["master2"],
..., topology_m2.hub1, ..., topology_m2.consumer1,...

Also, if you need any testcase initialization,
please write an additional fixture for that (including a finalizer).
"""
topology_m2.ms["master1"].plugins.enable(name=PLUGIN_MEMBER_OF)
topology_m2.ms["master1"].restart(timeout=10)
topology_m2.ms["master2"].plugins.enable(name=PLUGIN_MEMBER_OF)
topology_m2.ms["master2"].restart(timeout=10)

# Configure fractional to prevent total init to send memberof
ents = topology_m2.ms["master1"].agreement.list(suffix=SUFFIX)
assert len(ents) == 1
log.info('update %s to add nsDS5ReplicatedAttributeListTotal' % ents[0].dn)
> topology_m2.ms["master1"].modify_s(ents[0].dn,
[(ldap.MOD_REPLACE,
'nsDS5ReplicatedAttributeListTotal',
'(objectclass=*) $ EXCLUDE '),
(ldap.MOD_REPLACE,
'nsDS5ReplicatedAttributeList',
'(objectclass=*) $ EXCLUDE memberOf')])

/export/tests/tickets/ticket49073_test.py:97:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640: in modify_s
return self.modify_ext_s(dn,modlist,None,None)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:612: in modify_ext_s
msgid = self.modify_ext(dn,modlist,serverctrls,clientctrls)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:609: in modify_ext
return self._ldap_call(self._l.modify_ext,dn,modlist,RequestControlTuples(serverctrls),RequestControlTuples(clientctrls))
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1e033d0>
func = <built-in method modify_ext of LDAP object at 0x7f61c1fa4570>
args = ('cn=002,cn=replica,cn=dc\\3Dexample\\2Cdc\\3Dcom,cn=mapping tree,cn=config', [(2, 'nsDS5ReplicatedAttributeListTotal', '(objectclass=*) $ EXCLUDE '), (2, 'nsDS5ReplicatedAttributeList', '(objectclass=*) $ EXCLUDE memberOf')], None, None)
kwargs = {}, diagnostic_message_success = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E TypeError: ('Tuple_to_LDAPMod(): expected a byte string in the list', '(')

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: TypeError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 16530668-4b00-4173-a9bc-d746ced92fb5 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect c0efb6ec-e970-442a-a48a-baa60107173f / got description=16530668-4b00-4173-a9bc-d746ced92fb5)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49073_test:ticket49073_test.py:96 update cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal
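Editorial note: the TypeError here is python-ldap rejecting str values in a modlist; only byte strings are accepted, which is exactly what "expected a byte string in the list" means. The fix is mechanical, using the same ensure_bytes helper the other tests already import (sketch of the corrected call):

import ldap

from lib389.utils import ensure_bytes

# Encode the fractional-replication attribute values to bytes before
# handing them to modify_s (fixes the Tuple_to_LDAPMod TypeError above).
topology_m2.ms["master1"].modify_s(
    ents[0].dn,
    [(ldap.MOD_REPLACE, 'nsDS5ReplicatedAttributeListTotal',
      ensure_bytes('(objectclass=*) $ EXCLUDE ')),
     (ldap.MOD_REPLACE, 'nsDS5ReplicatedAttributeList',
      ensure_bytes('(objectclass=*) $ EXCLUDE memberOf'))])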
Failed tickets/ticket49104_test.py::test_ticket49104_setup 0.01
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c2145250>

def test_ticket49104_setup(topology_st):
"""
Generate an LDIF file having 100k entries and import it.
"""
# Generate a test ldif (100k entries)
ldif_dir = topology_st.standalone.get_ldif_dir()
import_ldif = ldif_dir + '/49104.ldif'
try:
> topology_st.standalone.buildLDIF(100000, import_ldif)

/export/tests/tickets/ticket49104_test.py:30:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2145670>, num = 100000
ldif_file = '/var/lib/dirsrv/slapd-standalone1/ldif/49104.ldif'
suffix = 'dc=example,dc=com'

def buildLDIF(self, num, ldif_file, suffix='dc=example,dc=com'):
"""Generate a simple ldif file using the dbgen.pl script, and set the
ownership and permissions to match the user that the server runs as.

@param num - number of entries to create
@param ldif_file - ldif file name(including the path)
@param suffix - DN of the parent entry in the ldif file
@return - nothing
@raise - OSError
"""
> raise Exception("Perl tools disabled on this system. Try dbgen py module.")
E Exception: Perl tools disabled on this system. Try dbgen py module.

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:3236: Exception
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
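Editorial note: the exception text itself points at the replacement, the dbgen Python module. A hedged port, assuming this lib389 ships the dbgen_users helper with this signature, would replace the buildLDIF call:

from lib389.dbgen import dbgen_users

# Generate the 100k-entry LDIF with the Python dbgen module instead of the
# disabled Perl path (sketch; import_ldif comes from the test above).
dbgen_users(topology_st.standalone, 100000, import_ldif, 'dc=example,dc=com')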
Failed tickets/ticket49192_test.py::test_ticket49192 0.00
topo = <lib389.topologies.TopologyMain object at 0x7f61c1fecd30>

def test_ticket49192(topo):
"""Trigger deadlock when removing suffix
"""

#
# Create a second suffix/backend
#
log.info('Creating second backend...')
> topo.standalone.backends.create(None, properties={
BACKEND_NAME: "Second_Backend",
'suffix': "o=hang.com",
})

/export/tests/tickets/ticket49192_test.py:35:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:1169: in create
return co.create(rdn, properties, self._basedn)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.backend.Backend object at 0x7f61c25e7df0>, dn = None
properties = {'name': 'Second_Backend', 'suffix': 'o=hang.com'}
basedn = 'cn=ldbm database,cn=plugins,cn=config'

def create(self, dn=None, properties=None, basedn=DN_LDBM):
"""Add a new backend entry, create mapping tree,
and, if requested, sample entries

:param dn: DN of the new entry
:type dn: str
:param properties: Attributes and parameters for the new entry
:type properties: dict
:param basedn: Base DN of the new entry
:type basedn: str

:returns: DSLdapObject of the created entry
"""

sample_entries = False
parent_suffix = False

# normalize suffix (remove spaces between comps)
if dn is not None:
dn_comps = ldap.dn.explode_dn(dn.lower())
dn = ",".join(dn_comps)

if properties is not None:
> suffix_dn = properties['nsslapd-suffix'].lower()
E KeyError: 'nsslapd-suffix'

/usr/local/lib/python3.8/site-packages/lib389/backend.py:609: KeyError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49192_test:ticket49192_test.py:34 Creating second backend...
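Editorial note: the KeyError shows Backend.create() reading properties['nsslapd-suffix'], so the test's 'suffix' shorthand is simply the wrong key. Corrected call (sketch; constants as in the test):

# Use the attribute name create() actually looks up, not the shorthand.
topo.standalone.backends.create(None, properties={
    BACKEND_NAME: "Second_Backend",
    'nsslapd-suffix': "o=hang.com",
})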
Failed tickets/ticket49287_test.py::test_ticket49287 16.11
self = <lib389.mappingTree.MappingTreeLegacy object at 0x7f61c1d3bca0>
suffix = 'dc=test,dc=com', bename = 'test', parent = None

def create(self, suffix=None, bename=None, parent=None):
'''
Create a mapping tree entry (under "cn=mapping tree,cn=config"),
for the 'suffix' and that is stored in 'bename' backend.
'bename' backend must exist before creating the mapping tree entry.

If a 'parent' is provided that means that we are creating a
sub-suffix mapping tree.

@param suffix - suffix mapped by this mapping tree entry. It will
be the common name ('cn') of the entry
@param bename - backend common name (e.g. 'userRoot')
@param parent - if provided is a parent suffix of 'suffix'

@return DN of the mapping tree entry

@raise ldap.NO_SUCH_OBJECT - if the backend entry or parent mapping
tree does not exist
ValueError - if missing a parameter,

'''
# Check suffix is provided
if not suffix:
raise ValueError("suffix is mandatory")
else:
nsuffix = normalizeDN(suffix)

# Check backend name is provided
if not bename:
raise ValueError("backend name is mandatory")

# Check that if the parent suffix is provided then
# it exists a mapping tree for it
if parent:
nparent = normalizeDN(parent)
filt = suffixfilt(parent)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
pass
except NoSuchEntryError:
raise ValueError("parent suffix has no mapping tree")
else:
nparent = ""

# Check if suffix exists, return
filt = suffixfilt(suffix)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
return entry
except ldap.NO_SUCH_OBJECT:
entry = None

#
# Now start the real work
#

# fix me when we can actually use escaped DNs
dn = ','.join(('cn="%s"' % nsuffix, DN_MAPPING_TREE))
entry = Entry(dn)
entry.update({
'objectclass': ['top', 'extensibleObject', MT_OBJECTCLASS_VALUE],
'nsslapd-state': 'backend',
# the value in the dn has to be DN escaped
# internal code will add the quoted value - unquoted value is
# useful for searching.
MT_PROPNAME_TO_ATTRNAME[MT_SUFFIX]: nsuffix,
MT_PROPNAME_TO_ATTRNAME[MT_BACKEND]: bename
})

# possibly add the parent
if parent:
entry.setValues(MT_PROPNAME_TO_ATTRNAME[MT_PARENT_SUFFIX], nparent)

try:
self.log.debug("Creating entry: %s", entry.dn)
self.log.info("Entry %r", entry)
> self.conn.add_s(entry)

/usr/local/lib/python3.8/site-packages/lib389/mappingTree.py:155:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (dn: cn="dc=test,dc=com",cn=mapping tree,cn=config
cn: dc=test,dc=com
nsslapd-backend: test
nsslapd-state: backend
objectclass: top
objectclass: extensibleObject
objectclass: nsMappingTree

,)
kwargs = {}
c_stack = [FrameInfo(frame=<frame at 0x7f61c2f95c40, file '/usr/local/lib/python3.8/site-packages/lib389/__init__.py', line 176,...neno=187, function='_multicall', code_context=[' res = hook_impl.function(*args)\n'], index=0), ...]
frame = FrameInfo(frame=<frame at 0x5576b77742c0, file '/usr/local/lib/python3.8/site-packages/lib389/mappingTree.py', line 15.../lib389/mappingTree.py', lineno=155, function='create', code_context=[' self.conn.add_s(entry)\n'], index=0)
ent = dn: cn="dc=test,dc=com",cn=mapping tree,cn=config
cn: dc=test,dc=com
nsslapd-backend: test
nsslapd-state: backend
objectclass: top
objectclass: extensibleObject
objectclass: nsMappingTree



def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
> return f(ent.dn, ent.toTupleList(), *args[2:])

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2360b50>
dn = 'cn="dc=test,dc=com",cn=mapping tree,cn=config'
modlist = [('objectclass', [b'top', b'extensibleObject', b'nsMappingTree']), ('nsslapd-state', [b'backend']), ('cn', [b'dc=test,dc=com']), ('nsslapd-backend', [b'test'])]

def add_s(self,dn,modlist):
> return self.add_ext_s(dn,modlist,None,None)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('cn="dc=test,dc=com",cn=mapping tree,cn=config', [('objectclass', [b'top', b'extensibleObject', b'nsMappingTree']), ('nsslapd-state', [b'backend']), ('cn', [b'dc=test,dc=com']), ('nsslapd-backend', [b'test'])], None, None)
kwargs = {}, ent = 'cn="dc=test,dc=com",cn=mapping tree,cn=config'

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:178:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2360b50>
dn = 'cn="dc=test,dc=com",cn=mapping tree,cn=config'
modlist = [('objectclass', [b'top', b'extensibleObject', b'nsMappingTree']), ('nsslapd-state', [b'backend']), ('cn', [b'dc=test,dc=com']), ('nsslapd-backend', [b'test'])]
serverctrls = None, clientctrls = None

def add_ext_s(self,dn,modlist,serverctrls=None,clientctrls=None):
msgid = self.add_ext(dn,modlist,serverctrls,clientctrls)
> resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (4,), kwargs = {'all': 1, 'timeout': -1}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third args are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2360b50>, msgid = 4, all = 1
timeout = -1, resp_ctrl_classes = None

def result3(self,msgid=ldap.RES_ANY,all=1,timeout=None,resp_ctrl_classes=None):
> resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
msgid,all,timeout,
add_ctrls=0,add_intermediates=0,add_extop=0,
resp_ctrl_classes=resp_ctrl_classes
)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (4, 1, -1)
kwargs = {'add_ctrls': 0, 'add_extop': 0, 'add_intermediates': 0, 'resp_ctrl_classes': None}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third args are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2360b50>, msgid = 4, all = 1
timeout = -1, add_ctrls = 0, add_intermediates = 0, add_extop = 0
resp_ctrl_classes = None

def result4(self,msgid=ldap.RES_ANY,all=1,timeout=None,add_ctrls=0,add_intermediates=0,add_extop=0,resp_ctrl_classes=None):
if timeout is None:
timeout = self.timeout
> ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<built-in method result4 of LDAP object at 0x7f61c247da20>, 4, 1, -1, 0, 0, ...)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third args are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2360b50>
func = <built-in method result4 of LDAP object at 0x7f61c247da20>
args = (4, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
result = func(*args,**kwargs)
if __debug__ and self._trace_level>=2:
if func.__name__!="unbind_ext":
diagnostic_message_success = self._l.get_option(ldap.OPT_DIAGNOSTIC_MESSAGE)
finally:
self._ldap_object_lock.release()
except LDAPError as e:
exc_type,exc_value,exc_traceback = sys.exc_info()
try:
if 'info' not in e.args[0] and 'errno' in e.args[0]:
e.args[0]['info'] = strerror(e.args[0]['errno'])
except IndexError:
pass
if __debug__ and self._trace_level>=2:
self._trace_file.write('=> LDAPError - %s: %s\n' % (e.__class__.__name__,str(e)))
try:
> reraise(exc_type, exc_value, exc_traceback)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

exc_type = <class 'ldap.UNWILLING_TO_PERFORM'>
exc_value = UNWILLING_TO_PERFORM({'msgtype': 105, 'msgid': 4, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []})
exc_traceback = <traceback object at 0x7f61c2348480>

def reraise(exc_type, exc_value, exc_traceback):
"""Re-raise an exception given information from sys.exc_info()

Note that unlike six.reraise, this does not support replacing the
traceback. All arguments must come from a single sys.exc_info() call.
"""
# In Python 3, all exception info is contained in one object.
> raise exc_value

/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c2360b50>
func = <built-in method result4 of LDAP object at 0x7f61c247da20>
args = (4, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.UNWILLING_TO_PERFORM: {'msgtype': 105, 'msgid': 4, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: UNWILLING_TO_PERFORM

During handling of the above exception, another exception occurred:

topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2360c10>

def test_ticket49287(topology_m2):
"""
test case for memberof and conflict entries

"""

# return
M1 = topology_m2.ms["master1"]
M2 = topology_m2.ms["master2"]

config_memberof(M1)
config_memberof(M2)

_enable_spec_logging(M1)
_enable_spec_logging(M2)

_disable_nunc_stans(M1)
_disable_nunc_stans(M2)

M1.restart(timeout=10)
M2.restart(timeout=10)

testbase = 'dc=test,dc=com'
bename = 'test'
> create_backend(M1, M2, testbase, bename)

/export/tests/tickets/ticket49287_test.py:282:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket49287_test.py:204: in create_backend
s1.mappingtree.create(beSuffix, beName)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.mappingTree.MappingTreeLegacy object at 0x7f61c1d3bca0>
suffix = 'dc=test,dc=com', bename = 'test', parent = None

def create(self, suffix=None, bename=None, parent=None):
'''
Create a mapping tree entry (under "cn=mapping tree,cn=config"),
for 'suffix', stored in the 'bename' backend.
The 'bename' backend must exist before creating the mapping tree entry.

If a 'parent' is provided, it means we are creating a
sub-suffix mapping tree.

@param suffix - suffix mapped by this mapping tree entry. It will
be the common name ('cn') of the entry
@param bename - backend common name (e.g. 'userRoot')
@param parent - if provided, a parent suffix of 'suffix'

@return DN of the mapping tree entry

@raise ldap.NO_SUCH_OBJECT - if the backend entry or parent mapping
tree does not exist
ValueError - if a parameter is missing

'''
# Check suffix is provided
if not suffix:
raise ValueError("suffix is mandatory")
else:
nsuffix = normalizeDN(suffix)

# Check backend name is provided
if not bename:
raise ValueError("backend name is mandatory")

# Check that, if the parent suffix is provided,
# a mapping tree exists for it
if parent:
nparent = normalizeDN(parent)
filt = suffixfilt(parent)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
pass
except NoSuchEntryError:
raise ValueError("parent suffix has no mapping tree")
else:
nparent = ""

# Check if suffix exists, return
filt = suffixfilt(suffix)
try:
entry = self.conn.getEntry(DN_MAPPING_TREE, ldap.SCOPE_SUBTREE,
filt)
return entry
except ldap.NO_SUCH_OBJECT:
entry = None

#
# Now start the real work
#

# fix me when we can actually use escaped DNs
dn = ','.join(('cn="%s"' % nsuffix, DN_MAPPING_TREE))
entry = Entry(dn)
entry.update({
'objectclass': ['top', 'extensibleObject', MT_OBJECTCLASS_VALUE],
'nsslapd-state': 'backend',
# the value in the dn has to be DN escaped
# internal code will add the quoted value - unquoted value is
# useful for searching.
MT_PROPNAME_TO_ATTRNAME[MT_SUFFIX]: nsuffix,
MT_PROPNAME_TO_ATTRNAME[MT_BACKEND]: bename
})

# possibly add the parent
if parent:
entry.setValues(MT_PROPNAME_TO_ATTRNAME[MT_PARENT_SUFFIX], nparent)

try:
self.log.debug("Creating entry: %s", entry.dn)
self.log.info("Entry %r", entry)
self.conn.add_s(entry)
except ldap.LDAPError as e:
> raise ldap.LDAPError("Error adding suffix entry " + dn, e)
E ldap.LDAPError: ('Error adding suffix entry cn="dc=test,dc=com",cn=mapping tree,cn=config', UNWILLING_TO_PERFORM({'msgtype': 105, 'msgid': 4, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []}))

/usr/local/lib/python3.8/site-packages/lib389/mappingTree.py:157: LDAPError
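For reference, the entry that create() is attempting to add can be distilled from the frames above: a quoted-cn DN under cn=mapping tree,cn=config carrying the nsMappingTree objectclass, the unquoted suffix as cn, and the backend name. A minimal sketch of that construction (the 'nsslapd-parent-suffix' attribute used for the sub-suffix case is an assumption, not shown in the frames):

def mapping_tree_entry(nsuffix, bename, nparent=None):
    # The DN embeds the normalized suffix, quoted, as its cn
    dn = 'cn="%s",cn=mapping tree,cn=config' % nsuffix
    entry = {
        'objectclass': ['top', 'extensibleObject', 'nsMappingTree'],
        'nsslapd-state': 'backend',
        'cn': nsuffix,               # unquoted value, useful for searching
        'nsslapd-backend': bename,
    }
    if nparent:
        # assumed attribute name for a sub-suffix mapping tree
        entry['nsslapd-parent-suffix'] = nparent
    return dn, entry

# Reproduces the dn/modlist visible in the failing add_s() call above
print(mapping_tree_entry('dc=test,dc=com', 'test'))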
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e34eebf6-f09d-4c3c-b847-ce27989e9334 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 1addb0ca-5bfd-4516-b13b-c3ba7770e997 / got description=e34eebf6-f09d-4c3c-b847-ce27989e9334) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49287_test:ticket49287_test.py:77 update cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal INFO  tests.tickets.ticket49287_test:ticket49287_test.py:77 update cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=test,dc=com",cn=mapping tree,cn=config cn: dc=test,dc=com nsslapd-backend: test nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
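The inner() wrapper repeated through the frames above is lib389's deprecation shim: raw python-ldap methods are wrapped so that calling them directly emits a DeprecationWarning naming the caller, unless the call carries escapehatch='i am sure'. A self-contained sketch of the pattern (add_s below is a local stand-in, not the real python-ldap method, and the deprecated-name list is abbreviated):

import functools
import inspect
import warnings

DEPRECATED = {'add_s', 'modify_s', 'search_s'}  # abbreviated list

def deprecation_shim(name, f):
    @functools.wraps(f)
    def inner(*args, **kwargs):
        if name in DEPRECATED and kwargs.get('escapehatch') != 'i am sure':
            frame = inspect.stack()[1]  # report the caller, not the shim
            warnings.warn(DeprecationWarning(
                "Use of raw ldap function %s. Found in: %s:%s"
                % (name, frame.filename, frame.lineno)))
        kwargs.pop('escapehatch', None)  # never forward the sentinel
        return f(*args, **kwargs)
    return inner

def add_s(dn, modlist):  # stand-in for the wrapped python-ldap call
    return ('add', dn, modlist)

warnings.simplefilter('always')
shimmed = deprecation_shim('add_s', add_s)
shimmed('cn=demo,dc=example,dc=com', [])                           # warns
shimmed('cn=demo,dc=example,dc=com', [], escapehatch='i am sure')  # silent

This is why lib389's own internals (e.g. the _create() frame in later failures) pass escapehatch='i am sure', while test code that calls add_s()/modify_s() directly trips the warning path first.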
Failed tickets/ticket49303_test.py::test_ticket49303 17.15
topo = <lib389.topologies.TopologyMain object at 0x7f61c1d10460>

def test_ticket49303(topo):
"""
Test the nsTLSAllowClientRenegotiation setting.
"""
sslport = SECUREPORT_STANDALONE1

log.info("Ticket 49303 - Allow disabling of SSL renegotiation")

# No value set, defaults to reneg allowed
enable_ssl(topo.standalone, sslport)
> assert try_reneg(HOST_STANDALONE1, sslport) is True
E AssertionError: assert False is True
E + where False = try_reneg('LOCALHOST', 63601)

/export/tests/tickets/ticket49303_test.py:88: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49303_test:ticket49303_test.py:84 Ticket 49303 - Allow disabling of SSL renegotiation
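For context on the assertion: try_reneg probes whether the server accepts a client-initiated TLS renegotiation. Its actual source is not shown here; a common way to implement such a probe, sketched below as an assumption about its mechanics, is to drive openssl s_client and send its 'R' (renegotiate) command:

import subprocess

def try_reneg(host, port, timeout=30):
    """Return True if the server accepts a client-initiated
    TLS renegotiation (heuristic, based on openssl s_client)."""
    proc = subprocess.run(
        ['openssl', 's_client', '-connect', '%s:%d' % (host, port)],
        input=b'R\n', capture_output=True, timeout=timeout)
    # A server that refuses renegotiation typically surfaces a
    # handshake alert; treat its absence as success.
    return b'handshake failure' not in proc.stderr

print(try_reneg('localhost', 63601))  # port taken from the failure above

Note that TLS 1.3 removed renegotiation entirely, so a probe of this kind can only succeed over TLS 1.2 or older; if the connection negotiates TLS 1.3, the "defaults to reneg allowed" assertion above cannot hold.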
Failed tickets/ticket49412_test.py::test_ticket49412 0.00
topo = <lib389.topologies.TopologyMain object at 0x7f61c2357730>

def test_ticket49412(topo):
"""Specify a test case purpose or name here

:id: 4c7681ff-0511-4256-9589-bdcad84c13e6
:setup: Fill in set up configuration here
:steps:
1. Fill in test case steps here
2. And indent them like this (RST format requirement)
:expectedresults:
1. Fill in the result that is expected
2. For each test step
"""

M1 = topo.ms["master1"]

# wrong call with an invalid value (should be str(60)):
# it creates a replace with a NULL value
# and should fail with UNWILLING_TO_PERFORM
try:
> M1.modify_s(CHANGELOG, [(ldap.MOD_REPLACE, MAXAGE_ATTR, 60),
(ldap.MOD_REPLACE, TRIMINTERVAL, 10)])

/export/tests/tickets/ticket49412_test.py:44:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640: in modify_s
return self.modify_ext_s(dn,modlist,None,None)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c237be20>
func = <built-in method result4 of LDAP object at 0x7f61c2357a80>
args = (39, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.NO_SUCH_OBJECT: {'msgtype': 103, 'msgid': 39, 'result': 32, 'desc': 'No such object', 'ctrls': []}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: NO_SUCH_OBJECT
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 2df3687c-fe15-4455-b29a-bf8ba4690a1e / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
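The comment in the test states the intent: passing a bare int where python-ldap expects a string value produces a replace with a NULL value. On Python 3, python-ldap is stricter still: modlist values must be bytes (or lists of bytes). A small sketch of the broken call next to the intended one (the test's MAXAGE_ATTR/TRIMINTERVAL constants are assumed to map to the standard changelog attribute names used below; values are illustrative):

import ldap

MAXAGE_ATTR = 'nsslapd-changelogmaxage'
TRIMINTERVAL = 'nsslapd-changelogtrim-interval'

# Broken: raw ints are not valid modlist values
bad = [(ldap.MOD_REPLACE, MAXAGE_ATTR, 60),
       (ldap.MOD_REPLACE, TRIMINTERVAL, 10)]

# Intended: each value encoded as bytes
good = [(ldap.MOD_REPLACE, MAXAGE_ATTR, [str(60).encode()]),
        (ldap.MOD_REPLACE, TRIMINTERVAL, [str(10).encode()])]

Note that the traceback fails earlier than the test expects: the server answers NO_SUCH_OBJECT (result 32) rather than UNWILLING_TO_PERFORM, i.e. the CHANGELOG entry the modify targets does not exist on this build, so the invalid-value path is never reached.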
Failed tickets/ticket49463_test.py::test_ticket_49463 277.40
topo = <lib389.topologies.TopologyMain object at 0x7f61c25b1a90>

def test_ticket_49463(topo):
"""Specify a test case purpose or name here

:id: 2a68e8be-387d-4ac7-9452-1439e8483c13
:setup: Fill in set up configuration here
:steps:
1. Enable fractional replication
2. Enable replication logging
3. Check that replication is working fine
4. Generate skipped updates to create keep alive entries
5. Remove M3 from the topology
6. Issue cleanAllRuv FORCE that will run on M1 and then be propagated to M2 and M4
7. Check that the number of DELs of keep alive '3' is <= 1
8. Check M1 is the originator of cleanAllRuv and M2/M4 the propagated ones
9. Check replication M1,M2 and M4 can recover
10. Remove M4 from the topology
11. Issue cleanAllRuv without force while M2 is stopped (this hangs the cleanAllRuv)
12. Check that nsds5ReplicaCleanRUV is correctly encoded on M1 (last value: 1)
13. Check that nsds5ReplicaCleanRUV encoding survives M1 restart
14. Check that nsds5ReplicaCleanRUV encoding is valid on M2 (last value: 0)
15. Check that (for the M4 cleanAllRUV) M1 is the originator and M2 the propagated one
:expectedresults:
1. No report of failure when the RUV is updated
"""

# Step 1 - Configure fractional (skip telephonenumber) replication
M1 = topo.ms["master1"]
M2 = topo.ms["master2"]
M3 = topo.ms["master3"]
M4 = topo.ms["master4"]
repl = ReplicationManager(DEFAULT_SUFFIX)
fractional_server_to_replica(M1, M2)
fractional_server_to_replica(M1, M3)
fractional_server_to_replica(M1, M4)

fractional_server_to_replica(M2, M1)
fractional_server_to_replica(M2, M3)
fractional_server_to_replica(M2, M4)

fractional_server_to_replica(M3, M1)
fractional_server_to_replica(M3, M2)
fractional_server_to_replica(M3, M4)

fractional_server_to_replica(M4, M1)
fractional_server_to_replica(M4, M2)
fractional_server_to_replica(M4, M3)

# Step 2 - enable internal op logging and replication debug
for i in (M1, M2, M3, M4):
i.config.loglevel(vals=[256 + 4], service='access')
i.config.loglevel(vals=[LOG_REPLICA, LOG_DEFAULT], service='error')

# Step 3 - Check that replication is working fine
add_user(M1, 11, desc="add to M1")
add_user(M2, 21, desc="add to M2")
add_user(M3, 31, desc="add to M3")
add_user(M4, 41, desc="add to M4")

for i in (M1, M2, M3, M4):
for j in (M1, M2, M3, M4):
if i == j:
continue
repl.wait_for_replication(i, j)

# Step 4 - Generate skipped updates to create keep alive entries
for i in (M1, M2, M3, M4):
cn = '%s_%d' % (USER_CN, 11)
dn = 'uid=%s,ou=People,%s' % (cn, SUFFIX)
users = UserAccount(i, dn)
for j in range(110):
users.set('telephoneNumber', str(j))

# Step 5 - Remove M3 from the topology
M3.stop()
M1.agreement.delete(suffix=SUFFIX, consumer_host=M3.host, consumer_port=M3.port)
M2.agreement.delete(suffix=SUFFIX, consumer_host=M3.host, consumer_port=M3.port)
M4.agreement.delete(suffix=SUFFIX, consumer_host=M3.host, consumer_port=M3.port)

# Step 6 - Then issue cleanAllRuv FORCE that will run on M1, M2 and M4
M1.tasks.cleanAllRUV(suffix=SUFFIX, replicaid='3',
force=True, args={TASK_WAIT: True})

# Step 7 - Count the number of received DELs of keep alive 3
for i in (M1, M2, M4):
i.restart()
regex = re.compile(".*DEL dn=.cn=repl keep alive 3.*")
for i in (M1, M2, M4):
count = count_pattern_accesslog(i, regex)
log.debug("count on %s = %d" % (i, count))

# check that DEL is replicated once (if DEL is kept in the fix)
# check that DEL is not replicated (if DEL is finally no longer done in the fix)
assert ((count == 1) or (count == 0))

# Step 8 - Check that M1 is the originator of cleanAllRuv and M2, M4 the propagated ones
regex = re.compile(".*Original task deletes Keep alive entry .3.*")
assert pattern_errorlog(M1, regex)

regex = re.compile(".*Propagated task does not delete Keep alive entry .3.*")
assert pattern_errorlog(M2, regex)
assert pattern_errorlog(M4, regex)

# Step 9 - Check replication M1,M2 and M4 can recover
add_user(M1, 12, desc="add to M1")
add_user(M2, 22, desc="add to M2")
for i in (M1, M2, M4):
for j in (M1, M2, M4):
if i == j:
continue
repl.wait_for_replication(i, j)

# Step 10 - Remove M4 from the topology
M4.stop()
M1.agreement.delete(suffix=SUFFIX, consumer_host=M4.host, consumer_port=M4.port)
M2.agreement.delete(suffix=SUFFIX, consumer_host=M4.host, consumer_port=M4.port)

# Step 11 - Issue cleanAllRuv not force while M2 is stopped (that hangs the cleanAllRuv)
M2.stop()
M1.tasks.cleanAllRUV(suffix=SUFFIX, replicaid='4',
force=False, args={TASK_WAIT: False})

# Step 12
# CleanAllRuv is hanging waiting for M2 to restart
# Check that nsds5ReplicaCleanRUV is correctly encoded on M1
replicas = Replicas(M1)
replica = replicas.list()[0]
time.sleep(0.5)
replica.present('nsds5ReplicaCleanRUV')
log.info("M1: nsds5ReplicaCleanRUV=%s" % replica.get_attr_val_utf8('nsds5replicacleanruv'))
regex = re.compile("^4:.*:no:1$")
> assert regex.match(replica.get_attr_val_utf8('nsds5replicacleanruv'))
E AssertionError: assert None
E + where None = <built-in method match of re.Pattern object at 0x7f61c2cd2650>('4:no:1:dc=example,dc=com')
E + where <built-in method match of re.Pattern object at 0x7f61c2cd2650> = re.compile('^4:.*:no:1$').match
E + and '4:no:1:dc=example,dc=com' = <bound method DSLdapObject.get_attr_val_utf8 of <lib389.replica.Replica object at 0x7f61c25ab4c0>>('nsds5replicacleanruv')
E + where <bound method DSLdapObject.get_attr_val_utf8 of <lib389.replica.Replica object at 0x7f61c25ab4c0>> = <lib389.replica.Replica object at 0x7f61c25ab4c0>.get_attr_val_utf8

/export/tests/tickets/ticket49463_test.py:188: AssertionError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master4 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c4699a07-f205-4259-ad9a-a80918358b6c / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect df454208-a46e-4ed8-847e-438057b48158 / got description=c4699a07-f205-4259-ad9a-a80918358b6c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 11d6075a-dc06-4f32-a927-f4a882df0ced / got description=df454208-a46e-4ed8-847e-438057b48158) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 258edeae-e6e6-47fd-ae26-909f2b52da6d / got description=11d6075a-dc06-4f32-a927-f4a882df0ced) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 258edeae-e6e6-47fd-ae26-909f2b52da6d / got description=11d6075a-dc06-4f32-a927-f4a882df0ced) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 258edeae-e6e6-47fd-ae26-909f2b52da6d / got description=11d6075a-dc06-4f32-a927-f4a882df0ced) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:156 Joining master master4 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect d0a9924b-4331-4b59-84dd-9d9673867be2 / got description=258edeae-e6e6-47fd-ae26-909f2b52da6d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0ec0de1d-797d-4862-ae52-4c00fd1e1aa1 / got description=d0a9924b-4331-4b59-84dd-9d9673867be2) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master4 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a644ee8c-42a2-4dfd-90ce-c27331ea2ac6 / got description=0ec0de1d-797d-4862-ae52-4c00fd1e1aa1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 0cc64442-2f62-4e8b-a8a7-dcf8097580bb / got description=a644ee8c-42a2-4dfd-90ce-c27331ea2ac6) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 20340fab-e346-4f4f-8f49-68e5035978ed / got description=0cc64442-2f62-4e8b-a8a7-dcf8097580bb) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 20340fab-e346-4f4f-8f49-68e5035978ed / got description=0cc64442-2f62-4e8b-a8a7-dcf8097580bb) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ff9d328e-1863-443a-9460-119804cf4769 / got description=20340fab-e346-4f4f-8f49-68e5035978ed) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ff9d328e-1863-443a-9460-119804cf4769 / got description=20340fab-e346-4f4f-8f49-68e5035978ed) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ff9d328e-1863-443a-9460-119804cf4769 / got description=20340fab-e346-4f4f-8f49-68e5035978ed) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ff9d328e-1863-443a-9460-119804cf4769 / got description=20340fab-e346-4f4f-8f49-68e5035978ed) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ff9d328e-1863-443a-9460-119804cf4769 / got description=20340fab-e346-4f4f-8f49-68e5035978ed) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 6a26a30b-4a70-4e27-92fa-18c5f3f6d8d8 / got description=ff9d328e-1863-443a-9460-119804cf4769) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 7a182060-700e-41da-8fe4-123b056429c7 / got description=6a26a30b-4a70-4e27-92fa-18c5f3f6d8d8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: 
Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect c5198d23-5cb8-4986-9f78-01dacfde8458 / got description=7a182060-700e-41da-8fe4-123b056429c7) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4459fd3b-dc2f-44b2-af46-b7a36efa6719 / got description=c5198d23-5cb8-4986-9f78-01dacfde8458) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 91e6baa7-64c2-4abe-8259-5d7a19f6beee / got description=c5198d23-5cb8-4986-9f78-01dacfde8458) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 91e6baa7-64c2-4abe-8259-5d7a19f6beee / got description=4459fd3b-dc2f-44b2-af46-b7a36efa6719) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 97bd007d-1190-4a5c-b617-2499aa6f7157 / got description=91e6baa7-64c2-4abe-8259-5d7a19f6beee) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d912da07-83c4-48a9-a454-1c562ab33738 / got description=97bd007d-1190-4a5c-b617-2499aa6f7157) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d912da07-83c4-48a9-a454-1c562ab33738 / got description=97bd007d-1190-4a5c-b617-2499aa6f7157) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d912da07-83c4-48a9-a454-1c562ab33738 / got description=97bd007d-1190-4a5c-b617-2499aa6f7157) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 5984cc20-d507-4882-b0ad-345405d47688 / got description=d912da07-83c4-48a9-a454-1c562ab33738) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389:agreement.py:1095 Agreement (cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed INFO  lib389:agreement.py:1095 Agreement (cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed INFO  lib389:agreement.py:1095 Agreement (cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed INFO  lib389:tasks.py:1400 cleanAllRUV task (task-10312020_004732) completed successfully INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 27b3a275-12e7-4b6d-ac6e-ef0514b38e13 / got description=5984cc20-d507-4882-b0ad-345405d47688) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 128f3ea2-935c-4b39-a2f0-06400b861b3c / got description=27b3a275-12e7-4b6d-ac6e-ef0514b38e13) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect da43998b-5c61-4e16-81a2-6617693d4c98 / got description=128f3ea2-935c-4b39-a2f0-06400b861b3c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect da43998b-5c61-4e16-81a2-6617693d4c98 / got description=128f3ea2-935c-4b39-a2f0-06400b861b3c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect da43998b-5c61-4e16-81a2-6617693d4c98 / got description=128f3ea2-935c-4b39-a2f0-06400b861b3c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect da43998b-5c61-4e16-81a2-6617693d4c98 / got description=128f3ea2-935c-4b39-a2f0-06400b861b3c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 
24c6ad35-d4ed-41a2-8c29-eb3d36b1d6f7 / got description=da43998b-5c61-4e16-81a2-6617693d4c98) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 1e49be48-a775-4d2d-a0b7-ff027540922d / got description=da43998b-5c61-4e16-81a2-6617693d4c98) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 1e49be48-a775-4d2d-a0b7-ff027540922d / got description=da43998b-5c61-4e16-81a2-6617693d4c98) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 1e49be48-a775-4d2d-a0b7-ff027540922d / got description=da43998b-5c61-4e16-81a2-6617693d4c98) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e6481198-70de-47ab-a487-3163b555f679 / got description=1e49be48-a775-4d2d-a0b7-ff027540922d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389:agreement.py:1095 Agreement (cn=004,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed INFO  lib389:agreement.py:1095 Agreement (cn=004,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed INFO  lib389:tasks.py:1400 cleanAllRUV task (task-10312020_004825) completed successfully INFO  lib389.utils:ticket49463_test.py:186 M1: nsds5ReplicaCleanRUV=4:no:1:dc=example,dc=com
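The assertion output pins down the failure: the test's pattern demands the legacy nsds5ReplicaCleanRUV layout ending in ':no:1', while the server now records '4:no:1:dc=example,dc=com'. A minimal reproduction of the mismatch, with a pattern that accepts the observed layout (the field reading rid : force : last-value : suffix is inferred from the value, not confirmed):

import re

observed = '4:no:1:dc=example,dc=com'  # value captured in the failure above

legacy = re.compile(r'^4:.*:no:1$')    # pattern the test asserts against
assert legacy.match(observed) is None  # reproduces the AssertionError

current = re.compile(r'^4:no:1:.+$')   # accepts the observed encoding
assert current.match(observed) is not None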
Failed tickets/ticket50232_test.py::test_ticket50232_normal 0.72
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1ad7ac0>

def test_ticket50232_normal(topology_st):
"""
The fix for ticket 50232


The test sequence is:
- create suffix
- add suffix entry and some child entries
- "normally" done after populating suffix: enable replication
- get RUV and database generation
- export -r
- import
- get RUV and database generation
- assert database generation has not changed
"""

log.info('Testing Ticket 50232 - export creates a non-importable ldif file, normal creation order')

topology_st.standalone.backend.create(NORMAL_SUFFIX, {BACKEND_NAME: NORMAL_BACKEND_NAME})
topology_st.standalone.mappingtree.create(NORMAL_SUFFIX, bename=NORMAL_BACKEND_NAME, parent=None)

_populate_suffix(topology_st.standalone, NORMAL_BACKEND_NAME)

repl = ReplicationManager(DEFAULT_SUFFIX)
> repl._ensure_changelog(topology_st.standalone)

/export/tests/tickets/ticket50232_test.py:113:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/replica.py:1928: in _ensure_changelog
cl.create(properties={
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:971: in create
return self._create(rdn, properties, basedn, ensure=False)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:946: in _create
self._instance.add_ext_s(e, serverctrls=self._server_controls, clientctrls=self._client_controls, escapehatch='i am sure')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176: in inner
return f(ent.dn, ent.toTupleList(), *args[2:])
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425: in add_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1ad75b0>
func = <built-in method result4 of LDAP object at 0x7f61c18730f0>
args = (13, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.UNWILLING_TO_PERFORM: {'msgtype': 105, 'msgid': 13, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': [], 'info': 'Changelog configuration is part of the backend configuration'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: UNWILLING_TO_PERFORM
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=o=normal
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=normal,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1713 Found entry dn: cn=normal,cn=ldbm database,cn=plugins,cn=config cn: normal nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/normal nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=normal objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="o=normal",cn=mapping tree,cn=config cn: o=normal nsslapd-backend: normal nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1713 Found entry dn: cn=o\3Dnormal,cn=mapping tree,cn=config cn: o=normal nsslapd-backend: normal nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
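Analysis: the info message 'Changelog configuration is part of the backend configuration' means the server refused the legacy standalone changelog entry that _ensure_changelog tries to add; in 389-ds-base 2.0 the changelog is managed as part of each backend's configuration. A minimal compatibility sketch for the test (ensure_changelog_compat is a hypothetical helper name, not lib389 API):

    import ldap

    def ensure_changelog_compat(repl, instance):
        # On 389-ds-base >= 2.0 the legacy changelog add is refused with
        # UNWILLING_TO_PERFORM because the changelog already lives in the
        # backend configuration; that specific refusal is safe to ignore.
        try:
            repl._ensure_changelog(instance)
        except ldap.UNWILLING_TO_PERFORM as e:
            if 'backend configuration' not in str(e):
                raise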
Failed tickets/ticket50232_test.py::test_ticket50232_reverse 0.31
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1ad7ac0>

def test_ticket50232_reverse(topology_st):
"""
The fix for ticket 50232


The test sequence is:
- create suffix
- enable replication before suffix entry is added
- add suffix entry and some child entries
- get RUV and database generation
- export -r
- import
- get RUV and database generation
- assert database generation has not changed
"""

log.info('Testing Ticket 50232 - export creates a non-importable ldif file, reverse creation order')

#
# Setup Replication
#
log.info('Setting up replication...')
repl = ReplicationManager(DEFAULT_SUFFIX)
# repl.create_first_master(topology_st.standalone)
#
# enable dynamic plugins, memberof and retro cl plugin
#
topology_st.standalone.backend.create(REVERSE_SUFFIX, {BACKEND_NAME: REVERSE_BACKEND_NAME})
topology_st.standalone.mappingtree.create(REVERSE_SUFFIX, bename=REVERSE_BACKEND_NAME, parent=None)

> _enable_replica(topology_st.standalone, REVERSE_SUFFIX)

/export/tests/tickets/ticket50232_test.py:155:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/export/tests/tickets/ticket50232_test.py:35: in _enable_replica
repl._ensure_changelog(instance)
/usr/local/lib/python3.8/site-packages/lib389/replica.py:1928: in _ensure_changelog
cl.create(properties={
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:971: in create
return self._create(rdn, properties, basedn, ensure=False)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:946: in _create
self._instance.add_ext_s(e, serverctrls=self._server_controls, clientctrls=self._client_controls, escapehatch='i am sure')
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:176: in inner
return f(ent.dn, ent.toTupleList(), *args[2:])
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:425: in add_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1ad75b0>
func = <built-in method result4 of LDAP object at 0x7f61c18730f0>
args = (22, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.UNWILLING_TO_PERFORM: {'msgtype': 105, 'msgid': 22, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': [], 'info': 'Changelog configuration is part of the backend configuration'}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: UNWILLING_TO_PERFORM
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=o=reverse
INFO  lib389:backend.py:290 Creating a local backend
INFO  lib389:backend.py:76 List backend cn=reverse,cn=ldbm database,cn=plugins,cn=config
INFO  lib389:__init__.py:1713 Found entry dn: cn=reverse,cn=ldbm database,cn=plugins,cn=config cn: reverse nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/reverse nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=reverse objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance
INFO  lib389:mappingTree.py:154 Entry dn: cn="o=reverse",cn=mapping tree,cn=config cn: o=reverse nsslapd-backend: reverse nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree
INFO  lib389:__init__.py:1713 Found entry dn: cn=o\3Dreverse,cn=mapping tree,cn=config cn: o=reverse nsslapd-backend: reverse nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
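Analysis: same root cause as test_ticket50232_normal above; _enable_replica calls _ensure_changelog, which again attempts the legacy changelog entry and is refused by the server. The compatibility guard sketched after the previous failure would apply here unchanged.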
Failed tickets/ticket548_test.py::test_ticket548_test_with_no_policy 0.10
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1c70f10>

def test_ticket548_test_with_no_policy(topology_st):
"""
Check shadowAccount under no password policy
"""
log.info("Case 1. No password policy")

log.info("Bind as %s" % DN_DM)
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)

log.info('Add an entry' + USER1_DN)
try:
topology_st.standalone.add_s(
Entry((USER1_DN, {'objectclass': "top person organizationalPerson inetOrgPerson shadowAccount".split(),
'sn': '1',
'cn': 'user 1',
'uid': 'user1',
'givenname': 'user',
'mail': 'user1@' + DEFAULT_SUFFIX,
'userpassword': USER_PW})))
except ldap.LDAPError as e:
log.fatal('test_ticket548: Failed to add user' + USER1_DN + ': error ' + e.message['desc'])
assert False

edate = int(time.time() / (60 * 60 * 24))
log.info('Search entry %s' % USER1_DN)

log.info("Bind as %s" % USER1_DN)
topology_st.standalone.simple_bind_s(USER1_DN, USER_PW)
> entry = topology_st.standalone.getEntry(USER1_DN, ldap.SCOPE_BASE, "(objectclass=*)", ['shadowLastChange'])

/export/tests/tickets/ticket548_test.py:211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>
args = ('uid=user1,dc=example,dc=com', 0, '(objectclass=*)', ['shadowLastChange'])
kwargs = {}, res = 6, restype = 101, obj = []

def getEntry(self, *args, **kwargs):
"""Wrapper around SimpleLDAPObject.search. It is common to just get
one entry.
@param - entry dn
@param - search scope, in ldap.SCOPE_BASE (default),
ldap.SCOPE_SUB, ldap.SCOPE_ONE
@param filterstr - filterstr, default '(objectClass=*)' from
SimpleLDAPObject
@param attrlist - list of attributes to retrieve. eg ['cn', 'uid']
@param attrsonly - default None from SimpleLDAPObject
eg. getEntry(dn, scope, filter, attributes)

XXX This cannot return None
"""
self.log.debug("Retrieving entry with %r", [args])
if len(args) == 1 and 'scope' not in kwargs:
args += (ldap.SCOPE_BASE, )

res = self.search(*args, **kwargs)
restype, obj = self.result(res)
# TODO: why not test restype?
if not obj:
> raise NoSuchEntryError("no such entry for %r", [args])
E lib389.exceptions.NoSuchEntryError: ('no such entry for %r', [('uid=user1,dc=example,dc=com', 0, '(objectclass=*)', ['shadowLastChange'])])

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:1700: NoSuchEntryError
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
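Analysis: the search is bound as user1 and returns no entry at all (NoSuchEntryError), which usually points at the bound user lacking read access to its own entry under the instance's default ACIs rather than a missing entry; that reading is an assumption, not something this log proves. A diagnostic sketch using the same lib389 Domain API this report already shows elsewhere:

    from lib389.idm.domain import Domain

    # Hypothetical diagnostic: grant an explicit self-read ACI on the suffix
    # and repeat the search; if the entry becomes visible, the failure is a
    # default-permission change, not a missing entry.
    domain = Domain(topology_st.standalone, DEFAULT_SUFFIX)
    domain.add('aci', '(targetattr="*")(version 3.0; acl "self-read"; '
                      'allow (read, search, compare) userdn="ldap:///self";)')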
Failed tickets/ticket548_test.py::test_ticket548_test_global_policy 0.16
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1c70f10>

def test_ticket548_test_global_policy(topology_st):
"""
Check shadowAccount with global password policy
"""

log.info("Case 2. Check shadowAccount with global password policy")

log.info("Bind as %s" % DN_DM)
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)

set_global_pwpolicy(topology_st)

log.info('Add an entry' + USER2_DN)
try:
topology_st.standalone.add_s(
Entry((USER2_DN, {'objectclass': "top person organizationalPerson inetOrgPerson shadowAccount".split(),
'sn': '2',
'cn': 'user 2',
'uid': 'user2',
'givenname': 'user',
'mail': 'user2@' + DEFAULT_SUFFIX,
'userpassword': USER_PW})))
except ldap.LDAPError as e:
log.fatal('test_ticket548: Failed to add user' + USER2_DN + ': error ' + e.message['desc'])
assert False

edate = int(time.time() / (60 * 60 * 24))

log.info("Bind as %s" % USER1_DN)
topology_st.standalone.simple_bind_s(USER1_DN, USER_PW)

log.info('Search entry %s' % USER1_DN)
> entry = topology_st.standalone.getEntry(USER1_DN, ldap.SCOPE_BASE, "(objectclass=*)")

/export/tests/tickets/ticket548_test.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>
args = ('uid=user1,dc=example,dc=com', 0, '(objectclass=*)'), kwargs = {}
res = 15, restype = 101, obj = []

def getEntry(self, *args, **kwargs):
"""Wrapper around SimpleLDAPObject.search. It is common to just get
one entry.
@param - entry dn
@param - search scope, in ldap.SCOPE_BASE (default),
ldap.SCOPE_SUB, ldap.SCOPE_ONE
@param filterstr - filterstr, default '(objectClass=*)' from
SimpleLDAPObject
@param attrlist - list of attributes to retrieve. eg ['cn', 'uid']
@param attrsonly - default None from SimpleLDAPObject
eg. getEntry(dn, scope, filter, attributes)

XXX This cannot return None
"""
self.log.debug("Retrieving entry with %r", [args])
if len(args) == 1 and 'scope' not in kwargs:
args += (ldap.SCOPE_BASE, )

res = self.search(*args, **kwargs)
restype, obj = self.result(res)
# TODO: why not test restype?
if not obj:
> raise NoSuchEntryError("no such entry for %r", [args])
E lib389.exceptions.NoSuchEntryError: ('no such entry for %r', [('uid=user1,dc=example,dc=com', 0, '(objectclass=*)')])

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:1700: NoSuchEntryError
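Analysis: identical symptom to the previous case; the search bound as user1 again returns nothing, so the self-read ACI diagnostic sketched above applies here as well.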
Failed tickets/ticket548_test.py::test_ticket548_test_subtree_policy 2.21
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1c70f10>
user = 'uid=user3,dc=example,dc=com', passwd = 'password'
newpasswd = 'password0'

def update_passwd(topology_st, user, passwd, newpasswd):
log.info(" Bind as {%s,%s}" % (user, passwd))
topology_st.standalone.simple_bind_s(user, passwd)
try:
> topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', newpasswd.encode())])

/export/tests/tickets/ticket548_test.py:160:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=user3,dc=example,dc=com', [(2, 'userpassword', b'password0')])
kwargs = {}
c_stack = [FrameInfo(frame=<frame at 0x7f61c2d2d440, file '/usr/local/lib/python3.8/site-packages/lib389/__init__.py', line 180,...mbda>', code_context=[' self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(\n'], index=0), ...]
frame = FrameInfo(frame=<frame at 0x5576b8e65520, file '/export/tests/tickets/ticket548_test.py', line 164, code update_passwd...[" topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', newpasswd.encode())])\n"], index=0)

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>
dn = 'uid=user3,dc=example,dc=com'
modlist = [(2, 'userpassword', b'password0')]

def modify_s(self,dn,modlist):
> return self.modify_ext_s(dn,modlist,None,None)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:640:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('uid=user3,dc=example,dc=com', [(2, 'userpassword', b'password0')], None, None)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>
dn = 'uid=user3,dc=example,dc=com'
modlist = [(2, 'userpassword', b'password0')], serverctrls = None
clientctrls = None

def modify_ext_s(self,dn,modlist,serverctrls=None,clientctrls=None):
msgid = self.modify_ext(dn,modlist,serverctrls,clientctrls)
> resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (34,), kwargs = {'all': 1, 'timeout': -1}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>, msgid = 34, all = 1
timeout = -1, resp_ctrl_classes = None

def result3(self,msgid=ldap.RES_ANY,all=1,timeout=None,resp_ctrl_classes=None):
> resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
msgid,all,timeout,
add_ctrls=0,add_intermediates=0,add_extop=0,
resp_ctrl_classes=resp_ctrl_classes
)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (34, 1, -1)
kwargs = {'add_ctrls': 0, 'add_extop': 0, 'add_intermediates': 0, 'resp_ctrl_classes': None}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>, msgid = 34, all = 1
timeout = -1, add_ctrls = 0, add_intermediates = 0, add_extop = 0
resp_ctrl_classes = None

def result4(self,msgid=ldap.RES_ANY,all=1,timeout=None,add_ctrls=0,add_intermediates=0,add_extop=0,resp_ctrl_classes=None):
if timeout is None:
timeout = self.timeout
> ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = (<built-in method result4 of LDAP object at 0x7f61c1ac2d20>, 34, 1, -1, 0, 0, ...)
kwargs = {}

def inner(*args, **kwargs):
if name in [
'add_s',
'bind_s',
'delete_s',
'modify_s',
'modrdn_s',
'rename_s',
'sasl_interactive_bind_s',
'search_s',
'search_ext_s',
'simple_bind_s',
'unbind_s',
'getEntry',
] and not ('escapehatch' in kwargs and kwargs['escapehatch'] == 'i am sure'):
c_stack = inspect.stack()
frame = c_stack[1]

warnings.warn(DeprecationWarning("Use of raw ldap function %s. This will be removed in a future release. "
"Found in: %s:%s" % (name, frame.filename, frame.lineno)))
# Later, we will add a sleep here to make it even more painful.
# Finally, it will raise an exception.
elif 'escapehatch' in kwargs:
kwargs.pop('escapehatch')

if name == 'result':
objtype, data = f(*args, **kwargs)
# data is either a 2-tuple or a list of 2-tuples
# print data
if data:
if isinstance(data, tuple):
return objtype, Entry(data)
elif isinstance(data, list):
# AD sends back these search references
# if objtype == ldap.RES_SEARCH_RESULT and \
# isinstance(data[-1],tuple) and \
# not data[-1][0]:
# print "Received search reference: "
# pprint.pprint(data[-1][1])
# data.pop() # remove the last non-entry element

return objtype, [Entry(x) for x in data]
else:
raise TypeError("unknown data type %s returned by result" %
type(data))
else:
return objtype, data
elif name.startswith('add'):
# the first arg is self
# the second and third arg are the dn and the data to send
# We need to convert the Entry into the format used by
# python-ldap
ent = args[0]
if isinstance(ent, Entry):
return f(ent.dn, ent.toTupleList(), *args[2:])
else:
return f(*args, **kwargs)
else:
> return f(*args, **kwargs)

/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>
func = <built-in method result4 of LDAP object at 0x7f61c1ac2d20>
args = (34, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
result = func(*args,**kwargs)
if __debug__ and self._trace_level>=2:
if func.__name__!="unbind_ext":
diagnostic_message_success = self._l.get_option(ldap.OPT_DIAGNOSTIC_MESSAGE)
finally:
self._ldap_object_lock.release()
except LDAPError as e:
exc_type,exc_value,exc_traceback = sys.exc_info()
try:
if 'info' not in e.args[0] and 'errno' in e.args[0]:
e.args[0]['info'] = strerror(e.args[0]['errno'])
except IndexError:
pass
if __debug__ and self._trace_level>=2:
self._trace_file.write('=> LDAPError - %s: %s\n' % (e.__class__.__name__,str(e)))
try:
> reraise(exc_type, exc_value, exc_traceback)

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

exc_type = <class 'ldap.INSUFFICIENT_ACCESS'>
exc_value = INSUFFICIENT_ACCESS({'msgtype': 103, 'msgid': 34, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=user3,dc=example,dc=com'.\n"})
exc_traceback = <traceback object at 0x7f61c1b9f7c0>

def reraise(exc_type, exc_value, exc_traceback):
"""Re-raise an exception given information from sys.exc_info()

Note that unlike six.reraise, this does not support replacing the
traceback. All arguments must come from a single sys.exc_info() call.
"""
# In Python 3, all exception info is contained in one object.
> raise exc_value

/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61c1c70d30>
func = <built-in method result4 of LDAP object at 0x7f61c1ac2d20>
args = (34, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 34, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'userPassword' attribute of entry 'uid=user3,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS

During handling of the above exception, another exception occurred:

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1c70f10>

def test_ticket548_test_subtree_policy(topology_st):
"""
Check shadowAccount with subtree level password policy
"""

log.info("Case 3. Check shadowAccount with subtree level password policy")

log.info("Bind as %s" % DN_DM)
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)
# Check the global policy values

set_subtree_pwpolicy(topology_st, 2, 20, 6)

log.info('Add an entry' + USER3_DN)
try:
topology_st.standalone.add_s(
Entry((USER3_DN, {'objectclass': "top person organizationalPerson inetOrgPerson shadowAccount".split(),
'sn': '3',
'cn': 'user 3',
'uid': 'user3',
'givenname': 'user',
'mail': 'user3@' + DEFAULT_SUFFIX,
'userpassword': USER_PW})))
except ldap.LDAPError as e:
log.fatal('test_ticket548: Failed to add user' + USER3_DN + ': error ' + e.message['desc'])
assert False

log.info('Search entry %s' % USER3_DN)
entry0 = topology_st.standalone.getEntry(USER3_DN, ldap.SCOPE_BASE, "(objectclass=*)")

log.info('Expecting shadowLastChange 0 since passwordMustChange is on')
check_shadow_attr_value(entry0, 'shadowLastChange', 0, USER3_DN)

# passwordMinAge -- 2 day
check_shadow_attr_value(entry0, 'shadowMin', 2, USER3_DN)

# passwordMaxAge -- 20 days
check_shadow_attr_value(entry0, 'shadowMax', 20, USER3_DN)

# passwordWarning -- 6 days
check_shadow_attr_value(entry0, 'shadowWarning', 6, USER3_DN)

log.info("Bind as %s" % USER3_DN)
topology_st.standalone.simple_bind_s(USER3_DN, USER_PW)

log.info('Search entry %s' % USER3_DN)
try:
entry1 = topology_st.standalone.getEntry(USER3_DN, ldap.SCOPE_BASE, "(objectclass=*)")
except ldap.UNWILLING_TO_PERFORM:
log.info('test_ticket548: Search by ' + USER3_DN + ' failed with UNWILLING_TO_PERFORM as expected')
except ldap.LDAPError as e:
log.fatal('test_ticket548: Failed to search user ' + USER3_DN + ' by self: error ' + e.message['desc'])
assert False

log.info("Bind as %s and updating the password with a new one" % USER3_DN)
topology_st.standalone.simple_bind_s(USER3_DN, USER_PW)

# Bind as DM again, change policy
log.info("Bind as %s" % DN_DM)
topology_st.standalone.simple_bind_s(DN_DM, PASSWORD)

set_subtree_pwpolicy(topology_st, 4, 40, 12)

newpasswd = USER_PW + '0'
> update_passwd(topology_st, USER3_DN, USER_PW, newpasswd)

/export/tests/tickets/ticket548_test.py:372:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology_st = <lib389.topologies.TopologyMain object at 0x7f61c1c70f10>
user = 'uid=user3,dc=example,dc=com', passwd = 'password'
newpasswd = 'password0'

def update_passwd(topology_st, user, passwd, newpasswd):
log.info(" Bind as {%s,%s}" % (user, passwd))
topology_st.standalone.simple_bind_s(user, passwd)
try:
topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', newpasswd.encode())])
except ldap.LDAPError as e:
> log.fatal('test_ticket548: Failed to update the password ' + cpw + ' of user ' + user + ': error ' + e.message[
'desc'])
E NameError: name 'cpw' is not defined

/export/tests/tickets/ticket548_test.py:162: NameError
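Analysis: two defects stack here. The password modify is refused with INSUFFICIENT_ACCESS, and the except handler then masks that error with a NameError because it references the undefined name cpw (it also uses e.message, which python-ldap 3 exceptions do not provide). A sketch of the helper with the handler repaired, so the real LDAP error is reported instead of the NameError (log and ldap as imported by the test module):

    def update_passwd(topology_st, user, passwd, newpasswd):
        log.info(" Bind as {%s,%s}" % (user, passwd))
        topology_st.standalone.simple_bind_s(user, passwd)
        try:
            topology_st.standalone.modify_s(user, [(ldap.MOD_REPLACE, 'userpassword', newpasswd.encode())])
        except ldap.LDAPError as e:
            # 'cpw' was never defined; log the affected user instead, and read
            # the error description from e.args[0] (the python-ldap 3 API).
            log.fatal('test_ticket548: Failed to update the password of user %s: error %s'
                      % (user, e.args[0]['desc']))
            assert False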
XFailed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_targattrfilters_18] 0.01
topo = <lib389.topologies.TopologyMain object at 0x7f61d4f190d0>
real_value = '(target = ldap:///cn=Jeff Vedder,ou=Product Development,dc=example,dc=com)(targetattr="*")(version 3.0; acl "Name of ...3123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123123";)'

@pytest.mark.xfail(reason='https://bugzilla.redhat.com/show_bug.cgi?id=1691473')
@pytest.mark.parametrize("real_value", [a[1] for a in FAILED],
ids=[a[0] for a in FAILED])
def test_aci_invalid_syntax_fail(topo, real_value):
"""

Try to set wrong ACI syntax.

:id: 83c40784-fff5-49c8-9535-7064c9c19e7e
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Create ACI
2. Try to setup the ACI with Instance
:expectedresults:
1. It should pass
2. It should not pass
"""
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
with pytest.raises(ldap.INVALID_SYNTAX):
> domain.add("aci", real_value)
E Failed: DID NOT RAISE <class 'ldap.INVALID_SYNTAX'>

suites/acl/syntax_test.py:213: Failed
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
XFailed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_targattrfilters_20] 0.02
topo = <lib389.topologies.TopologyMain object at 0x7f61d4f190d0>
real_value = '(target = ldap:///cn=Jeff Vedder,ou=Product Development,dc=example,dc=com)(targetattr="*")(version 3.0; acl "Name of the ACI"; deny(write)userdns="ldap:///anyone";)'

@pytest.mark.xfail(reason='https://bugzilla.redhat.com/show_bug.cgi?id=1691473')
@pytest.mark.parametrize("real_value", [a[1] for a in FAILED],
ids=[a[0] for a in FAILED])
def test_aci_invalid_syntax_fail(topo, real_value):
"""

Try to set wrong ACI syntax.

:id: 83c40784-fff5-49c8-9535-7064c9c19e7e
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Create ACI
2. Try to setup the ACI with Instance
:expectedresults:
1. It should pass
2. It should not pass
"""
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
with pytest.raises(ldap.INVALID_SYNTAX):
> domain.add("aci", real_value)
E Failed: DID NOT RAISE <class 'ldap.INVALID_SYNTAX'>

suites/acl/syntax_test.py:213: Failed
XFailed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_bind_rule_set_with_more_than_three] 0.01
topo = <lib389.topologies.TopologyMain object at 0x7f61d4f190d0>
real_value = '(target = ldap:///dc=example,dc=com)(targetattr="*")(version 3.0; acl "Name of the ACI"; deny absolute (all)userdn="ldap:////////anyone";)'

@pytest.mark.xfail(reason='https://bugzilla.redhat.com/show_bug.cgi?id=1691473')
@pytest.mark.parametrize("real_value", [a[1] for a in FAILED],
ids=[a[0] for a in FAILED])
def test_aci_invalid_syntax_fail(topo, real_value):
"""

Try to set wrong ACI syntax.

:id: 83c40784-fff5-49c8-9535-7064c9c19e7e
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Create ACI
2. Try to setup the ACI with Instance
:expectedresults:
1. It should pass
2. It should not pass
"""
domain = Domain(topo.standalone, DEFAULT_SUFFIX)
with pytest.raises(ldap.INVALID_SYNTAX):
> domain.add("aci", real_value)
E Failed: DID NOT RAISE <class 'ldap.INVALID_SYNTAX'>

suites/acl/syntax_test.py:213: Failed
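Analysis: the three xfails above are one parametrized test hitting the same defect tracked by bz1691473: the server accepts ACI values it should reject with INVALID_SYNTAX. Once the server-side fix lands, these cases should start raising as expected and show up as unexpected passes (XPASS) until the xfail markers are removed.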
XFailed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_3, CHILDREN)] 0.07
topo = <lib389.topologies.TopologyMain object at 0x7f61d6b6a5e0>
_add_user = None, user = 'uid=Grandparent,ou=Inheritance,dc=example,dc=com'
entry = 'ou=CHILDREN,ou=PARENTS,ou=GRANDPARENTS,ou=ANCESTORS,ou=Inheritance,dc=example,dc=com'

@pytest.mark.parametrize("user,entry", [
(CAN, ROLEDNACCESS),
(CAN, USERDNACCESS),
(CAN, GROUPDNACCESS),
(CAN, LDAPURLACCESS),
(CAN, ATTRNAMEACCESS),
(LEVEL_0, OU_2),
(LEVEL_1, ANCESTORS),
(LEVEL_2, GRANDPARENTS),
(LEVEL_4, OU_2),
(LEVEL_4, ANCESTORS),
(LEVEL_4, GRANDPARENTS),
(LEVEL_4, PARENTS),
(LEVEL_4, CHILDREN),
pytest.param(LEVEL_3, CHILDREN, marks=pytest.mark.xfail(reason="May be some bug")),
], ids=[
"(CAN,ROLEDNACCESS)",
"(CAN,USERDNACCESS)",
"(CAN,GROUPDNACCESS)",
"(CAN,LDAPURLACCESS)",
"(CAN,ATTRNAMEACCESS)",
"(LEVEL_0, OU_2)",
"(LEVEL_1,ANCESTORS)",
"(LEVEL_2,GRANDPARENTS)",
"(LEVEL_4,OU_2)",
"(LEVEL_4, ANCESTORS)",
"(LEVEL_4,GRANDPARENTS)",
"(LEVEL_4,PARENTS)",
"(LEVEL_4,CHILDREN)",
"(LEVEL_3, CHILDREN)"
])
def test_mod_see_also_positive(topo, _add_user, user, entry):
"""
Try to set seeAlso on an entry while bound as a specific user; it should
succeed as permitted by the ACI.

:id: 65745426-7a01-11e8-8ac2-8c16451d917b
:parametrized: yes
:setup: Standalone Instance
:steps:
1. Add test entry
2. Add ACI
3. User should follow ACI role
:expectedresults:
1. Entry should be added
2. Operation should succeed
3. Operation should succeed
"""
conn = UserAccount(topo.standalone, user).bind(PW_DM)
> UserAccount(conn, entry).replace('seeAlso', 'cn=1')

suites/acl/userattr_test.py:216:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:280: in replace
self.set(key, value, action=ldap.MOD_REPLACE)
/usr/local/lib/python3.8/site-packages/lib389/_mapped_object.py:446: in set
return self._instance.modify_ext_s(self._dn, [(action, key, value)],
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:613: in modify_ext_s
resp_type, resp_data, resp_msgid, resp_ctrls = self.result3(msgid,all=1,timeout=self.timeout)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:764: in result3
resp_type, resp_data, resp_msgid, decoded_resp_ctrls, retoid, retval = self.result4(
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:774: in result4
ldap_result = self._ldap_call(self._l.result4,msgid,all,timeout,add_ctrls,add_intermediates,add_extop)
/usr/local/lib/python3.8/site-packages/lib389/__init__.py:180: in inner
return f(*args, **kwargs)
/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:340: in _ldap_call
reraise(exc_type, exc_value, exc_traceback)
/usr/local/lib64/python3.8/site-packages/ldap/compat.py:46: in reraise
raise exc_value
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <lib389.DirSrv object at 0x7f61d3dc4d60>
func = <built-in method result4 of LDAP object at 0x7f61d5223450>
args = (5, 1, -1, 0, 0, 0), kwargs = {}, diagnostic_message_success = None
exc_type = None, exc_value = None, exc_traceback = None

def _ldap_call(self,func,*args,**kwargs):
"""
Wrapper method mainly for serializing calls into OpenLDAP libs
and trace logs
"""
self._ldap_object_lock.acquire()
if __debug__:
if self._trace_level>=1:
self._trace_file.write('*** %s %s - %s\n%s\n' % (
repr(self),
self._uri,
'.'.join((self.__class__.__name__,func.__name__)),
pprint.pformat((args,kwargs))
))
if self._trace_level>=9:
traceback.print_stack(limit=self._trace_stack_limit,file=self._trace_file)
diagnostic_message_success = None
try:
try:
> result = func(*args,**kwargs)
E ldap.INSUFFICIENT_ACCESS: {'msgtype': 103, 'msgid': 5, 'result': 50, 'desc': 'Insufficient access', 'ctrls': [], 'info': "Insufficient 'write' privilege to the 'seeAlso' attribute of entry 'ou=children,ou=parents,ou=grandparents,ou=ancestors,ou=inheritance,dc=example,dc=com'.\n"}

/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py:324: INSUFFICIENT_ACCESS
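Analysis: only the (LEVEL_3, CHILDREN) combination is expected to fail; the bound user is denied the seeAlso write even though the inheritance level should permit it. The xfail marker itself records the uncertainty ('May be some bug'), so this stays an expected failure until the ACI inheritance behaviour is confirmed.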
XFailed suites/config/config_test.py::test_defaultnamingcontext_1 0.31
topo = <lib389.topologies.TopologyMain object at 0x7f61d38f0fa0>

@pytest.mark.xfail(reason="This may fail due to bug 1610234")
def test_defaultnamingcontext_1(topo):
"""This test case should be part of function test_defaultnamingcontext
Please move it back after we have a fix for bug 1610234
"""
log.info("Remove the original suffix which is currently nsslapd-defaultnamingcontext"
"and check nsslapd-defaultnamingcontext become empty.")

""" Please remove these declarations after moving the test
to function test_defaultnamingcontext
"""
backends = Backends(topo.standalone)
test_db2 = 'test2_db'
test_suffix2 = 'dc=test2,dc=com'
b2 = backends.create(properties={'cn': test_db2,
'nsslapd-suffix': test_suffix2})
b2.delete()
> assert topo.standalone.config.get_attr_val_utf8('nsslapd-defaultnamingcontext') == ' '
E AssertionError: assert 'dc=example,dc=com' == ' '
E Strings contain only whitespace, escaping them using repr()
E - ' '
E + 'dc=example,dc=com'

suites/config/config_test.py:280: AssertionError
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:268 Remove the original suffix which is currently nsslapd-defaultnamingcontext and check nsslapd-defaultnamingcontext becomes empty.
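Analysis: after the test2 backend is created and deleted, the check expects nsslapd-defaultnamingcontext to be cleared to a single space, but the attribute still reads 'dc=example,dc=com'. That is the behaviour tracked by bug 1610234; as the docstring notes, this case should be folded back into test_defaultnamingcontext once the bug is fixed.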
XFailed suites/export/export_test.py::test_dbtasks_db2ldif_with_non_accessible_ldif_file_path_output 3.62
topo = <lib389.topologies.TopologyMain object at 0x7f61d159fa30>

@pytest.mark.bz1860291
@pytest.mark.xfail(reason="bug 1860291")
@pytest.mark.skipif(ds_is_older("1.3.10", "1.4.2"), reason="Not implemented")
def test_dbtasks_db2ldif_with_non_accessible_ldif_file_path_output(topo):
"""Export with db2ldif, giving a ldif file path which can't be accessed by the user (dirsrv by default)

:id: fcc63387-e650-40a7-b643-baa68c190037
:setup: Standalone Instance - entries imported in the db
:steps:
1. Stop the server
2. Launch db2ldif with a non accessible ldif file path
3. check the error reported in the command output
:expected results:
1. Operation successful
2. Operation properly fails
3. A clear error message is reported as output of the cli
"""
export_ldif = '/tmp/nonexistent/export.ldif'

log.info("Stopping the instance...")
topo.standalone.stop()

log.info("Performing an offline export to a non accessible ldif file path - should fail and output a clear error message")
expected_output="No such file or directory"
> run_db2ldif_and_clear_logs(topo, topo.standalone, DEFAULT_BENAME, export_ldif, expected_output)

suites/export/export_test.py:150:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology = <lib389.topologies.TopologyMain object at 0x7f61d159fa30>
instance = <lib389.DirSrv object at 0x7f61d15b03a0>, backend = 'userRoot'
ldif = '/tmp/nonexistent/export.ldif', output_msg = 'No such file or directory'
encrypt = False, repl = False

def run_db2ldif_and_clear_logs(topology, instance, backend, ldif, output_msg, encrypt=False, repl=False):
args = FakeArgs()
args.instance = instance.serverid
args.backend = backend
args.encrypted = encrypt
args.replication = repl
args.ldif = ldif

dbtasks_db2ldif(instance, topology.logcap.log, args)

log.info('checking output msg')
if not topology.logcap.contains(output_msg):
log.error('The output message is not the expected one')
> assert False
E assert False

suites/export/export_test.py:36: AssertionError
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/nonexistent/export.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:export_test.py:145 Stopping the instance...
INFO  lib389.utils:export_test.py:148 Performing an offline export to a non accessible ldif file path - should fail and output a clear error message
CRITICAL LogCapture:dbtasks.py:40 db2ldif failed
INFO  lib389.utils:export_test.py:33 checking output msg
ERROR  lib389.utils:export_test.py:35 The output message is not the expected one
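Analysis: db2ldif does fail offline (the captured CRITICAL 'db2ldif failed' confirms it), but its output never contains the expected 'No such file or directory' text, so the check for a clear error message fails. This is the unclear-error-reporting gap tracked by bug 1860291 that the xfail marker anticipates.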
XFailed suites/healthcheck/healthcheck_test.py::test_healthcheck_unable_to_query_backend 1.81
topology_st = <lib389.topologies.TopologyMain object at 0x7f61c3f5ce50>

@pytest.mark.ds50873
@pytest.mark.bz1796343
@pytest.mark.skipif(ds_is_older("1.4.1"), reason="Not implemented")
@pytest.mark.xfail(reason="Will fail because of bz1837315. Set proper version after bug is fixed")
def test_healthcheck_unable_to_query_backend(topology_st):
"""Check if HealthCheck returns DSBLE0002 code

:id: 716b1ff1-94bd-4780-98b8-96ff8ef21e30
:setup: Standalone instance
:steps:
1. Create DS instance
2. Create a new root suffix and database
3. Disable new suffix
4. Use HealthCheck without --json option
5. Use HealthCheck with --json option
:expectedresults:
1. Success
2. Success
3. Success
4. HealthCheck should return code DSBLE0002
5. HealthCheck should return code DSBLE0002
"""

RET_CODE = 'DSBLE0002'
NEW_SUFFIX = 'dc=test,dc=com'
NEW_BACKEND = 'userData'

standalone = topology_st.standalone

log.info('Create new suffix')
backends = Backends(standalone)
backends.create(properties={
'cn': NEW_BACKEND,
'nsslapd-suffix': NEW_SUFFIX,
})

log.info('Disable the newly created suffix')
mts = MappingTrees(standalone)
mt_new = mts.get(NEW_SUFFIX)
mt_new.replace('nsslapd-state', 'disabled')

run_healthcheck_and_flush_log(topology_st, standalone, RET_CODE, json=False)
run_healthcheck_and_flush_log(topology_st, standalone, RET_CODE, json=True)

log.info('Enable the suffix again and check if nothing is broken')
mt_new.replace('nsslapd-state', 'backend')
> run_healthcheck_and_flush_log(topology_st, standalone, RET_CODE, json=False)

suites/healthcheck/healthcheck_test.py:453:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topology = <lib389.topologies.TopologyMain object at 0x7f61c3f5ce50>
instance = <lib389.DirSrv object at 0x7f61d0a79850>, searched_code = 'DSBLE0002'
json = False, searched_code2 = None, list_checks = False, list_errors = False
check = None, searched_list = None

def run_healthcheck_and_flush_log(topology, instance, searched_code=None, json=False, searched_code2=None,
list_checks=False, list_errors=False, check=None, searched_list=None):
args = FakeArgs()
args.instance = instance.serverid
args.verbose = instance.verbose
args.list_errors = list_errors
args.list_checks = list_checks
args.check = check
args.dry_run = False
args.json = json

log.info('Use healthcheck with --json == {} option'.format(json))
health_check_run(instance, topology.logcap.log, args)

if searched_list is not None:
for item in searched_list:
assert topology.logcap.contains(item)
log.info('Healthcheck returned searched item: %s' % item)
else:
> assert topology.logcap.contains(searched_code)
E AssertionError: assert False
E + where False = <bound method LogCapture.contains of <LogCapture (NOTSET)>>('DSBLE0002')
E + where <bound method LogCapture.contains of <LogCapture (NOTSET)>> = <LogCapture (NOTSET)>.contains
E + where <LogCapture (NOTSET)> = <lib389.topologies.TopologyMain object at 0x7f61c3f5ce50>.logcap

suites/healthcheck/healthcheck_test.py:49: AssertionError
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userdata:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userdata:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userdata:search ... INFO  LogCapture:health.py:99 Checking backends:userdata:virt_attrs ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 4 Issues found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSBLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: backends:userdata:mappingtree INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- userdata INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 This backend may be missing the correct mapping tree references. Mapping Trees allow the directory server to determine which backend an operation is routed to in the abscence of other information. This is extremely important for correct functioning of LDAP ADD for example. A correct Mapping tree for this backend must contain the suffix name, the database name and be a backend type. IE: cn=o3Dexample,cn=mapping tree,cn=config cn: o=example nsslapd-backend: userRoot nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Either you need to create the mapping tree, or you need to repair the related mapping tree. You will need to do this by hand by editing cn=config, or stopping the instance and editing dse.ldif. INFO  LogCapture:health.py:45 [2] DS Lint Error: DSBLE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: backends:userdata:search INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- dc=test,dc=com INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 Unable to query the backend. 
LDAP error ({'msgtype': 101, 'msgid': 26, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Warning: Operation attempted on a disabled node : dc=example,dc=com\n'}) INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Check the server's error and access logs for more information. INFO  LogCapture:health.py:45 [3] DS Lint Error: DSBLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: backends:userdata:mappingtree INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- userdata INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 This backend may be missing the correct mapping tree references. Mapping Trees allow the directory server to determine which backend an operation is routed to in the abscence of other information. This is extremely important for correct functioning of LDAP ADD for example. A correct Mapping tree for this backend must contain the suffix name, the database name and be a backend type. IE: cn=o3Dexample,cn=mapping tree,cn=config cn: o=example nsslapd-backend: userRoot nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Either you need to create the mapping tree, or you need to repair the related mapping tree. You will need to do this by hand by editing cn=config, or stopping the instance and editing dse.ldif. INFO  LogCapture:health.py:45 [4] DS Lint Error: DSBLE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: backends:userdata:search INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- dc=test,dc=com INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 Unable to query the backend. LDAP error ({'msgtype': 101, 'msgid': 26, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Warning: Operation attempted on a disabled node : dc=example,dc=com\n'}) INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Check the server's error and access logs for more information. INFO  LogCapture:health.py:124 ===== End Of Report (4 Issues found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSBLE0001", "severity": "MEDIUM", "description": "Possibly incorrect mapping tree.", "items": [ "userdata" ], "detail": "This backend may be missing the correct mapping tree references. Mapping Trees allow\nthe directory server to determine which backend an operation is routed to in the\nabscence of other information. This is extremely important for correct functioning\nof LDAP ADD for example.\n\nA correct Mapping tree for this backend must contain the suffix name, the database name\nand be a backend type. IE:\n\ncn=o3Dexample,cn=mapping tree,cn=config\ncn: o=example\nnsslapd-backend: userRoot\nnsslapd-state: backend\nobjectClass: top\nobjectClass: extensibleObject\nobjectClass: nsMappingTree\n\n", "fix": "Either you need to create the mapping tree, or you need to repair the related\nmapping tree. 
You will need to do this by hand by editing cn=config, or stopping\nthe instance and editing dse.ldif.\n", "check": "backends:userdata:mappingtree" }, { "dsle": "DSBLE0002", "severity": "HIGH", "description": "Unable to query backend.", "items": [ "dc=test,dc=com" ], "detail": "Unable to query the backend. LDAP error ({'msgtype': 101, 'msgid': 26, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Warning: Operation attempted on a disabled node : dc=example,dc=com\\n'})", "fix": "Check the server's error and access logs for more information.", "check": "backends:userdata:search" }, { "dsle": "DSBLE0001", "severity": "MEDIUM", "description": "Possibly incorrect mapping tree.", "items": [ "userdata" ], "detail": "This backend may be missing the correct mapping tree references. Mapping Trees allow\nthe directory server to determine which backend an operation is routed to in the\nabscence of other information. This is extremely important for correct functioning\nof LDAP ADD for example.\n\nA correct Mapping tree for this backend must contain the suffix name, the database name\nand be a backend type. IE:\n\ncn=o3Dexample,cn=mapping tree,cn=config\ncn: o=example\nnsslapd-backend: userRoot\nnsslapd-state: backend\nobjectClass: top\nobjectClass: extensibleObject\nobjectClass: nsMappingTree\n\n", "fix": "Either you need to create the mapping tree, or you need to repair the related\nmapping tree. You will need to do this by hand by editing cn=config, or stopping\nthe instance and editing dse.ldif.\n", "check": "backends:userdata:mappingtree" }, { "dsle": "DSBLE0002", "severity": "HIGH", "description": "Unable to query backend.", "items": [ "dc=test,dc=com" ], "detail": "Unable to query the backend. LDAP error ({'msgtype': 101, 'msgid': 26, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Warning: Operation attempted on a disabled node : dc=example,dc=com\\n'})", "fix": "Check the server's error and access logs for more information.", "check": "backends:userdata:search" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userdata:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userdata:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userdata:search ... INFO  LogCapture:health.py:99 Checking backends:userdata:virt_attrs ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 2 Issues found! 
Generating report ...
INFO  LogCapture:health.py:45 [1] DS Lint Error: DSBLE0003
INFO  LogCapture:health.py:46 --------------------------------------------------------------------------------
INFO  LogCapture:health.py:47 Severity: LOW
INFO  LogCapture:health.py:49 Check: backends:userdata:search
INFO  LogCapture:health.py:50 Affects:
INFO  LogCapture:health.py:52 -- dc=test,dc=com
INFO  LogCapture:health.py:53 Details:
INFO  LogCapture:health.py:54 -----------
INFO  LogCapture:health.py:55 The backend database has not been initialized yet
INFO  LogCapture:health.py:56 Resolution:
INFO  LogCapture:health.py:57 -----------
INFO  LogCapture:health.py:58 You need to import an LDIF file, or create the suffix entry, in order to initialize the database.
INFO  LogCapture:health.py:45 [2] DS Lint Error: DSBLE0003
INFO  LogCapture:health.py:46 --------------------------------------------------------------------------------
INFO  LogCapture:health.py:47 Severity: LOW
INFO  LogCapture:health.py:49 Check: backends:userdata:search
INFO  LogCapture:health.py:50 Affects:
INFO  LogCapture:health.py:52 -- dc=test,dc=com
INFO  LogCapture:health.py:53 Details:
INFO  LogCapture:health.py:54 -----------
INFO  LogCapture:health.py:55 The backend database has not been initialized yet
INFO  LogCapture:health.py:56 Resolution:
INFO  LogCapture:health.py:57 -----------
INFO  LogCapture:health.py:58 You need to import an LDIF file, or create the suffix entry, in order to initialize the database.
INFO  LogCapture:health.py:124 ===== End Of Report (2 Issues found) =====
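The DSBLE0003 resolution above offers two routes: import an LDIF file, or create the suffix entry. A minimal sketch of the second route using plain python-ldap follows; the connection URL, bind credentials, and suffix are illustrative assumptions, not values taken from this run:

import ldap
import ldap.modlist

# Illustrative connection details; substitute your own instance.
conn = ldap.initialize('ldap://localhost:389')
conn.simple_bind_s('cn=Directory Manager', 'password')

# Adding the suffix entry initializes the backend, which clears DSBLE0003.
suffix = 'dc=test,dc=com'
entry = {
    'objectClass': [b'top', b'domain'],
    'dc': [b'test'],
}
conn.add_s(suffix, ldap.modlist.addModlist(entry))
conn.unbind_s()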
XFailed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_memberof_groups 0.00
self = <tests.suites.replication.conflict_resolve_test.TestTwoMasters object at 0x7f61c3073670>
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2f65f40>
base_m2 = <lib389.idm.nscontainer.nsContainer object at 0x7f61c33d6d00>

def test_memberof_groups(self, topology_m2, base_m2):
"""Check that conflict properly resolved for operations
with memberOf and groups

:id: 77f09b18-03d1-45da-940b-1ad2c2908eb3
:setup: Two master replication, test container for entries, enable plugin logging,
audit log, error log for replica and access log for internal
:steps:
1. Enable memberOf plugin
2. Add 30 users to m1 and wait for replication to happen
3. Pause replication
4. Create a group on m1 and m2
5. Create a group on m1 and m2, delete from m1
6. Create a group on m1, delete from m1, and create on m2
7. Create a group on m2 and m1, delete from m1
8. Create two different groups on m2
9. Resume replication
10. Check that the entries on both masters are the same and replication is working
:expectedresults:
1. It should pass
2. It should pass
3. It should pass
4. It should pass
5. It should pass
6. It should pass
7. It should pass
8. It should pass
9. It should pass
10. It should pass
"""

> pytest.xfail("Issue 49591 - work in progress")
E _pytest.outcomes.XFailed: Issue 49591 - work in progress

suites/replication/conflict_resolve_test.py:402: XFailed
XFailed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_managed_entries 0.00
self = <tests.suites.replication.conflict_resolve_test.TestTwoMasters object at 0x7f61c33c73d0>
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2f65f40>

def test_managed_entries(self, topology_m2):
"""Check that conflict properly resolved for operations
with managed entries

:id: 77f09b18-03d1-45da-940b-1ad2c2908eb4
:setup: Two master replication, test container for entries, enable plugin logging,
audit log, error log for replica and access log for internal
:steps:
1. Create ou=managed_users and ou=managed_groups under test container
2. Configure managed entries plugin and add a template to test container
3. Add a user to m1 and wait for replication to happen
4. Pause replication
5. Create a user on m1 and m2 with the same group ID on both masters
6. Create a user on m1 and m2 with a different group ID on both masters
7. Resume replication
8. Check that the entries on both masters are the same and replication is working
:expectedresults:
1. It should pass
2. It should pass
3. It should pass
4. It should pass
5. It should pass
6. It should pass
7. It should pass
8. It should pass
"""

> pytest.xfail("Issue 49591 - work in progress")
E _pytest.outcomes.XFailed: Issue 49591 - work in progress

suites/replication/conflict_resolve_test.py:493: XFailed
XFailed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_nested_entries_with_children 0.00
self = <tests.suites.replication.conflict_resolve_test.TestTwoMasters object at 0x7f61c32a3d60>
topology_m2 = <lib389.topologies.TopologyMain object at 0x7f61c2f65f40>
base_m2 = <lib389.idm.nscontainer.nsContainer object at 0x7f61c3344250>

def test_nested_entries_with_children(self, topology_m2, base_m2):
"""Check that conflict properly resolved for operations
with nested entries with children

:id: 77f09b18-03d1-45da-940b-1ad2c2908eb5
:setup: Two master replication, test container for entries, enable plugin logging,
audit log, error log for replica and access log for internal
:steps:
1. Add 15 containers to m1 and wait for replication to happen
2. Pause replication
3. Create parent-child on master2 and master1
4. Create parent-child on master1 and master2
5. Create parent-child on master1 and master2 with a different child RDN
6. Create parent-child on master1 and delete parent on master2
7. Create parent on master1, delete it and parent-child on master2, delete them
8. Create parent on master1, delete it and parent-two children on master2
9. Create parent-two children on master1 and parent-child on master2, delete them
10. Create three subsets inside existing container entry, applying only part of changes on m2
11. Create more combinations of the subset with parent-child on m1 and parent on m2
12. Delete container on m1, modify user1 on m1, create parent on m2 and modify user2 on m2
13. Resume replication
14. Check that the entries on both masters are the same and replication is working
:expectedresults:
1. It should pass
2. It should pass
3. It should pass
4. It should pass
5. It should pass
6. It should pass
7. It should pass
8. It should pass
9. It should pass
10. It should pass
11. It should pass
12. It should pass
13. It should pass
14. It should pass
"""

> pytest.xfail("Issue 49591 - work in progress")
E _pytest.outcomes.XFailed: Issue 49591 - work in progress

suites/replication/conflict_resolve_test.py:584: XFailed
XFailed suites/replication/conflict_resolve_test.py::TestThreeMasters::test_nested_entries 0.00
self = <tests.suites.replication.conflict_resolve_test.TestThreeMasters object at 0x7f61c33767f0>
topology_m3 = <lib389.topologies.TopologyMain object at 0x7f61c33766d0>
base_m3 = <lib389.idm.nscontainer.nsContainer object at 0x7f61c32eabb0>

def test_nested_entries(self, topology_m3, base_m3):
"""Check that conflict properly resolved for operations
with nested entries with children

:id: 77f09b18-03d1-45da-940b-1ad2c2908eb6
:setup: Three master replication, test container for entries, enable plugin logging,
audit log, error log for replica and access log for internal
:steps:
1. Add 15 containers to m1 and wait for replication to happen
2. Pause replication
3. Create two child entries under each of two entries
4. Create three child entries under each of three entries
5. Create two parents on m1 and m2, then on m1 - create a child and delete one parent,
on m2 - delete one parent and create a child
6. Test a few more parent-child combinations with three instances
7. Resume replication
8. Check that the entries on all masters are the same and replication is working
:expectedresults:
1. It should pass
2. It should pass
3. It should pass
4. It should pass
5. It should pass
6. It should pass
7. It should pass
8. It should pass
"""

> pytest.xfail("Issue 49591 - work in progress")
E _pytest.outcomes.XFailed: Issue 49591 - work in progress

suites/replication/conflict_resolve_test.py:968: XFailed
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master3
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8a67e0d4-54e5-475c-bc2a-942534ef82c3 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0edc2c81-d24c-4816-9e80-107a17289396 / got description=8a67e0d4-54e5-475c-bc2a-942534ef82c3)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 630bcc7f-d015-4183-b201-424c587b90e4 / got description=0edc2c81-d24c-4816-9e80-107a17289396)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 27ba1062-0c24-47d9-ae6f-3f4d501edf4e / got description=630bcc7f-d015-4183-b201-424c587b90e4)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 27ba1062-0c24-47d9-ae6f-3f4d501edf4e / got description=630bcc7f-d015-4183-b201-424c587b90e4)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 27ba1062-0c24-47d9-ae6f-3f4d501edf4e / got description=630bcc7f-d015-4183-b201-424c587b90e4)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
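The four conflict_resolve results above are all produced the same way: an imperative pytest.xfail() call at the top of the test body. A minimal standalone sketch (an illustration, not part of the suite) of that mechanic:

import pytest

def test_work_in_progress():
    # Raises _pytest.outcomes.XFailed immediately; nothing below runs,
    # and the test is reported as xfailed rather than failed.
    pytest.xfail("Issue 49591 - work in progress")
    assert False  # never reached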
XFailed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaPort-0-65535-9999999999999999999999999999999999999999999999999999999999999999999-invalid-389] 0.12
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaPort', too_small = '0', too_big = '65535'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '389'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_add(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf94
:parametrized: yes
:setup: standalone instance
:steps:
1. Use a value that is too small
2. Use a value that is too big
3. Use a value that overflows the int
4. Use a character value (not a number)
5. Use a valid value
:expectedresults:
1. Add is rejected
2. Add is rejected
3. Add is rejected
4. Add is rejected
5. Add is allowed
"""

agmt_reset(topo)
replica = replica_setup(topo)

agmts = Agreements(topo.standalone, basedn=replica.dn)

# Test too small
perform_invalid_create(agmts, agmt_dict, attr, too_small)
# Test too big
> perform_invalid_create(agmts, agmt_dict, attr, too_big)

suites/replication/replica_config_test.py:217:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

many = <lib389.agreement.Agreements object at 0x7f61c2f89fa0>
properties = {'cn': 'test_agreement', 'nsDS5ReplicaBindDN': 'uid=tester', 'nsDS5ReplicaBindMethod': 'SIMPLE', 'nsDS5ReplicaHost': 'localhost.localdomain', ...}
attr = 'nsds5ReplicaPort', value = '65535'

def perform_invalid_create(many, properties, attr, value):
my_properties = copy.deepcopy(properties)
my_properties[attr] = value
with pytest.raises(ldap.LDAPError) as ei:
> many.create(properties=my_properties)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:108: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaPort-0-65535-9999999999999999999999999999999999999999999999999999999999999999999-invalid-389] 0.20
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaPort', too_small = '0', too_big = '65535'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '389'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
> perform_invalid_modify(agmt, attr, too_small)

suites/replication/replica_config_test.py:253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c3069d00>
attr = 'nsds5ReplicaPort', value = '0'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.21
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaTimeout', too_small = '-1', too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c3267190>
attr = 'nsds5ReplicaTimeout', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaBusyWaitTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.20
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaBusyWaitTime', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c2f43550>
attr = 'nsds5ReplicaBusyWaitTime', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaSessionPauseTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.20
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaSessionPauseTime', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c324daf0>
attr = 'nsds5ReplicaSessionPauseTime', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaFlowControlWindow--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.22
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaFlowControlWindow', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c3279a60>
attr = 'nsds5ReplicaFlowControlWindow', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaFlowControlPause--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.21
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaFlowControlPause', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c326e7c0>
attr = 'nsds5ReplicaFlowControlPause', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
XFailed suites/replication/replica_config_test.py::test_agmt_num_modify[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.23
topo = <lib389.topologies.TopologyMain object at 0x7f61c2fbd1c0>
attr = 'nsds5ReplicaProtocolTimeout', too_small = '-1'
too_big = '9223372036854775807'
overflow = '9999999999999999999999999999999999999999999999999999999999999999999'
notnum = 'invalid', valid = '6'

@pytest.mark.xfail(reason="Agreement validation current does not work.")
@pytest.mark.parametrize("attr, too_small, too_big, overflow, notnum, valid", agmt_attrs)
def test_agmt_num_modify(topo, attr, too_small, too_big, overflow, notnum, valid):
"""Test all the number values you can set for a replica config entry

:id: a8b47d4a-a089-4d70-8070-e6181209bf95
:parametrized: yes
:setup: standalone instance
:steps:
1. Replace a value that is too small
2. Replace a value that is too big
3. Replace a value that overflows the int
4. Replace a value with a character value (not a number)
5. Replace a value with a valid value
:expectedresults:
1. Value is rejected
2. Value is rejected
3. Value is rejected
4. Value is rejected
5. Value is allowed
"""

agmt = agmt_setup(topo)

# Value too small
perform_invalid_modify(agmt, attr, too_small)
# Value too big
> perform_invalid_modify(agmt, attr, too_big)

suites/replication/replica_config_test.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

o = <lib389.agreement.Agreement object at 0x7f61c32750a0>
attr = 'nsds5ReplicaProtocolTimeout', value = '9223372036854775807'

def perform_invalid_modify(o, attr, value):
with pytest.raises(ldap.LDAPError) as ei:
> o.replace(attr, value)
E Failed: DID NOT RAISE <class 'ldap.LDAPError'>

suites/replication/replica_config_test.py:113: Failed
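The replica_config xfails above all share one shape: the test carries @pytest.mark.xfail, and the failure itself is pytest.raises reporting DID NOT RAISE because the server accepted an out-of-range value. A minimal self-contained sketch of that interaction; the validator below is a hypothetical stand-in, not lib389 code:

import pytest

def accept_anything(value):
    # Stand-in for the missing agreement validation: no range check,
    # so no exception is ever raised.
    return value

@pytest.mark.xfail(reason="validation currently does not work")
def test_rejects_out_of_range():
    with pytest.raises(ValueError):
        # No exception is raised, so pytest fails this with "DID NOT RAISE".
        accept_anything('9223372036854775807')

Because the test is marked xfail, that failure is reported as XFailed, exactly as in the rows above.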
XFailed suites/replication/ruvstore_test.py::test_memoryruv_sync_with_databaseruv 0.27
topo = <lib389.topologies.TopologyMain object at 0x7f61d5689dc0>

@pytest.mark.xfail(reason="No method to safety access DB ruv currently exists online.")
def test_memoryruv_sync_with_databaseruv(topo):
"""Check if memory ruv and database ruv are synced

:id: 5f38ac5f-6353-460d-bf60-49cafffda5b3
:setup: Replication with two masters.
:steps: 1. Add user to server and compare memory ruv and database ruv.
2. Modify description of user and compare memory ruv and database ruv.
3. Modrdn of user and compare memory ruv and database ruv.
4. Delete user and compare memory ruv and database ruv.
:expectedresults:
1. For add user, the memory ruv and database ruv should be the same.
2. For modify operation, the memory ruv and database ruv should be the same.
3. For modrdn operation, the memory ruv and database ruv should be the same.
4. For delete operation, the memory ruv and database ruv should be the same.
"""

log.info('Adding user: {} to master1'.format(TEST_ENTRY_NAME))
users = UserAccounts(topo.ms['master1'], DEFAULT_SUFFIX)
tuser = users.create(properties=USER_PROPERTIES)
> _compare_memoryruv_and_databaseruv(topo, 'add')

suites/replication/ruvstore_test.py:139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

topo = <lib389.topologies.TopologyMain object at 0x7f61d5689dc0>
operation_type = 'add'

def _compare_memoryruv_and_databaseruv(topo, operation_type):
"""Compare the memoryruv and databaseruv for ldap operations"""

log.info('Checking memory ruv for ldap: {} operation'.format(operation_type))
replicas = Replicas(topo.ms['master1'])
replica = replicas.list()[0]
memory_ruv = replica.get_attr_val_utf8('nsds50ruv')

log.info('Checking database ruv for ldap: {} operation'.format(operation_type))
> entry = replicas.get_ruv_entry(DEFAULT_SUFFIX)
E AttributeError: 'Replicas' object has no attribute 'get_ruv_entry'

suites/replication/ruvstore_test.py:81: AttributeError
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:136 Adding user: rep2lusr to master1 INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:75 Checking memory ruv for ldap: add operation INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:80 Checking database ruv for ldap: add operation
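The xfail reason above says there is no safe online accessor for the database RUV; the in-memory RUV, by contrast, is read from the nsds50ruv attribute of the replica configuration entry, which is what the test does through lib389. A minimal sketch of the same read with plain python-ldap; the port, credentials, and escaped replica DN are assumptions for illustration:

import ldap

conn = ldap.initialize('ldap://localhost:39001')
conn.simple_bind_s('cn=Directory Manager', 'password')

# Replica config entry for the dc=example,dc=com suffix (DN escaping assumed).
replica_dn = 'cn=replica,cn=dc\\3Dexample\\2Cdc\\3Dcom,cn=mapping tree,cn=config'
for dn, attrs in conn.search_s(replica_dn, ldap.SCOPE_BASE,
                               '(objectClass=*)', ['nsds50ruv']):
    for ruv in attrs.get('nsds50ruv', []):
        print(ruv.decode())
conn.unbind_s()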
XPassed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_Use_double_equal_instead_of_equal_in_the_targetattr] 0.04
No log output captured.
XPassed suites/acl/syntax_test.py::test_aci_invalid_syntax_fail[test_Use_double_equal_instead_of_equal_in_the_targetfilter] 0.03
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.27
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaBusyWaitTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.30
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaSessionPauseTime--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.25
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaFlowControlWindow--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.26
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaFlowControlPause--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.31
No log output captured.
XPassed suites/replication/replica_config_test.py::test_agmt_num_add[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.26
No log output captured.
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_config::setup 0.00
('suites/auth_token/basic_auth_test.py', 28, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_nsuser::setup 0.00
('suites/auth_token/basic_auth_test.py', 75, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_disabled::setup 0.00
('suites/auth_token/basic_auth_test.py', 144, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_directory_manager::setup 0.00
('suites/auth_token/basic_auth_test.py', 194, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/auth_token/basic_auth_test.py::test_ldap_auth_token_anonymous::setup 0.00
('suites/auth_token/basic_auth_test.py', 217, 'Skipped: Auth tokens are not available in older versions')
Skipped suites/config/regression_test.py::test_set_cachememsize_to_custom_value::setup 0.00
('suites/config/regression_test.py', 34, 'Skipped: available memory is too low')
Skipped suites/ds_logs/ds_logs_test.py::test_etime_at_border_of_second::setup 0.00
('suites/ds_logs/ds_logs_test.py', 735, 'Skipped: rsearch was removed')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_indexed_import_and_search::setup 0.00
('suites/entryuuid/basic_test.py', 73, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_unindexed_import_and_search::setup 0.00
('suites/entryuuid/basic_test.py', 113, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_generation_on_add::setup 0.00
('suites/entryuuid/basic_test.py', 155, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/entryuuid/basic_test.py::test_entryuuid_fixup_task::setup 0.00
('suites/entryuuid/basic_test.py', 179, 'Skipped: Entryuuid is not available in older versions')
Skipped suites/memory_leaks/MMR_double_free_test.py::test_MMR_double_free::setup 0.00
('suites/memory_leaks/MMR_double_free_test.py', 67, "Skipped: Don't run if ASAN is not enabled")
Skipped suites/memory_leaks/range_search_test.py::test_range_search::setup 0.00
('suites/memory_leaks/range_search_test.py', 24, "Skipped: Don't run if ASAN is not enabled")
Skipped suites/migration/export_data_test.py::test_export_data_from_source_host::setup 0.00
('suites/migration/export_data_test.py', 24, 'Skipped: This test is meant to execute in specific test environment')
Skipped suites/migration/import_data_test.py::test_import_data_to_target_host::setup 0.00
('suites/migration/import_data_test.py', 24, 'Skipped: This test is meant to execute in specific test environment')
Skipped suites/replication/changelog_test.py::test_cldump_files_removed::setup 0.00
('suites/replication/changelog_test.py', 235, 'Skipped: does not work for prefix builds')
Skipped suites/replication/changelog_test.py::test_changelog_compactdbinterval::setup 0.00
('suites/replication/changelog_test.py', 630, 'Skipped: changelog compaction is done by the backend itself, with id2entry as well, nsslapd-changelogcompactdb-interval is no longer supported')
Skipped suites/rewriters/adfilter_test.py::test_adfilter_objectSid::setup 0.00
('suites/rewriters/adfilter_test.py', 90, 'Skipped: It is missing samba python bindings')
Skipped tickets/ticket47462_test.py::test_ticket47462::setup 0.00
('tickets/ticket47462_test.py', 39, 'Skipped: Upgrade scripts are supported only on versions < 1.4.x')
Skipped tickets/ticket47815_test.py::test_ticket47815::setup 0.00
('tickets/ticket47815_test.py', 26, 'Skipped: Not implemented, or invalid by nsMemberOf')
Skipped tickets/ticket49121_test.py::test_ticket49121::setup 0.00
('tickets/ticket49121_test.py', 32, "Skipped: Don't run if ASAN is not enabled")
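Every Skipped row above is decided before the test body runs: a skip guard evaluated at collection or setup time, with the quoted reason carried into the report. A minimal sketch of the module-level ASAN guard pattern; the environment variable name is an assumption for illustration:

import os
import pytest

# Skip every test in this module unless the build has ASAN enabled.
pytestmark = pytest.mark.skipif(os.getenv('WITH_ASAN') != '1',
                                reason="Don't run if ASAN is not enabled")

def test_range_search():
    assert True  # only collected and run on ASAN-enabled builds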
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, REAL_EQ_ACI)] 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, REAL_PRES_ACI)] 0.04
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, REAL_SUB_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, ROLE_PRES_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, ROLE_SUB_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, COS_EQ_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, COS_PRES_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, COS_SUB_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_positive[(ENG_USER, ENG_MANAGER, LDAPURL_ACI)] 0.35
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, REAL_EQ_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_OU, REAL_PRES_ACI)] 0.05
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, REAL_SUB_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, ROLE_EQ_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, ROLE_PRES_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, ROLE_SUB_ACI)] 0.28
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, COS_EQ_ACI)] 0.07
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, COS_PRES_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, SALES_MANAGER, COS_SUB_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(SALES_UESER, SALES_MANAGER, LDAPURL_ACI)] 0.06
No log output captured.
Passed suites/acl/acivattr_test.py::test_negative[(ENG_USER, ENG_MANAGER, ROLE_EQ_ACI)] 0.06
No log output captured.
Passed suites/acl/acl_deny_test.py::test_multi_deny_aci 11.71
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389:acl_deny_test.py:47 Add uid=tuser1,ou=People,dc=example,dc=com INFO  lib389:acl_deny_test.py:58 Add uid=tuser,ou=People,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_deny_test.py:90 Pass 1 INFO  lib389:acl_deny_test.py:93 Testing two searches behave the same... INFO  lib389:acl_deny_test.py:136 Testing search does not return any entries... INFO  lib389:acl_deny_test.py:90 Pass 2 INFO  lib389:acl_deny_test.py:93 Testing two searches behave the same... INFO  lib389:acl_deny_test.py:136 Testing search does not return any entries... INFO  lib389:acl_deny_test.py:200 Test PASSED
Passed suites/acl/acl_test.py::test_aci_attr_subtype_targetattr[lang-ja] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 32822820-46e5-494a-b22b-607804f0350c / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect ef0e84ff-2c1a-434b-a7d5-983e407e6274 / got description=32822820-46e5-494a-b22b-607804f0350c)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  tests.suites.acl.acl_test:acl_test.py:76 ========Executing test with 'lang-ja' subtype========
INFO  tests.suites.acl.acl_test:acl_test.py:77 Add a target attribute
INFO  tests.suites.acl.acl_test:acl_test.py:80 Add a user attribute
INFO  tests.suites.acl.acl_test:acl_test.py:88 Add an ACI with attribute subtype
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.acl_test:acl_test.py:118 Search for the added attribute INFO  tests.suites.acl.acl_test:acl_test.py:125 The added attribute was found
Passed suites/acl/acl_test.py::test_aci_attr_subtype_targetattr[binary] 0.00
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.acl.acl_test:acl_test.py:76 ========Executing test with 'binary' subtype======== INFO  tests.suites.acl.acl_test:acl_test.py:77 Add a target attribute INFO  tests.suites.acl.acl_test:acl_test.py:80 Add a user attribute INFO  tests.suites.acl.acl_test:acl_test.py:88 Add an ACI with attribute subtype
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.acl_test:acl_test.py:118 Search for the added attribute INFO  tests.suites.acl.acl_test:acl_test.py:125 The added attribute was found
Passed suites/acl/acl_test.py::test_aci_attr_subtype_targetattr[phonetic] 0.00
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.acl.acl_test:acl_test.py:76 ========Executing test with 'phonetic' subtype======== INFO  tests.suites.acl.acl_test:acl_test.py:77 Add a target attribute INFO  tests.suites.acl.acl_test:acl_test.py:80 Add a user attribute INFO  tests.suites.acl.acl_test:acl_test.py:88 Add an ACI with attribute subtype
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.acl_test:acl_test.py:118 Search for the added attribute INFO  tests.suites.acl.acl_test:acl_test.py:125 The added attribute was found
Passed suites/acl/acl_test.py::test_mode_default_add_deny 0.03
-------------------------------Captured log setup-------------------------------
INFO  lib389:acl_test.py:233 ######## INITIALIZATION ######## INFO  lib389:acl_test.py:236 Add uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:254 Add cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:258 Add cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:262 Add cn=excepts,cn=accounts,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:294 ######## mode moddn_aci : ADD (should fail) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:302 Try to add cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:311 Exception (expected): INSUFFICIENT_ACCESS
Passed suites/acl/acl_test.py::test_mode_default_delete_deny 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:329 ######## DELETE (should fail) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:336 Try to delete cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:341 Exception (expected): INSUFFICIENT_ACCESS
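The "Exception (expected): INSUFFICIENT_ACCESS" lines in the ACL logs follow a single pattern: bind as an unprivileged user, attempt the operation, and require that the server denies it. A minimal python-ldap sketch of the delete case above; the port and credentials are illustrative assumptions:

import ldap

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('uid=bind_entry,dc=example,dc=com', 'password')
try:
    # An ACI denies this delete for the bound user.
    conn.delete_s('cn=staged user,dc=example,dc=com')
except ldap.INSUFFICIENT_ACCESS:
    print('Exception (expected): INSUFFICIENT_ACCESS')
finally:
    conn.unbind_s()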
Passed suites/acl/acl_test.py::test_moddn_staging_prod[0-cn=staged user,dc=example,dc=com-cn=accounts,dc=example,dc=com-False] 0.23
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (0) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account0,cn=staged user,dc=example,dc=com -> uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account0,cn=staged user,dc=example,dc=com -> uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[1-cn=staged user,dc=example,dc=com-cn=accounts,dc=example,dc=com-False] 0.17
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (1) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account1,cn=staged user,dc=example,dc=com -> uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account1,cn=staged user,dc=example,dc=com -> uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[2-cn=staged user,dc=example,dc=com-cn=bad*,dc=example,dc=com-True] 0.17
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (2) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account2,cn=staged user,dc=example,dc=com -> uid=new_account2,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account2,cn=staged user,dc=example,dc=com -> uid=new_account2,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:409 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[3-cn=st*,dc=example,dc=com-cn=accounts,dc=example,dc=com-False] 0.18
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (3) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account3,cn=staged user,dc=example,dc=com -> uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account3,cn=staged user,dc=example,dc=com -> uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[4-cn=bad*,dc=example,dc=com-cn=accounts,dc=example,dc=com-True] 0.17
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (4) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account4,cn=staged user,dc=example,dc=com -> uid=new_account4,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account4,cn=staged user,dc=example,dc=com -> uid=new_account4,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:409 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[5-cn=st*,dc=example,dc=com-cn=ac*,dc=example,dc=com-False] 0.17
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (5) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account5,cn=staged user,dc=example,dc=com -> uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account5,cn=staged user,dc=example,dc=com -> uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[6-None-cn=ac*,dc=example,dc=com-False] 0.17
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (6) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account6,cn=staged user,dc=example,dc=com -> uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account6,cn=staged user,dc=example,dc=com -> uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[7-cn=st*,dc=example,dc=com-None-False] 0.18
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (7) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account7,cn=staged user,dc=example,dc=com -> uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account7,cn=staged user,dc=example,dc=com -> uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod[8-None-None-False] 0.16
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:376 ######## MOVE staging -> Prod (8) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:388 Try to MODDN uid=new_account8,cn=staged user,dc=example,dc=com -> uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:395 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:399 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:406 Try to MODDN uid=new_account8,cn=staged user,dc=example,dc=com -> uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_staging_prod_9 0.71
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:453 ######## MOVE staging -> Prod (9) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:466 Try to MODDN uid=new_account9,cn=staged user,dc=example,dc=com -> uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:473 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:479 Disable the moddn right INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:484 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:492 Try to MODDN uid=new_account9,cn=staged user,dc=example,dc=com -> uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:499 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:521 Try to MODDN uid=new_account9,cn=staged user,dc=example,dc=com -> uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:532 Enable the moddn right INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:536 ######## MOVE staging -> Prod (10) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:548 Try to MODDN uid=new_account10,cn=staged user,dc=example,dc=com -> uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:555 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:572 Try to MODDN uid=new_account10,cn=staged user,dc=example,dc=com -> uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:579 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:588 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:594 Try to MODDN uid=new_account10,cn=staged user,dc=example,dc=com -> uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_moddn_prod_staging 0.32
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:623 ######## MOVE staging -> Prod (11) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:636 Try to MODDN uid=new_account11,cn=staged user,dc=example,dc=com -> uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:643 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:647 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:653 Try to MODDN uid=new_account11,cn=staged user,dc=example,dc=com -> uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:668 Try to move back MODDN uid=new_account11,cn=accounts,dc=example,dc=com -> uid=new_account11,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:675 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
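For reference, the moddn_staging_prod and moddn_prod_staging cases above all follow the same pattern: bind as uid=bind_entry,dc=example,dc=com, attempt a MODDN from the staging container to the production container (or back), and expect INSUFFICIENT_ACCESS unless a matching ACI grants the moddn right. A minimal sketch of that pattern with python-ldap; the connection object and uid are illustrative, not the suite's fixtures:

import ldap

STAGING = 'cn=staged user,dc=example,dc=com'
PROD = 'cn=accounts,dc=example,dc=com'

def try_moddn(conn, uid):
    # Attempt to move the entry from staging to production; without an
    # ACI granting the moddn right this should be denied.
    try:
        conn.rename_s('uid=%s,%s' % (uid, STAGING), 'uid=%s' % uid,
                      newsuperior=PROD)
    except ldap.INSUFFICIENT_ACCESS:
        print('Exception (expected): INSUFFICIENT_ACCESS')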
Passed suites/acl/acl_test.py::test_check_repl_M2_to_M1 1.04
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:705 Bind as cn=Directory Manager (M2) INFO  lib389:acl_test.py:725 Update (M2) uid=new_account12,cn=staged user,dc=example,dc=com (description) INFO  lib389:acl_test.py:738 Update uid=new_account12,cn=staged user,dc=example,dc=com (description) replicated on M1
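For context, test_check_repl_M2_to_M1 updates the description of the staged entry on M2 and then verifies the change arrives on M1. A minimal sketch of that kind of check, assuming two already-bound connections m2 and m1 (illustrative names, not the suite's fixtures):

import time
import uuid
import ldap

def check_repl(m2, m1, dn, timeout=20):
    # Write a unique marker on M2, then poll M1 until it replicates.
    marker = str(uuid.uuid4()).encode()
    m2.modify_s(dn, [(ldap.MOD_REPLACE, 'description', marker)])
    for _ in range(timeout):
        entry = m1.search_s(dn, ldap.SCOPE_BASE, attrlist=['description'])
        if marker in entry[0][1].get('description', []):
            return True
        time.sleep(1)
    raise AssertionError('change did not replicate to M1')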
Passed suites/acl/acl_test.py::test_moddn_staging_prod_except 0.41
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:763 ######## MOVE staging -> Prod (13) ######## INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:775 Try to MODDN uid=new_account13,cn=staged user,dc=example,dc=com -> uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:782 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:786 ######## MOVE to and from equality filter ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:160 Add a DENY aci under cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:793 Try to MODDN uid=new_account13,cn=staged user,dc=example,dc=com -> uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:799 ######## MOVE staging -> Prod/Except (14) ######## INFO  lib389:acl_test.py:805 Try to MODDN uid=new_account14,cn=staged user,dc=example,dc=com -> uid=new_account14,cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:812 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:160 Add a DENY aci under cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_mode_default_ger_no_moddn 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:839 ######## mode moddn_aci : GER no moddn ######## INFO  lib389:acl_test.py:850 dn: cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:850 dn: uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:853 ######## entryLevelRights: b'v'
Passed suites/acl/acl_test.py::test_mode_default_ger_with_moddn 0.16
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:877 ######## mode moddn_aci: GER with moddn ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:895 dn: uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:898 ######## entryLevelRights: b'vn' INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_mode_legacy_ger_no_moddn1 0.04
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:928 ######## Disable the moddn aci mod ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:932 ######## mode legacy 1: GER no moddn ######## INFO  lib389:acl_test.py:942 dn: cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:942 dn: uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:945 ######## entryLevelRights: b'v'
Passed suites/acl/acl_test.py::test_mode_legacy_ger_no_moddn2 0.34
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:971 ######## Disable the moddn aci mod ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:975 ######## mode legacy 2: GER no moddn ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:992 dn: uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:995 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com
Passed suites/acl/acl_test.py::test_mode_legacy_ger_with_moddn 0.11
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:1031 ######## Disable the moddn aci mod ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:1035 ######## mode legacy : GER with moddn ######## INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager INFO  lib389:acl_test.py:139 Bind as uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1057 dn: uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1060 ######## entryLevelRights: b'vn' INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager
Passed suites/acl/acl_test.py::test_rdn_write_get_ger 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389:acl_test.py:1071 ######## Add entry tuser ########
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:1097 ######## GER rights for anonymous ######## INFO  lib389:acl_test.py:1107 dn: dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: ou=groups,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: ou=people,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: ou=services,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=demo_user,ou=people,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=demo_group,ou=groups,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=group_admin,ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=group_modify,ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=user_admin,ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=user_modify,ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=user_passwd_reset,ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=user_private_read,ou=permissions,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=replication_managers,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702,ou=services,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=bind_entry,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=excepts,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account0,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account1,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account2,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account3,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account4,cn=staged user,dc=example,dc=com INFO  
lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account5,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account6,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account7,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account8,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account9,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account10,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account11,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account12,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account13,cn=accounts,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account14,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account15,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account16,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account17,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account18,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: uid=new_account19,cn=staged user,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v' INFO  lib389:acl_test.py:1107 dn: cn=tuser,dc=example,dc=com INFO  lib389:acl_test.py:1109 ######## entryLevelRights: b'v'
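For reference, the entryLevelRights values logged by the GER tests above ('v' is read/view, 'n' is rename) come from a search carrying the Get Effective Rights request control, OID 1.3.6.1.4.1.42.2.27.9.5.2. A sketch of issuing such a search with python-ldap's generic LDAPControl; the 'dn: <authzid>' control-value format matches what these suites send, but treat the details as an assumption rather than a documented API:

import ldap
from ldap.controls import LDAPControl

GER_OID = '1.3.6.1.4.1.42.2.27.9.5.2'

def ger_rights(conn, authz_dn, base):
    # Ask the server which rights authz_dn would have on each entry.
    ctrl = LDAPControl(GER_OID, True, ('dn: ' + authz_dn).encode())
    msgid = conn.search_ext(base, ldap.SCOPE_SUBTREE,
                            attrlist=['entryLevelRights'],
                            serverctrls=[ctrl])
    _, results = conn.result(msgid)
    for dn, attrs in results:
        print(dn, attrs.get('entryLevelRights'))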
Passed suites/acl/acl_test.py::test_rdn_write_modrdn_anonymous 0.06
-------------------------------Captured log call--------------------------------
INFO  lib389:acl_test.py:1136 dn: INFO  lib389:acl_test.py:1138 ######## 'objectClass': [b'top'] INFO  lib389:acl_test.py:1138 ######## 'defaultnamingcontext': [b'dc=example,dc=com'] INFO  lib389:acl_test.py:1138 ######## 'dataversion': [b'020201031001729'] INFO  lib389:acl_test.py:1138 ######## 'netscapemdsuffix': [b'cn=ldap://dc=localhost,dc=localdomain:39001'] INFO  lib389:acl_test.py:1143 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:acl_test.py:1150 The entry was not renamed (expected) INFO  lib389:acl_test.py:133 Bind as cn=Directory Manager
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_groupdn 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
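The setup log above is what lib389's standard standalone topology fixture emits when it provisions standalone1. For reference, a test needing the same instance would typically request the fixture like this (a sketch, not the deladd suite's actual code):

from lib389.topologies import topology_st

def test_example(topology_st):
    # topology_st provisions the standalone1 instance shown in the log.
    inst = topology_st.standalone
    assert inst.status()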
Passed suites/acl/deladd_test.py::test_allow_add_access_to_anyone 0.06
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_anyone 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_not_to_userdn 0.06
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_not_to_group 0.29
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_add_access_to_parent 0.07
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_parent 0.08
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_dynamic_group 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_to_dynamic_group_uid 0.05
No log output captured.
Passed suites/acl/deladd_test.py::test_allow_delete_access_not_to_dynamic_group 0.09
No log output captured.
Passed suites/acl/enhanced_aci_modrnd_test.py::test_enhanced_aci_modrnd 0.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:32 Add a container: ou=test_ou_1,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:38 Add a container: ou=test_ou_2,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:44 Add a user: cn=test_user,ou=test_ou_1,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:58 Add an ACI 'allow (all)' by cn=test_user,ou=test_ou_1,dc=example,dc=com to the ou=test_ou_1,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:62 Add an ACI 'allow (all)' by cn=test_user,ou=test_ou_1,dc=example,dc=com to the ou=test_ou_2,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:93 Bind as cn=test_user,ou=test_ou_1,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:97 User MODRDN operation from ou=test_ou_1,dc=example,dc=com to ou=test_ou_2,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:103 Check there is no user in ou=test_ou_1,dc=example,dc=com INFO  tests.suites.acl.enhanced_aci_modrnd_test:enhanced_aci_modrnd_test.py:109 Check there is our user in ou=test_ou_2,dc=example,dc=com
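Per the setup and call logs above, the test grants the bind user allow (all) on both containers and then performs the MODRDN as that user. A sketch of the same sequence; admin_conn and user_conn are illustrative connection names:

import ldap

ACI = ('(targetattr="*")(version 3.0; acl "allow_test_user"; allow (all) '
       'userdn="ldap:///cn=test_user,ou=test_ou_1,dc=example,dc=com";)')

# As Directory Manager: add the ACI to both containers.
for ou in ('ou=test_ou_1,dc=example,dc=com', 'ou=test_ou_2,dc=example,dc=com'):
    admin_conn.modify_s(ou, [(ldap.MOD_ADD, 'aci', ACI.encode())])

# As the test user: move the entry between the two containers.
user_conn.rename_s('cn=test_user,ou=test_ou_1,dc=example,dc=com',
                   'cn=test_user',
                   newsuperior='ou=test_ou_2,dc=example,dc=com')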
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_five 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_six 0.06
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_seven 0.04
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_eight 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_nine 0.04
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_ten 0.05
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_eleven 0.04
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_twelve 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_fourteen 0.07
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_fifteen 0.05
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_sixteen 0.03
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_seventeen 0.02
No log output captured.
Passed suites/acl/globalgroup_part2_test.py::test_undefined_in_group_eval_eighteen 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_caching_changes 0.31
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/globalgroup_test.py::test_deny_group_member_all_rights_to_user 0.07
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deny_group_member_all_rights_to_group_members 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_denial 0.06
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_denial_two 0.02
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_allow 0.02
No log output captured.
Passed suites/acl/globalgroup_test.py::test_deeply_nested_groups_aci_allow_two 0.04
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval 0.04
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval_two 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval_three 0.03
No log output captured.
Passed suites/acl/globalgroup_test.py::test_undefined_in_group_eval_four 0.06
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_ip_keyword_test_noip_cannot 0.11
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_can_access_the_data_at_any_time 0.09
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_can_access_the_data_only_in_the_morning 0.12
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_can_access_the_data_only_in_the_afternoon 0.11
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_timeofday_keyword 1.17
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_dayofweek_keyword_test_everyday_can_access 0.10
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_dayofweek_keyword_today_can_access 0.08
No log output captured.
Passed suites/acl/keywords_part2_test.py::test_user_cannot_access_the_data_at_all 0.09
No log output captured.
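The timeofday and dayofweek cases above toggle access through time-based bind rules. An illustrative ACI of the kind under test; the attribute list, DN, and time window are placeholders, not the suite's exact rule:

(targetattr="*")(version 3.0; acl "business hours only";
 allow (read, search, compare)
 userdn="ldap:///uid=test_user,ou=people,dc=example,dc=com"
 and timeofday >= "0800" and timeofday < "1700"
 and dayofweek = "Mon, Tue, Wed, Thu, Fri";)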
Passed suites/acl/keywords_test.py::test_user_binds_with_a_password_and_can_access_the_data 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/keywords_test.py::test_user_binds_with_a_bad_password_and_cannot_access_the_data 0.01
No log output captured.
Passed suites/acl/keywords_test.py::test_anonymous_user_cannot_access_the_data 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_authenticated_but_has_no_rigth_on_the_data 0.06
No log output captured.
Passed suites/acl/keywords_test.py::test_the_bind_client_is_accessing_the_directory 0.01
No log output captured.
Passed suites/acl/keywords_test.py::test_users_binds_with_a_password_and_can_access_the_data 0.01
No log output captured.
Passed suites/acl/keywords_test.py::test_user_binds_without_any_password_and_cannot_access_the_data 0.02
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_the_data_when_connecting_from_any_machine 0.05
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_the_data_when_connecting_from_internal_ds_network_only 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_the_data_when_connecting_from_some_network_only 0.05
No log output captured.
Passed suites/acl/keywords_test.py::test_from_an_unauthorized_network 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_cannot_access_the_data_when_connecting_from_an_unauthorized_network_2 0.03
No log output captured.
Passed suites/acl/keywords_test.py::test_user_cannot_access_the_data_if_not_from_a_certain_domain 0.07
No log output captured.
Passed suites/acl/keywords_test.py::test_dnsalias_keyword_test_nodns_cannot 0.25
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_from_ipv4_or_ipv6_address[127.0.0.1] 0.04
No log output captured.
Passed suites/acl/keywords_test.py::test_user_can_access_from_ipv4_or_ipv6_address[[::1]] 0.02
No log output captured.
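The two parametrized cases above exercise the ip bind-rule keyword with an IPv4 and an IPv6 loopback address. An illustrative rule of that shape (DN and addresses are placeholders):

(targetattr="*")(version 3.0; acl "loopback only";
 allow (all)
 userdn="ldap:///uid=test_user,ou=people,dc=example,dc=com"
 and (ip = "127.0.0.1" or ip = "::1");)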
Passed suites/acl/misc_test.py::test_accept_aci_in_addition_to_acl 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/misc_test.py::test_more_then_40_acl_will_crash_slapd 0.32
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_search_access_should_not_include_read_access 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_only_allow_some_targetattr 0.06
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_only_allow_some_targetattr_two 0.34
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_memberurl_needs_to_be_normalized 0.13
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_greater_than_200_acls_can_be_created 4.97
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_server_bahaves_properly_with_very_long_attribute_names 0.06
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
Passed suites/acl/misc_test.py::test_do_bind_as_201_distinct_users 172.23
-------------------------------Captured log setup-------------------------------
INFO  lib389:misc_test.py:76 Exception (expected): ALREADY_EXISTS
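test_do_bind_as_201_distinct_users dominates this suite's runtime (172 s above) because it provisions an account per user and binds as each one. A sketch of the pattern, assuming a Directory Manager connection conn and the instance URI uri (both illustrative):

import ldap

for i in range(201):
    uid = 'user%d' % i
    dn = 'uid=%s,ou=people,dc=example,dc=com' % uid
    # Create a minimal account with a password...
    conn.add_s(dn, [
        ('objectClass', [b'top', b'account', b'simpleSecurityObject']),
        ('uid', [uid.encode()]),
        ('userPassword', [b'password']),
    ])
    # ...then verify it can bind on a fresh connection.
    c = ldap.initialize(uri)
    c.simple_bind_s(dn, 'password')
    c.unbind_s()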
Passed suites/acl/modify_test.py::test_allow_write_access_to_targetattr_with_a_single_attribute 0.83
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/modify_test.py::test_allow_write_access_to_targetattr_with_multiple_attibutes 0.07
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdn_all 0.11
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdn_with_wildcards_in_dn 0.06
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdn_with_multiple_dns 0.22
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_target_with_wildcards 0.19
No log output captured.
Passed suites/acl/modify_test.py::test_allow_write_access_to_userdnattr 0.10
No log output captured.
Passed suites/acl/modify_test.py::test_allow_selfwrite_access_to_anyone 0.09
No log output captured.
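selfwrite, exercised by the case above, is the right that lets a bound user add or delete only their own DN in the targeted attribute (typically group membership). An illustrative rule matching the test name; the targeted attribute is a placeholder:

(targetattr="member")(version 3.0; acl "selfwrite anyone";
 allow (selfwrite)
 userdn="ldap:///anyone";)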
Passed suites/acl/modify_test.py::test_uniquemember_should_also_be_the_owner 0.24
No log output captured.
Passed suites/acl/modify_test.py::test_aci_with_both_allow_and_deny 0.16
No log output captured.
Passed suites/acl/modify_test.py::test_allow_owner_to_modify_entry 0.11
No log output captured.
Passed suites/acl/modrdn_test.py::test_allow_write_privilege_to_anyone 0.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/modrdn_test.py::test_allow_write_privilege_to_dynamic_group_with_scope_set_to_base_in_ldap_url 0.03
No log output captured.
Passed suites/acl/modrdn_test.py::test_write_access_to_naming_atributes 0.04
No log output captured.
Passed suites/acl/modrdn_test.py::test_write_access_to_naming_atributes_two 0.11
No log output captured.
Passed suites/acl/modrdn_test.py::test_access_aci_list_contains_any_deny_rule 0.12
No log output captured.
Passed suites/acl/modrdn_test.py::test_renaming_target_entry 0.08
No log output captured.
Passed suites/acl/repeated_ldap_add_test.py::test_repeated_ldap_add 31.71
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
Entry uid=buser123,ou=BOU,dc=example,dc=com is locked
-------------------------------Captured log call--------------------------------
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:184 Testing Bug 1347760 - Information disclosure via repeated use of LDAP ADD operation, etc. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:186 Disabling accesslog logbuffering INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:189 Bind as {cn=Directory Manager,password} INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:192 Adding ou=BOU a bind user belongs to. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:197 Adding a bind user. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:204 Adding a test user. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:211 Deleting aci in dc=example,dc=com. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:214 While binding as DM, acquire an access log path and instance dir INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:220 Bind case 1. the bind user has no rights to read the entry itself, bind should be successful. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:221 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123} who has no access rights. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:229 Access log path: /var/log/dirsrv/slapd-standalone1/access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:231 Bind case 2-1. the bind user does not exist, bind should fail with error INVALID_CREDENTIALS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:233 Bind as {uid=bogus,dc=example,dc=com,bogus} who does not exist. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:237 Exception (expected): INVALID_CREDENTIALS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:238 Desc Invalid credentials INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:246 Cause found - [30/Oct/2020:20:23:12.068704733 -0400] conn=1 op=11 RESULT err=49 tag=97 nentries=0 wtime=0.000101639 optime=0.008046312 etime=0.008145765 - No such entry INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:249 Bind case 2-2. the bind user's suffix does not exist, bind should fail with error INVALID_CREDENTIALS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:251 Bind as {uid=bogus,ou=people,dc=bogus,bogus} who does not exist. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:260 Cause found - [30/Oct/2020:20:23:13.077522648 -0400] conn=1 op=12 RESULT err=49 tag=97 nentries=0 wtime=0.000153497 optime=0.004257464 etime=0.004402734 - No suffix for bind dn found INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:263 Bind case 2-3. the bind user's password is wrong, bind should fail with error INVALID_CREDENTIALS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:265 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,bogus} who does not exist. 
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:269 Exception (expected): INVALID_CREDENTIALS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:270 Desc Invalid credentials INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:278 Cause found - [30/Oct/2020:20:23:14.114496391 -0400] conn=1 op=13 RESULT err=49 tag=97 nentries=0 wtime=0.000180453 optime=0.033104121 etime=0.033277037 - Invalid credentials INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:281 Adding aci for uid=buser123,ou=BOU,dc=example,dc=com to ou=BOU,dc=example,dc=com. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:283 aci: (targetattr="*")(version 3.0; acl "buser123"; allow(all) userdn = "ldap:///uid=buser123,ou=BOU,dc=example,dc=com";) INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:284 Bind as {cn=Directory Manager,password} INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:289 Bind case 3. the bind user has the right to read the entry itself, bind should be successful. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:290 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123} which should be ok. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:293 The following operations are against the subtree the bind user uid=buser123,ou=BOU,dc=example,dc=com has no rights. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:297 Search case 1. the bind user has no rights to read the search entry, it should return no search results with <class 'ldap.SUCCESS'> INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching existing entry uid=tuser0,ou=people,dc=example,dc=com, which should be ok. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:141 Search should return none INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:303 Search case 2-1. the search entry does not exist, the search should return no search results with SUCCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching non-existing entry uid=bogus,dc=example,dc=com, which should be ok. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:141 Search should return none INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:309 Search case 2-2. the search entry does not exist, the search should return no search results with SUCCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should be ok. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:141 Search should return none INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:316 Add case 1. the bind user has no rights AND the adding entry exists, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. 
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:322 Add case 2-1. the bind user has no rights AND the adding entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:328 Add case 2-2. the bind user has no rights AND the adding entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:335 Modify case 1. the bind user has no rights AND the modifying entry exists, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:341 Modify case 2-1. the bind user has no rights AND the modifying entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:347 Modify case 2-2. the bind user has no rights AND the modifying entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. 
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:354 Modrdn case 1. the bind user has no rights AND the renaming entry exists, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:360 Modrdn case 2-1. the bind user has no rights AND the renaming entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:366 Modrdn case 2-2. the bind user has no rights AND the renaming entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:372 Modrdn case 3. the bind user has no rights AND the node moving an entry to exists, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to existing superior ou=groups,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:378 Modrdn case 4-1. the bind user has no rights AND the node moving an entry to does not, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to non-existing superior ou=OU,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. 
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:384 Modrdn case 4-2. the bind user has no rights AND the node moving an entry to does not, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to non-existing superior ou=OU,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:391 Delete case 1. the bind user has no rights AND the deleting entry exists, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:397 Delete case 2-1. the bind user has no rights AND the deleting entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting non-existing entry uid=bogus,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:403 Delete case 2-2. the bind user has no rights AND the deleting entry does not exist, it should fail with INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with INSUFFICIENT_ACCESS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): INSUFFICIENT_ACCESS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Insufficient access INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:407 EXTRA: Check no regressions INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:408 Adding aci for uid=buser123,ou=BOU,dc=example,dc=com to dc=example,dc=com. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:410 Bind as {cn=Directory Manager,password} INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:415 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123}. 
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:425 Search case. the search entry does not exist, the search should fail with NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Searching non-existing entry uid=bogus,ou=people,dc=example,dc=com, which should fail with NO_SUCH_OBJECT. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:431 Add case. the adding entry already exists, it should fail with ALREADY_EXISTS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Adding existing entry uid=tuser0,ou=people,dc=example,dc=com, which should fail with ALREADY_EXISTS. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): ALREADY_EXISTS INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc Already exists INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:436 Modify case. the modifying entry does not exist, it should fail with NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Modifying non-existing entry uid=bogus,dc=example,dc=com, which should fail with NO_SUCH_OBJECT. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:441 Modrdn case 1. the renaming entry does not exist, it should fail with NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Renaming non-existing entry uid=bogus,dc=example,dc=com, which should fail with NO_SUCH_OBJECT. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:446 Modrdn case 2. the node moving an entry to does not, it should fail with NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Moving to non-existing superior ou=OU,dc=example,dc=com, which should fail with NO_SUCH_OBJECT. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:451 Delete case. the deleting entry does not exist, it should fail with NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:108 Deleting non-existing entry uid=bogus,dc=example,dc=com, which should fail with NO_SUCH_OBJECT. 
INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:131 Exception (expected): NO_SUCH_OBJECT INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:132 Desc No such object INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:148 PASSED INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:454 Inactivate uid=buser123,ou=BOU,dc=example,dc=com INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:461 ['/usr/sbin/dsidm', 'standalone1', '-b', 'dc=example,dc=com', 'account', 'lock', 'uid=buser123,ou=BOU,dc=example,dc=com'] INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:465 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,buser123} which should fail with UNWILLING_TO_PERFORM. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:469 Exception (expected): UNWILLING_TO_PERFORM INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:470 Desc Server is unwilling to perform INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:473 Bind as {uid=buser123,ou=BOU,dc=example,dc=com,bogus} which should fail with UNWILLING_TO_PERFORM. INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:477 Exception (expected): UNWILLING_TO_PERFORM INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:478 Desc Server is unwilling to perform INFO  tests.suites.acl.repeated_ldap_add_test:repeated_ldap_add_test.py:481 SUCCESS
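The tail of the log above shows the bind account being locked through the dsidm command line, after which binds with both the good and the bad password fail with UNWILLING_TO_PERFORM. Reproducing just that step; the command is copied from the captured log and the port comes from this suite's setup log:

import subprocess
import ldap

# Lock the account the way the test does, via the dsidm CLI.
subprocess.check_call(['/usr/sbin/dsidm', 'standalone1', '-b', 'dc=example,dc=com',
                       'account', 'lock', 'uid=buser123,ou=BOU,dc=example,dc=com'])

conn = ldap.initialize('ldap://localhost:38901')
try:
    conn.simple_bind_s('uid=buser123,ou=BOU,dc=example,dc=com', 'buser123')
except ldap.UNWILLING_TO_PERFORM:
    print('Exception (expected): UNWILLING_TO_PERFORM')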
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(STEVE_ROLE, NESTED_ROLE_TESTER)] 0.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(HARRY_ROLE, NESTED_ROLE_TESTER)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(MARY_ROLE, NOT_RULE_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(STEVE_ROLE, OR_RULE_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(HARRY_ROLE, OR_RULE_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(STEVE_ROLE, ALL_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(HARRY_ROLE, ALL_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_positive[(MARY_ROLE, ALL_ACCESS)] 0.04
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(MARY_ROLE, NESTED_ROLE_TESTER)] 0.29
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(STEVE_ROLE, NOT_RULE_ACCESS)] 0.06
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(HARRY_ROLE, NOT_RULE_ACCESS)] 0.28
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_seealso_negative[(MARY_ROLE , OR_RULE_ACCESS)] 0.06
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_positive[NOT_RULE_ACCESS] 0.01
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_positive[ALL_ACCESS] 0.01
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_negaive[NESTED_ROLE_TESTER] 0.02
No log output captured.
Passed suites/acl/roledn_test.py::test_mod_anonseealso_negaive[OR_RULE_ACCESS] 0.02
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with__target_set_on_non_leaf 0.50
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with__target_set_on_wildcard_non_leaf 0.57
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with__target_set_on_wildcard_leaf 0.70
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_equality_search 0.27
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_equality_search_two 0.58
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_substring_search 0.29
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_substring_search_two 2.00
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_boolean_or_of_two_equality_search 0.21
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_to__userdn_two 0.49
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_userdn 0.56
No log output captured.
Passed suites/acl/search_real_part2_test.py::test_deny_all_access_with_targetfilter_using_presence_search 0.21
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_search_access_to_userdn_with_ldap_url 0.73
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/search_real_part3_test.py::test_deny_search_access_to_userdn_with_ldap_url_two 0.42
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_search_access_to_userdn_with_ldap_url_matching_all_users 0.61
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_a_dynamic_group 0.50
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_dynamic_group_with_host_port_set_on_ldap_url 0.60
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_dynamic_group_with_scope_set_to_one_in_ldap_url 0.73
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_read_access_to_dynamic_group_two 0.72
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_deny_access_to_group_should_deny_access_to_all_uniquemember 0.61
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_entry_with_lots_100_attributes 11.73
No log output captured.
Passed suites/acl/search_real_part3_test.py::test_groupdnattr_value_is_another_group 0.14
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with_target_set 0.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/search_real_test.py::test_deny_all_access_to_a_target_with_wild_card 0.26
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_without_a_target_set 1.84
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_read_search_and_compare_access_with_target_and_targetattr_set 1.45
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_read_access_to_multiple_groupdns 1.05
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_to_userdnattr 0.22
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with__target_set 0.55
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with__targetattr_set 1.40
No log output captured.
Passed suites/acl/search_real_test.py::test_deny_all_access_with_targetattr_set 1.06
No log output captured.
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_add 0.78
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389:selfdn_permissions_test.py:58 Add OCticket47653 that allows 'member' attribute
INFO  lib389:selfdn_permissions_test.py:63 Add cn=bind_entry, dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:106 ######################### ADD ######################
INFO  lib389:selfdn_permissions_test.py:109 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:139 Try to add Add cn=test_entry, dc=example,dc=com (aci is missing): dn: cn=test_entry, dc=example,dc=com
    cn: test_entry
    member: cn=bind_entry, dc=example,dc=com
    objectclass: top
    objectclass: person
    objectclass: OCticket47653
    postalAddress: here
    postalCode: 1234
    sn: test_entry
INFO  lib389:selfdn_permissions_test.py:143 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:147 Bind as cn=Directory Manager and add the ADD SELFDN aci
INFO  lib389:selfdn_permissions_test.py:159 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:164 Try to add Add cn=test_entry, dc=example,dc=com (member is missing)
INFO  lib389:selfdn_permissions_test.py:172 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:178 Try to add Add cn=test_entry, dc=example,dc=com (with several member values)
INFO  lib389:selfdn_permissions_test.py:181 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:184 Try to add Add cn=test_entry, dc=example,dc=com should be successful
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_search 0.40
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:205 ######################### SEARCH ######################
INFO  lib389:selfdn_permissions_test.py:207 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:211 Try to search cn=test_entry, dc=example,dc=com (aci is missing)
INFO  lib389:selfdn_permissions_test.py:216 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci
INFO  lib389:selfdn_permissions_test.py:229 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:233 Try to search cn=test_entry, dc=example,dc=com should be successful
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_modify 0.65
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:256 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:259 ######################### MODIFY ######################
INFO  lib389:selfdn_permissions_test.py:263 Try to modify cn=test_entry, dc=example,dc=com (aci is missing)
INFO  lib389:selfdn_permissions_test.py:267 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:271 Bind as cn=Directory Manager and add the WRITE SELFDN aci
INFO  lib389:selfdn_permissions_test.py:284 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:288 Try to modify cn=test_entry, dc=example,dc=com. It should succeed
Passed suites/acl/selfdn_permissions_test.py::test_selfdn_permission_delete 0.27
-------------------------------Captured log call--------------------------------
INFO  lib389:selfdn_permissions_test.py:314 ######################### DELETE ######################
INFO  lib389:selfdn_permissions_test.py:317 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:322 Try to delete cn=test_entry, dc=example,dc=com (aci is missing)
INFO  lib389:selfdn_permissions_test.py:325 Exception (expected): INSUFFICIENT_ACCESS
INFO  lib389:selfdn_permissions_test.py:329 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci
INFO  lib389:selfdn_permissions_test.py:341 Bind as cn=bind_entry, dc=example,dc=com
INFO  lib389:selfdn_permissions_test.py:345 Try to delete cn=test_entry, dc=example,dc=com should be successful
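The SELFDN acis granted above tie each permission to an attribute that must name the bound user. A sketch of the shape of such an aci, assuming the suffix and objectclass used in this suite (the exact string is illustrative, not copied from the test):

# Illustrative SELFDN aci: the add is allowed only when the new entry's
# 'member' attribute equals the DN of the user performing the add.
SELFDN_ADD_ACI = (
    '(target = "ldap:///cn=*,dc=example,dc=com")'
    '(targetfilter = "(objectClass=OCticket47653)")'
    '(version 3.0; acl "SelfDN add"; allow (add) userattr = "member#selfDN";)'
)

from lib389.idm.domain import Domain
Domain(inst, 'dc=example,dc=com').add('aci', SELFDN_ADD_ACI)  # 'inst' is an assumed DirSrv handle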
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_1] 0.04
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_2] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_3] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_4] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_5] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_6] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_7] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_8] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_9] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_10] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_11] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_12] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_13] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_14] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_15] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_16] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_17] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_19] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_21] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_22] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_targattrfilters_23] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Missing_acl_mispel] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Missing_acl_string] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Wrong_version_string] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Missing_version_string] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Authenticate_statement] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Multiple_targets] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Target_set_to_self] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_target_set_with_ldap_instead_of_ldap] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_target_set_with_more_than_three] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_target_set_with_less_than_three] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_bind_rule_set_with_less_than_three] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_semicolon_instead_of_comma_in_permission] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_double_equal_instead_of_equal_in_the_target] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_use_double_equal_instead_of_equal_in_user_and_group_access] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_donot_cote_the_name_of_the_aci] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_extra_parentheses_case_1] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_extra_parentheses_case_2] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_extra_parentheses_case_3] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_no_semicolon_at_the_end_of_the_aci] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_a_character_different_of_a_semicolon_at_the_end_of_the_aci] 0.02
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_bad_filter] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_double_equal_instead_of_equal_in_the_targattrfilters] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_aci_invalid_syntax[test_Use_double_equal_instead_of_equal_inside_the_targattrfilters] 0.03
No log output captured.
Passed suites/acl/syntax_test.py::test_target_set_above_the_entry_test 0.02
No log output captured.
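Each syntax case above hands the server a deliberately malformed aci and expects the ADD to be rejected. A minimal sketch of that pattern, assuming a DirSrv handle inst (the server raises INVALID_SYNTAX for an aci it cannot parse; this example drops the trailing semicolon, as in test_no_semicolon_at_the_end_of_the_aci):

import ldap
from lib389.idm.domain import Domain

# Deliberately invalid: the permission/bind-rule pair is missing its closing ';'.
bad_aci = '(targetattr="*")(version 3.0; acl "broken"; allow (read) userdn="ldap:///anyone")'

suffix = Domain(inst, 'dc=example,dc=com')  # 'inst' is an assumed DirSrv handle
try:
    suffix.add('aci', bad_aci)
except ldap.INVALID_SYNTAX:
    print('malformed aci rejected as expected')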
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,ROLEDNACCESS)] 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,USERDNACCESS)] 0.01
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,GROUPDNACCESS)] 0.01
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,LDAPURLACCESS)] 0.01
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(CAN,ATTRNAMEACCESS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_0, OU_2)] 0.29
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_1,ANCESTORS)] 0.04
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_2,GRANDPARENTS)] 0.04
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,OU_2)] 0.27
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4, ANCESTORS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,GRANDPARENTS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,PARENTS)] 0.02
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_positive[(LEVEL_4,CHILDREN)] 0.07
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,ROLEDNACCESS)] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,USERDNACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,GROUPDNACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,LDAPURLACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(CANNOT,ATTRNAMEACCESS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0, ANCESTORS)] 0.06
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0,GRANDPARENTS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0,PARENTS)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_0,CHILDREN)] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_2,PARENTS)] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_mod_see_also_negative[(LEVEL_4,GRANDSONS)] 0.05
No log output captured.
Passed suites/acl/userattr_test.py::test_last_three[uid=Ananda Borah,ou=Accounting,dc=example,dc=com-uid=USERDNACCESS,ou=Accounting,dc=example,dc=com] 0.06
No log output captured.
Passed suites/acl/userattr_test.py::test_last_three[uid=Ananda Borah,ou=Accounting,dc=example,dc=com-uid=ROLEDNACCESS,ou=Accounting,dc=example,dc=com] 0.03
No log output captured.
Passed suites/acl/userattr_test.py::test_last_three[uid=Ananda Borah,ou=Accounting,dc=example,dc=com-uid=GROUPDNACCESS,ou=Accounting,dc=example,dc=com] 0.03
No log output captured.
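The LEVEL_n parameters exercise the inheritance form of the userattr bind rule, which can match an attribute on the target entry itself or on an ancestor up to a stated depth. An illustrative aci, with the attribute and levels chosen as examples:

# userattr with inheritance: access is granted when 'seeAlso' on the target
# entry (level 0) or on its parent (level 1) contains the bind DN.
INHERITANCE_ACI = (
    '(targetattr="seeAlso")'
    '(version 3.0; acl "userattr inheritance example"; '
    'allow (write) userattr = "parent[0,1].seeAlso#USERDN";)'
)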
Passed suites/acl/valueacl_part2_test.py::test_we_can_search_as_expected 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/valueacl_part2_test.py::test_we_can_mod_title_as_expected 0.06
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_modify_with_multiple_filters 0.05
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_denied_by_multiple_filters 0.06
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_allowed_add_one_attribute 0.05
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_cannot_add_an_entry_with_attribute_values_we_are_not_allowed_add 0.34
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_on_modrdn 0.04
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_on_modrdn_allow 0.05
No log output captured.
Passed suites/acl/valueacl_part2_test.py::test_targattrfilters_keyword 0.38
No log output captured.
Passed suites/acl/valueacl_test.py::test_delete_an_attribute_value_we_are_not_allowed_to_delete 0.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/acl/valueacl_test.py::test_donot_allow_write_access_to_title_if_value_is_not_architect 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_delete_an_attribute_value_we_are_allowed_to_delete 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_delete_an_attribute_value_we_are_not_allowed_to_deleted 0.31
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_modify_replace 0.27
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_modify_delete 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_replace_an_attribute_if_we_lack 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_remove_an_attribute_if_we_have_del_rights_to_all_attr_value 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_remove_an_attribute_if_we_donot_have_del_rights_to_all_attr_value 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_remove_an_attribute_if_we_have_del_rights_to_all_attr_values 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_cantnot_delete_an_entry_with_attribute_values_we_are_not_allowed_delete 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_we_can_add_and_delete_an_entry_with_attribute_values_we_are_allowed_add_and_delete 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_title 0.08
No log output captured.
Passed suites/acl/valueacl_test.py::test_allow_to_modify 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_selfwrite_does_not_confer_write_on_a_targattrfilters_atribute 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_selfwrite_continues_to_give_rights_to_attr_in_targetattr_list 0.06
No log output captured.
Passed suites/acl/valueacl_test.py::test_add_an_attribute_value_we_are_allowed_to_add_with_ldapanyone 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_hierarchy 0.07
No log output captured.
Passed suites/acl/valueacl_test.py::test_targattrfilters_and_search_permissions_and_that_ldapmodify_works_as_expected 0.05
No log output captured.
Passed suites/acl/valueacl_test.py::test_targattrfilters_and_search_permissions_and_that_ldapmodify_works_as_expected_two 0.02
No log output captured.
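The valueacl suites cover the targattrfilters keyword, which gates writes on the attribute values being added or deleted rather than on the attribute alone. An illustrative aci (the names and filter are examples, not the test's own strings):

# targattrfilters: 'title' may only be added with the value 'architect',
# and only an existing value matching the same filter may be deleted.
VALUE_ACI = (
    '(targattrfilters="add=title:(title=architect), del=title:(title=architect)")'
    '(version 3.0; acl "value-level title example"; '
    'allow (write) userdn = "ldap:///anyone";)'
)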
Passed suites/attr_encryption/attr_encryption_test.py::test_basic 6.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:35 Enable TLS for attribute encryption
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:38 Enables attribute encryption
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:42 Enables attribute encryption for employeeNumber and telephoneNumber
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:46 Add a test user with encrypted attributes
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:81 Restart the server
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:87 Extracting values of cn from the list of objects in encrypt_attrs
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:88 And appending the cn values in a list
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:93 Check employeenumber encryption is enabled
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:96 Check telephoneNumber encryption is enabled
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:99 Check that encrypted attribute is present for user i.e. telephoneNumber
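For context, attribute encryption is switched on per backend by creating a child of cn=encrypted attributes under the backend's config entry. A minimal sketch, assuming the lib389 EncryptedAttrs helper and an existing Backend object for userRoot (the import path and property names are assumptions):

from lib389.encrypted_attributes import EncryptedAttrs

# 'inst' and 'backend' are assumed handles (DirSrv and the userRoot Backend).
encrypt_attrs = EncryptedAttrs(inst, basedn='cn=encrypted attributes,%s' % backend.dn)
encrypt_attrs.create(properties={
    'cn': 'telephoneNumber',         # attribute whose values are stored encrypted
    'nsEncryptionAlgorithm': 'AES',  # 3DES is the other commonly supported choice
})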
Passed suites/attr_encryption/attr_encryption_test.py::test_export_import_ciphertext 13.05
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_ciphertext.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:127 Export data as ciphertext
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:138 Check that the attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:139 Check that the encrypted value of attribute is not present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:145 Delete the test user entry with encrypted data
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:148 Import data as ciphertext, which was exported previously
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:159 Check that the data with encrypted attribute is imported properly
Passed suites/attr_encryption/attr_encryption_test.py::test_export_import_plaintext 15.55
------------------------------Captured stderr call------------------------------
[30/Oct/2020:20:27:16.102450591 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[30/Oct/2020:20:27:16.108217458 -0400] - INFO - slapd_extract_cert - CA CERT NAME: Self-Signed-CA
[30/Oct/2020:20:27:16.111273114 -0400] - ERR - slapd_extract_cert - Unable to open "/tmp/slapd-standalone1/Self-Signed-CA.pem" for writing (-5950, 2).
[30/Oct/2020:20:27:16.114806770 -0400] - WARN - Security Initialization - SSL alert: Sending pin request to SVRCore. You may need to run systemd-tty-ask-password-agent to provide the password.
[30/Oct/2020:20:27:16.320942067 -0400] - INFO - slapd_extract_cert - SERVER CERT NAME: Server-Cert
[30/Oct/2020:20:27:16.324842785 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[30/Oct/2020:20:27:16.329208628 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_plaintext.ldif
[30/Oct/2020:20:27:23.067721329 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[30/Oct/2020:20:27:23.071418399 -0400] - INFO - slapd_extract_cert - CA CERT NAME: Self-Signed-CA
[30/Oct/2020:20:27:23.074435145 -0400] - ERR - slapd_extract_cert - Unable to open "/tmp/slapd-standalone1/Self-Signed-CA.pem" for writing (-5950, 2).
[30/Oct/2020:20:27:23.077894612 -0400] - WARN - Security Initialization - SSL alert: Sending pin request to SVRCore. You may need to run systemd-tty-ask-password-agent to provide the password.
[30/Oct/2020:20:27:23.274541787 -0400] - INFO - slapd_extract_cert - SERVER CERT NAME: Server-Cert
[30/Oct/2020:20:27:23.278728767 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
[30/Oct/2020:20:27:23.282722594 -0400] - WARN - Security Initialization - /tmp is not a private namespace. pem files not exported there
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:189 Export data as plain text
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:200 Check that the attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:201 Check that the plain text value of the encrypted attribute is present in the exported file
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:205 Delete the test user entry with encrypted data
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:208 Import data as plain text, which was exported previously
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:219 Check that the attribute is imported properly
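The ciphertext and plaintext passes above differ only in whether db2ldif decrypts on export. A sketch using lib389's offline export/import wrappers, assuming a DirSrv handle inst (the keyword names mirror lib389's db2ldif/ldif2db helpers but should be treated as assumptions; the LDIF path is an example):

# Offline export/import; the instance must be stopped first.
inst.stop()
# encrypt=False keeps the stored ciphertext in the LDIF; encrypt=True
# (db2ldif -E) would instead decrypt the values into plain text on export.
inst.db2ldif(bename='userRoot', suffixes=['dc=example,dc=com'],
             excludeSuffixes=None, encrypt=False, repl_data=None,
             outputfile='/tmp/export_ciphertext.ldif')
inst.ldif2db(bename='userRoot', suffixes=None, excludeSuffixes=None,
             encrypt=False, import_file='/tmp/export_ciphertext.ldif')
inst.start()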
Passed suites/attr_encryption/attr_encryption_test.py::test_attr_encryption_unindexed 5.88
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/emp_num_ciphertext.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:242 Export data as cipher text INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:253 Check that the attribute is present in the exported file INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:254 Check that the encrypted value of attribute is not present in the exported file
Passed suites/attr_encryption/attr_encryption_test.py::test_attr_encryption_multiple_backends 8.48
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db1.ldif
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db2.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:287 Add two test backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:307 Enables attribute encryption for telephoneNumber in test_backend1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:312 Enables attribute encryption for employeeNumber in test_backend2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:317 Add a test user with encrypted attributes in both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:326 Export data as ciphertext from both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:343 Check that the attribute is present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:344 Check that the encrypted value of attribute is not present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:350 Check that the attribute is present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:351 Check that the encrypted value of attribute is not present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:357 Delete test backends
Passed suites/attr_encryption/attr_encryption_test.py::test_attr_encryption_backends 10.32
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db1.ldif
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export_db2.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:386 Add two test backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:405 Enables attribute encryption for telephoneNumber in test_backend1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:410 Add a test user with telephoneNumber in both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:419 Export data as ciphertext from both backends
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:436 Check that the attribute is present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:437 Check that the encrypted value of attribute is not present in the exported file in db1
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:443 Check that the attribute is present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:444 Check that the value of attribute is also present in the exported file in db2
INFO  tests.suites.attr_encryption.attr_encryption_test:attr_encryption_test.py:450 Delete test backends
Passed suites/automember_plugin/automember_mod_test.py::test_mods 11.70
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.automember_plugin.automember_mod_test:automember_mod_test.py:135 Test PASSED
Passed suites/automember_plugin/automember_test.py::test_automemberscope 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/automember_plugin/automember_test.py::test_automemberfilter 0.01
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_adduser 0.30
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_delete_default_group 4.45
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_no_default_group 4.34
No log output captured.
Passed suites/automember_plugin/automember_test.py::test_delete_target_group 4.58
No log output captured.
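The automember cases revolve around a plugin definition entry naming a scope, a filter, a default group, and a grouping attribute. A hedged lib389 sketch of creating one (the property names follow the plugin's configuration schema; the DNs are examples):

from lib389.plugins import AutoMembershipPlugin, AutoMembershipDefinitions

AutoMembershipPlugin(inst).enable()  # 'inst' is an assumed DirSrv handle
definitions = AutoMembershipDefinitions(inst)
definitions.create(properties={
    'cn': 'testuserGroups',
    'autoMemberScope': 'ou=People,dc=example,dc=com',
    'autoMemberFilter': 'objectclass=posixAccount',
    'autoMemberDefaultGroup': 'cn=testuserGroups,dc=example,dc=com',
    'autoMemberGroupingAttr': 'member:dn',
})
inst.restart()  # plugin configuration changes take effect after a restart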
Passed suites/automember_plugin/basic_test.py::test_disable_the_plug_in 0.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology.
Passed suites/automember_plugin/basic_test.py::test_custom_config_area 0.02
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_ability_to_control_behavior_of_modifiers_name 9.26
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_posixaccount_objectclass_automemberdefaultgroup 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_duplicated_member_attributes_added_when_the_entry_is_re_created 0.36
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_multi_valued_automemberdefaultgroup_for_hostgroups 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_plugin_creates_member_attributes_of_the_automemberdefaultgroup 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_multi_valued_automemberdefaultgroup_with_uniquemember 8.70
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_invalid_automembergroupingattr_member 0.31
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_valid_and_invalid_automembergroupingattr 0.39
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_add_regular_expressions_for_user_groups_and_check_for_member_attribute_after_adding_users 0.17
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_22-5288-5289-Contractor-5291-5292-Contractors] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_21-1161-1162-Contractor-1162-1163-Contractors] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_20-1188-1189-CEO-1191-1192-Contractors] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_15-9288-9289-Manager-9291-9292-Managers] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_14-561-562-Manager-562-563-Managers] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_matching_gid_role_inclusive_regular_expression[autoMembers_13-9788-9789-VPEngg-9392-9393-Managers] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_26-5788-5789-Intern-Contractors-SuffDef1-5] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_25-9788-9789-Employee-Contractors-Managers-1] 0.04
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_24-1110-1111-Employee-Contractors-SuffDef1-5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_23-2788-2789-Contractor-Contractors-SuffDef1-5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_19-5788-5789-HRManager-Managers-SuffDef1-5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_18-6788-6789-Junior-Managers-SuffDef1-5] 0.36
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_17-562-563-Junior-Managers-SuffDef1-5] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_gid_and_role_inclusive_exclusive_regular_expression[autoMembers_16-6788-6789-Manager-Managers-SuffDef1-5] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_32-555-720-Employee-SubDef1-SubDef3] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_31-515-200-Junior-SubDef1-SubDef5] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_30-999-400-Supervisor-SubDef1-SubDef2] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_contractors_exclusive_regex_rules_member_uid[autoMembers_28-555-3663-ContractHR-Contractors,cn=subsuffGroups-Managers,cn=subsuffGroups] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_27-595-690-ContractHR-Managers-Contractors] 0.08
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_29-8195-2753-Employee-Contractors-Managers] 0.07
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_33-545-3333-Supervisor-Contractors-Managers] 0.06
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_managers_inclusive_regex_rule[autoMembers_34-8195-693-Temporary-Managers-Contractors] 0.05
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_reject_invalid_config_and_we_donot_deadlock_the_server 10.36
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_re_build_task 10.99
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_export_task 9.29
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10302020_203007) completed successfully
Passed suites/automember_plugin/basic_test.py::test_automemtask_mapping 2.19
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1087 Automember Map Updates task (task-10302020_203010) completed successfully
Passed suites/automember_plugin/basic_test.py::test_automemtask_re_build 8.29
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_export 13.21
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10302020_203029) completed successfully
Passed suites/automember_plugin/basic_test.py::test_automemtask_run_re_build 18.43
No log output captured.
Passed suites/automember_plugin/basic_test.py::test_automemtask_run_export 15.26
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10302020_203106) completed successfully
Passed suites/automember_plugin/configuration_test.py::test_configuration 4.61
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/backups/backup_test.py::test_missing_backend 4.97
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/basic/basic_test.py::test_basic_ops 0.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/basic/basic_test.py::test_basic_import_export 47.83
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/export.ldif
Passed suites/basic/basic_test.py::test_basic_backup 9.57
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:619 Backup task backup_10302020_203241 completed successfully INFO  lib389:tasks.py:673 Restore task restore_10302020_203243 completed successfully
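The backup and restore above run as server-side tasks. A minimal sketch using the lib389 Tasks wrapper that produced the log lines above (the db2bak/bak2db helper names and the TASK_WAIT flag are assumptions; the backup directory is an example):

from lib389.properties import TASK_WAIT

# 'inst' is an assumed DirSrv handle; inst.tasks is lib389's Tasks wrapper.
backup_dir = '/var/lib/dirsrv/slapd-standalone1/bak/example_backup'
inst.tasks.db2bak(backup_dir=backup_dir, args={TASK_WAIT: True})  # backup task
inst.tasks.bak2db(backup_dir=backup_dir, args={TASK_WAIT: True})  # restore task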
Passed suites/basic/basic_test.py::test_basic_db2index 5.76
------------------------------Captured stderr call------------------------------
[30/Oct/2020:20:32:53.776387076 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[30/Oct/2020:20:32:53.783102850 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7578456064, process usage 22843392
[30/Oct/2020:20:32:53.786059986 -0400] - INFO - check_and_set_import_cache - Import allocates 2960334KB import cache.
[30/Oct/2020:20:32:53.789501358 -0400] - INFO - bdb_copy_directory - Backing up file 0 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/givenName.db)
[30/Oct/2020:20:32:53.792286133 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/givenName.db
[30/Oct/2020:20:32:53.797350623 -0400] - INFO - bdb_copy_directory - Backing up file 1 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/aci.db)
[30/Oct/2020:20:32:53.800851962 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/aci.db
[30/Oct/2020:20:32:53.803556618 -0400] - INFO - bdb_copy_directory - Backing up file 2 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/sn.db)
[30/Oct/2020:20:32:53.806359695 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/sn.db
[30/Oct/2020:20:32:53.809057689 -0400] - INFO - bdb_copy_directory - Backing up file 3 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/numsubordinates.db)
[30/Oct/2020:20:32:53.811738647 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/numsubordinates.db
[30/Oct/2020:20:32:53.814235429 -0400] - INFO - bdb_copy_directory - Backing up file 4 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/entryusn.db)
[30/Oct/2020:20:32:53.817812359 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryusn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/entryusn.db
[30/Oct/2020:20:32:53.820605820 -0400] - INFO - bdb_copy_directory - Backing up file 5 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/nsuniqueid.db)
[30/Oct/2020:20:32:53.823685244 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/nsuniqueid.db
[30/Oct/2020:20:32:53.826289587 -0400] - INFO - bdb_copy_directory - Backing up file 6 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/ancestorid.db)
[30/Oct/2020:20:32:53.828725927 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/ancestorid.db
[30/Oct/2020:20:32:53.831285311 -0400] - INFO - bdb_copy_directory - Backing up file 7 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/parentid.db)
[30/Oct/2020:20:32:53.833830139 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/parentid.db
[30/Oct/2020:20:32:53.843724993 -0400] - INFO - bdb_copy_directory - Backing up file 8 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/mail.db)
[30/Oct/2020:20:32:53.848903563 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/mail.db
[30/Oct/2020:20:32:53.853147545 -0400] - INFO - bdb_copy_directory - Backing up file 9 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/telephoneNumber.db)
[30/Oct/2020:20:32:53.857778385 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/telephoneNumber.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/telephoneNumber.db
[30/Oct/2020:20:32:53.861120629 -0400] - INFO - bdb_copy_directory - Backing up file 10 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/cn.db)
[30/Oct/2020:20:32:53.864362108 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/cn.db
[30/Oct/2020:20:32:53.867533312 -0400] - INFO - bdb_copy_directory - Backing up file 11 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/DBVERSION)
[30/Oct/2020:20:32:53.870462003 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/DBVERSION
[30/Oct/2020:20:32:53.873561195 -0400] - INFO - bdb_copy_directory - Backing up file 12 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/uid.db)
[30/Oct/2020:20:32:53.876709604 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/uid.db
[30/Oct/2020:20:32:53.879870650 -0400] - INFO - bdb_copy_directory - Backing up file 13 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/entryrdn.db)
[30/Oct/2020:20:32:53.884339577 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/entryrdn.db
[30/Oct/2020:20:32:53.887947202 -0400] - INFO - bdb_copy_directory - Backing up file 14 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/uniquemember.db)
[30/Oct/2020:20:32:53.891240335 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/uniquemember.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/uniquemember.db
[30/Oct/2020:20:32:53.894048371 -0400] - INFO - bdb_copy_directory - Backing up file 15 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/id2entry.db)
[30/Oct/2020:20:32:53.899323199 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/id2entry.db
[30/Oct/2020:20:32:53.904728480 -0400] - INFO - bdb_copy_directory - Backing up file 16 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/objectclass.db)
[30/Oct/2020:20:32:53.916929548 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/userRoot/objectclass.db
[30/Oct/2020:20:32:53.921740825 -0400] - INFO - upgradedb_core - userRoot: Start upgradedb.
[30/Oct/2020:20:32:53.926247054 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[30/Oct/2020:20:32:53.930459014 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7577755648, process usage 23764992
[30/Oct/2020:20:32:53.934101890 -0400] - INFO - check_and_set_import_cache - Import allocates 2960060KB import cache.
[30/Oct/2020:20:32:54.189522112 -0400] - INFO - bdb_import_main - reindex userRoot: Index buffering enabled with bucket size 100
[30/Oct/2020:20:32:54.895800043 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers finished; cleaning up...
[30/Oct/2020:20:32:55.100382397 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers cleaned up.
[30/Oct/2020:20:32:55.107158493 -0400] - INFO - bdb_import_main - reindex userRoot: Cleaning up producer thread...
[30/Oct/2020:20:32:55.120814996 -0400] - INFO - bdb_import_main - reindex userRoot: Indexing complete. Post-processing...
[30/Oct/2020:20:32:55.125487709 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numsubordinates (this may take several minutes to complete)...
[30/Oct/2020:20:32:55.128784409 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numSubordinates complete.
[30/Oct/2020:20:32:55.132351907 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Gathering ancestorid non-leaf IDs...
[30/Oct/2020:20:32:55.135823718 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Finished gathering ancestorid non-leaf IDs.
[30/Oct/2020:20:32:55.139236992 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Starting sort of ancestorid non-leaf IDs...
[30/Oct/2020:20:32:55.142126599 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Finished sort of ancestorid non-leaf IDs.
[30/Oct/2020:20:32:55.150604294 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Creating ancestorid index (new idl)...
[30/Oct/2020:20:32:55.157027221 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Created ancestorid index (new idl).
[30/Oct/2020:20:32:55.160869203 -0400] - INFO - bdb_import_main - reindex userRoot: Flushing caches...
[30/Oct/2020:20:32:55.164094788 -0400] - INFO - bdb_import_main - reindex userRoot: Closing files...
[30/Oct/2020:20:32:55.279814835 -0400] - INFO - bdb_import_main - reindex userRoot: Reindexing complete. Processed 160 entries in 1 seconds. (160.00 entries/sec)
[30/Oct/2020:20:32:55.287308741 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/log.0000000001 to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/log.0000000001
[30/Oct/2020:20:32:55.301758277 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T20:32:53.654517/DBVERSION
[30/Oct/2020:20:32:55.308242539 -0400] - INFO - bdb_pre_close - All database threads now stopped
[30/Oct/2020:20:32:55.478113014 -0400] - INFO - slapd_exemode_db2index - Backend Instance: userRoot
[30/Oct/2020:20:32:55.495564938 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[30/Oct/2020:20:32:55.503219646 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[30/Oct/2020:20:32:55.507787370 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7577415680, process usage 22994944
[30/Oct/2020:20:32:55.511053516 -0400] - INFO - check_and_set_import_cache - Import allocates 2959928KB import cache.
[30/Oct/2020:20:32:55.662693980 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: uid
[30/Oct/2020:20:32:55.667285833 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem
[30/Oct/2020:20:32:55.673393292 -0400] - ERR - bdb_force_checkpoint - Checkpoint FAILED, error Invalid argument (22)
[30/Oct/2020:20:32:55.686496990 -0400] - INFO - bdb_db2index - userRoot: Finished indexing.
[30/Oct/2020:20:32:55.709307333 -0400] - INFO - bdb_pre_close - All database threads now stopped
Passed suites/basic/basic_test.py::test_basic_acl 0.41
No log output captured.
Passed suites/basic/basic_test.py::test_basic_searches 0.09
No log output captured.
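For reference, the kind of search test_basic_searches exercises can be reproduced with plain python-ldap; the URI, bind DN, and password below are placeholders, not values taken from this run.

import ldap

conn = ldap.initialize('ldap://localhost:38901')        # placeholder URI
conn.simple_bind_s('cn=Directory Manager', 'password')  # placeholder credentials
# Subtree search for person entries under the suffix, fetching two attributes.
results = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                        '(objectClass=inetOrgPerson)', ['cn', 'uid'])
for dn, attrs in results:
    print(dn, attrs.get('uid'))
conn.unbind_s()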
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs0-cn-False] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs1-cn-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs2-nsUniqueId-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs3-cn-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_search_req_attrs[attrs4-cn-True] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_basic_referrals 3.67
No log output captured.
Passed suites/basic/basic_test.py::test_basic_systemctl 11.99
No log output captured.
Passed suites/basic/basic_test.py::test_basic_ldapagent 5.04
No log output captured.
Passed suites/basic/basic_test.py::test_basic_dse_survives_kill9 11.36
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[namingContexts] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedLDAPVersion] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedControl] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedExtension] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[supportedSASLMechanisms] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[vendorName] 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_def_rootdse_attr[vendorVersion] 0.03
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[namingContexts] 0.01
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedLDAPVersion] 0.01
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedControl] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedExtension] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[supportedSASLMechanisms] 0.00
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[vendorName] 0.01
No log output captured.
Passed suites/basic/basic_test.py::test_mod_def_rootdse_attr[vendorVersion] 0.00
No log output captured.
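The root DSE checks above read the empty-DN entry with a base-scope search; the test_mod_def_rootdse_attr cases then exercise direct modification of the same attributes. A minimal equivalent read (placeholder URI; python-ldap returns values as bytes):

import ldap

conn = ldap.initialize('ldap://localhost:38901')  # placeholder URI
dse = conn.search_s('', ldap.SCOPE_BASE, '(objectClass=*)',
                    ['namingContexts', 'supportedLDAPVersion', 'vendorVersion'])
_, attrs = dse[0]
print(attrs['namingContexts'])   # e.g. [b'dc=example,dc=com']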
Passed suites/basic/basic_test.py::test_basic_anonymous_search 0.04
No log output captured.
Passed suites/basic/basic_test.py::test_search_original_type 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_search_ou 0.01
No log output captured.
Passed suites/basic/basic_test.py::test_connection_buffer_size 0.02
No log output captured.
Passed suites/basic/basic_test.py::test_critical_msg_on_empty_range_idl 6.61
No log output captured.
Passed suites/basic/basic_test.py::test_ldbm_modification_audit_log 11.65
No log output captured.
Passed suites/basic/basic_test.py::test_dscreate 12.60
------------------------------Captured stdout call------------------------------
Starting installation... Completed installation for test_dscreate
Passed suites/basic/basic_test.py::test_dscreate_ldapi 0.00
-----------------------------Captured stdout setup------------------------------
Starting installation... Completed installation for test-longname-deadbeef-deadbeef-deadbeef-deadbeef-deadbeef
-------------------------------Captured log call--------------------------------
DEBUG  RootDSE:_mapped_object.py:635 get_attr_vals('supportedControl')
Passed suites/basic/basic_test.py::test_dscreate_multiple_dashes_name 17.04
-----------------------------Captured stdout setup------------------------------
Starting installation... Completed installation for test-longname-deadbeef-deadbeef-deadbeef-deadbeef-deadbeef
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[c=uk] 15.53
------------------------------Captured stdout call------------------------------
Starting installation... Completed installation for test_different_rdn
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[cn=test_user] 15.60
-----------------------------Captured stdout setup------------------------------
Removing instance ... Completed instance removal
------------------------------Captured stdout call------------------------------
Starting installation... Completed installation for test_different_rdn
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[dc=example,dc=com] 15.91
-----------------------------Captured stdout setup------------------------------
Removing instance ... Completed instance removal
------------------------------Captured stdout call------------------------------
Starting installation... Completed installation for test_different_rdn
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[o=south] 15.34
-----------------------------Captured stdout setup------------------------------
Removing instance ... Completed instance removal
------------------------------Captured stdout call------------------------------
Starting installation... Completed installation for test_different_rdn
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[ou=sales] 16.43
-----------------------------Captured stdout setup------------------------------
Removing instance ... Completed instance removal
------------------------------Captured stdout call------------------------------
Starting installation... Completed installation for test_different_rdn
Passed suites/basic/basic_test.py::test_dscreate_with_different_rdn[wrong=some_value] 13.43
-----------------------------Captured stdout setup------------------------------
Removing instance ... Completed instance removal
------------------------------Captured stdout call------------------------------
Starting installation... Error: Instance creation failed! Suffix RDN 'wrong' in 'wrong=some_value' is not supported. Supported RDN's are: 'c', 'cn', 'dc', 'o', and 'ou'
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.basic.basic_test:basic_test.py:1474 dscreate failed! Error (1) None
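The error above shows the RDN validation dscreate applies: the suffix RDN must be one of c, cn, dc, o, or ou. For reference, a minimal answer file for dscreate from-file looks like the sketch below; the values are placeholders, and dscreate create-template prints a fully commented version.

[general]
config_version = 2

[slapd]
instance_name = standalone1
root_password = SET_A_STRONG_PASSWORD

[backend-userroot]
suffix = dc=example,dc=com
sample_entries = yes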
Passed suites/betxns/betxn_test.py::test_betxt_7bit 4.67
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:52 Running test_betxt_7bit...
INFO  tests.suites.betxns.betxn_test:betxn_test.py:78 test_betxt_7bit: PASSED
Passed suites/betxns/betxn_test.py::test_betxn_attr_uniqueness 4.51
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:133 test_betxn_attr_uniqueness: PASSED
Passed suites/betxns/betxn_test.py::test_betxn_memberof 4.67
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:179 test_betxn_memberof: PASSED
Passed suites/betxns/betxn_test.py::test_betxn_modrdn_memberof_cache_corruption 4.88
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:233 test_betxn_modrdn_memberof: PASSED
Passed suites/betxns/betxn_test.py::test_ri_and_mep_cache_corruption 0.90
-------------------------------Captured log call--------------------------------
INFO  tests.suites.betxns.betxn_test:betxn_test.py:357 Test PASSED
Passed suites/clu/clu_test.py::test_clu_pwdhash 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.clu_test:clu_test.py:40 Running test_clu_pwdhash...
INFO  tests.suites.clu.clu_test:clu_test.py:54 pwdhash generated: {SSHA}SBMohlfOeZ7WORdsz+T6Ug/E9QMR2dbcNp7EOg==
INFO  tests.suites.clu.clu_test:clu_test.py:55 test_clu_pwdhash: PASSED
Passed suites/clu/clu_test.py::test_clu_pwdhash_mod 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.clu_test:clu_test.py:78 Running test_clu_pwdhash_mod...
INFO  tests.suites.clu.clu_test:clu_test.py:87 pwdhash generated: {SSHA256}LA5t6EHmvE6JWV22PNEuCGyFx3VxXDk66CWS/vL1Gnz4wcGV/iNOOA==
INFO  tests.suites.clu.clu_test:clu_test.py:88 returned the hashed string using the algorithm set in nsslapd-rootpwstoragescheme
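Both hashes above follow the standard salted-digest convention: base64(digest(password + salt) + salt). A small illustrative verifier for an {SSHA} value (SHA-1, 20-byte digest; {SSHA256} is analogous with SHA-256 and a 32-byte digest):

import base64
import hashlib

def check_ssha(password, value):
    # value looks like '{SSHA}SBMohlfOeZ7WORdsz+T6Ug/E9QMR2dbcNp7EOg=='
    raw = base64.b64decode(value[len('{SSHA}'):])
    digest, salt = raw[:20], raw[20:]   # 20-byte SHA-1 digest, then the salt
    return hashlib.sha1(password.encode() + salt).digest() == digest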
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_users 6.88
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:119 Run ldifgen to create users ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - suffix=dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=people,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - number=1000
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - rdn-cn=False
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - generic=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - start-idx=50
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - localize=False
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:196 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:122 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:127 Get number of accounts before import
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:48 Stopping the server and running offline import...
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:133 Check that accounts are imported
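The options logged above map onto the dsconf ldifgen CLI. A hypothetical invocation is sketched below; the flag spellings are inferred from the logged option names, not verified, and should be checked against dsconf ldifgen users --help.

import subprocess

subprocess.run([
    'dsconf', 'standalone1', 'ldifgen', 'users',          # assumed subcommand path
    '--suffix', 'dc=example,dc=com',
    '--parent', 'ou=people,dc=example,dc=com',
    '--number', '1000',
    '--generic',                                          # generic-style entries
    '--start-idx', '50',
    '--ldif-file', '/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif',
], check=True)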
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_groups 34.25
------------------------------Captured stderr call------------------------------
ldap_add: Already exists (68)
ldap_add: Already exists (68)
ldap_add: Already exists (68)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:183 Run ldifgen to create group ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=myGroup
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=groups,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - suffix=dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - number=1
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - num-members=1000
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-members=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - member-attr=uniquemember
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - member-parent=ou=people,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:250 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:186 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:191 Get number of accounts before import
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:200 Check that accounts are imported
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:203 Check that group is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_classic 0.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:256 Run ldifgen to create COS definition ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=classic
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Postal_Def
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos definitions,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-specifier=businessCategory
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr=['postalcode', 'telephonenumber']
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-template=cn=sales,cn=classicCoS,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:304 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:259 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:267 Check that COS definition is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_pointer 0.28
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:322 Run ldifgen to create COS definition ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=pointer
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Postal_Def_pointer
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos pointer definitions,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr=['postalcode', 'telephonenumber']
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-template=cn=sales,cn=pointerCoS,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:304 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:325 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:333 Check that COS definition is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_indirect 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:387 Run ldifgen to create COS definition ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=indirect
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Postal_Def_indirect
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos indirect definitions,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-specifier=businessCategory
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr=['postalcode', 'telephonenumber']
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:304 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:390 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:398 Check that COS definition is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_cos_template 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:449 Run ldifgen to create COS template ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Template
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=cos templates,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-priority=1
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - cos-attr-val=postalcode:12345
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:341 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:452 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:460 Check that COS template is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_managed_role 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:511 Run ldifgen to create managed role ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Managed_Role
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=managed roles,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=managed
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:391 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:514 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:522 Check that managed role is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_filtered_role 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:571 Run ldifgen to create filtered role ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Filtered_Role
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=filtered roles,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=filtered
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - filter="objectclass=posixAccount"
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:391 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:574 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:582 Check that filtered role is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_nested_role 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:632 Run ldifgen to create nested role ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - NAME=My_Nested_Role
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=ou=nested roles,dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - type=nested
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - role-dn=['cn=some_role,ou=roles,dc=example,dc=com']
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:391 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:635 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:643 Check that nested role is imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_mod_ldif_mixed 38.32
------------------------------Captured stderr call------------------------------
ldap_modify: Operation not allowed on RDN (67) ldap_modify: Operation not allowed on RDN (67) ldap_modify: Operation not allowed on RDN (67) ldap_modify: Operation not allowed on RDN (67) ldap_modify: Operation not allowed on RDN (67) ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN 
ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldap_rename: Invalid DN syntax (34) additional info: invalid RDN ldapmodify: extra lines at end (line 43453, entry "uid=user0999,dc=example,dc=com")
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:702 Run ldifgen to create modification ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - parent=dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-users=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - delete-users=True
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - create-parent=False
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - num-users=1000
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - add-users=100
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - del-users=999
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - modrdn-users=100
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - mod-users=10
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - mod-attrs=['cn', 'uid', 'sn']
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - randomize=False
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:467 Successfully created LDIF file: /var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:705 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:710 Get number of accounts before import
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:719 Check that some accounts are imported
Passed suites/clu/dbgen_test.py::test_dsconf_dbgen_nested_ldif 27.46
------------------------------Captured stderr call------------------------------
ldap_add: Already exists (68)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:759 Run ldifgen to create nested ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:56 Generating LDIF with the following options:
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - suffix=dc=example,dc=com
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - node-limit=100
INFO  tests.suites.clu.dbgen_test:dbgen.py:61 - num-users=600
INFO  tests.suites.clu.dbgen_test:dbgen.py:62 - ldif-file=/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif
INFO  tests.suites.clu.dbgen_test:dbgen.py:63 Writing LDIF ...
INFO  tests.suites.clu.dbgen_test:dbgen.py:500 Successfully created nested LDIF file (/var/lib/dirsrv/slapd-standalone1/ldif/created.ldif) containing 6 nodes/subtrees
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:762 Check if file exists
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:67 Check if content is present in output
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:71 Reset log file for next test
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:767 Get number of accounts before import
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:57 Add entries from ldif file with ldapmodify
INFO  tests.suites.clu.dbgen_test:dbgen_test.py:779 Check that accounts are imported
Passed suites/clu/dbmon_test.py::test_dsconf_dbmon 0.57
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbmon_test:dbmon_test.py:164 Sanity check for syntax INFO  LogCapture:monitor.py:247 DB Monitor Report: 2020-10-30 20:39:35 INFO  LogCapture:monitor.py:248 -------------------------------------------------------- INFO  LogCapture:monitor.py:249 Database Cache: INFO  LogCapture:monitor.py:250 - Cache Hit Ratio: 100% INFO  LogCapture:monitor.py:251 - Free Space: 486.95 MB INFO  LogCapture:monitor.py:252 - Free Percentage: 100.0% INFO  LogCapture:monitor.py:253 - RO Page Drops: 0 INFO  LogCapture:monitor.py:254 - Pages In: 0 INFO  LogCapture:monitor.py:255 - Pages Out: 0 INFO  LogCapture:monitor.py:256 INFO  LogCapture:monitor.py:257 Normalized DN Cache: INFO  LogCapture:monitor.py:258 - Cache Hit Ratio: 72% INFO  LogCapture:monitor.py:259 - Free Space: 19.99 MB INFO  LogCapture:monitor.py:260 - Free Percentage: 99.9% INFO  LogCapture:monitor.py:261 - DN Count: 71 INFO  LogCapture:monitor.py:262 - Evictions: 0 INFO  LogCapture:monitor.py:263 INFO  LogCapture:monitor.py:264 Backends: INFO  LogCapture:monitor.py:266 - dc=example,dc=com (userRoot): INFO  LogCapture:monitor.py:267 - Entry Cache Hit Ratio: 40% INFO  LogCapture:monitor.py:268 - Entry Cache Count: 5 INFO  LogCapture:monitor.py:269 - Entry Cache Free Space: 1.31 GB INFO  LogCapture:monitor.py:270 - Entry Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:271 - Entry Cache Average Size: 3.65 KB INFO  LogCapture:monitor.py:272 - DN Cache Hit Ratio: 0% INFO  LogCapture:monitor.py:273 - DN Cache Count: 5 INFO  LogCapture:monitor.py:274 - DN Cache Free Space: 192.0 MB INFO  LogCapture:monitor.py:275 - DN Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:276 - DN Cache Average Size: 67.0 B INFO  LogCapture:monitor.py:286 INFO  tests.suites.clu.dbmon_test:dbmon_test.py:133 Clear the log INFO  tests.suites.clu.dbmon_test:dbmon_test.py:171 Sanity check for --indexes output INFO  LogCapture:monitor.py:247 DB Monitor Report: 2020-10-30 20:39:35 INFO  LogCapture:monitor.py:248 -------------------------------------------------------- INFO  LogCapture:monitor.py:249 Database Cache: INFO  LogCapture:monitor.py:250 - Cache Hit Ratio: 100% INFO  LogCapture:monitor.py:251 - Free Space: 486.95 MB INFO  LogCapture:monitor.py:252 - Free Percentage: 100.0% INFO  LogCapture:monitor.py:253 - RO Page Drops: 0 INFO  LogCapture:monitor.py:254 - Pages In: 0 INFO  LogCapture:monitor.py:255 - Pages Out: 0 INFO  LogCapture:monitor.py:256 INFO  LogCapture:monitor.py:257 Normalized DN Cache: INFO  LogCapture:monitor.py:258 - Cache Hit Ratio: 72% INFO  LogCapture:monitor.py:259 - Free Space: 19.99 MB INFO  LogCapture:monitor.py:260 - Free Percentage: 99.9% INFO  LogCapture:monitor.py:261 - DN Count: 71 INFO  LogCapture:monitor.py:262 - Evictions: 0 INFO  LogCapture:monitor.py:263 INFO  LogCapture:monitor.py:264 Backends: INFO  LogCapture:monitor.py:266 - dc=example,dc=com (userRoot): INFO  LogCapture:monitor.py:267 - Entry Cache Hit Ratio: 40% INFO  LogCapture:monitor.py:268 - Entry Cache Count: 5 INFO  LogCapture:monitor.py:269 - Entry Cache Free Space: 1.31 GB INFO  LogCapture:monitor.py:270 - Entry Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:271 - Entry Cache Average Size: 3.65 KB INFO  LogCapture:monitor.py:272 - DN Cache Hit Ratio: 0% INFO  LogCapture:monitor.py:273 - DN Cache Count: 5 INFO  LogCapture:monitor.py:274 - DN Cache Free Space: 192.0 MB INFO  LogCapture:monitor.py:275 - DN Cache Free Percentage: 100.0% INFO  LogCapture:monitor.py:276 - DN Cache Average Size: 67.0 B INFO  LogCapture:monitor.py:278 - 
Indexes: INFO  LogCapture:monitor.py:280 - Index: cn.db INFO  LogCapture:monitor.py:281 - Cache Hit: 0 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: parentid.db INFO  LogCapture:monitor.py:281 - Cache Hit: 0 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: ancestorid.db INFO  LogCapture:monitor.py:281 - Cache Hit: 0 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: nsuniqueid.db INFO  LogCapture:monitor.py:281 - Cache Hit: 0 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: objectclass.db INFO  LogCapture:monitor.py:281 - Cache Hit: 14 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: uid.db INFO  LogCapture:monitor.py:281 - Cache Hit: 0 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: numsubordinates.db INFO  LogCapture:monitor.py:281 - Cache Hit: 0 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: entryrdn.db INFO  LogCapture:monitor.py:281 - Cache Hit: 16 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: aci.db INFO  LogCapture:monitor.py:281 - Cache Hit: 3 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:280 - Index: id2entry.db INFO  LogCapture:monitor.py:281 - Cache Hit: 8 INFO  LogCapture:monitor.py:282 - Cache Miss: 0 INFO  LogCapture:monitor.py:283 - Page In: 0 INFO  LogCapture:monitor.py:284 - Page Out: 0 INFO  LogCapture:monitor.py:285 INFO  LogCapture:monitor.py:286 INFO  tests.suites.clu.dbmon_test:dbmon_test.py:133 Clear the log INFO  tests.suites.clu.dbmon_test:dbmon_test.py:179 Sanity check for --json output INFO  LogCapture:monitor.py:245 { "date": "2020-10-30 20:39:35", "dbcache": { "hit_ratio": "100", "free": "486.95 MB", "free_percentage": "100.0", "roevicts": "0", "pagein": "0", "pageout": "0" }, "ndncache": { "hit_ratio": "72", "free": "19.99 MB", "free_percentage": "99.9", "count": "71", "evictions": "0" }, "backends": { "userRoot": { "suffix": "dc=example,dc=com", "entry_cache_count": "5", "entry_cache_free": "1.31 GB", "entry_cache_free_percentage": "100.0", "entry_cache_size": "3.65 KB", "entry_cache_hit_ratio": "40", "dn_cache_count": "5", "dn_cache_free": "192.0 MB", "dn_cache_free_percentage": "100.0", "dn_cache_size": "67.0 B", 
"dn_cache_hit_ratio": "0", "indexes": [ { "name": "cn.db", "cachehit": "0", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "parentid.db", "cachehit": "0", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "ancestorid.db", "cachehit": "0", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "nsuniqueid.db", "cachehit": "0", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "objectclass.db", "cachehit": "14", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "uid.db", "cachehit": "0", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "numsubordinates.db", "cachehit": "0", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "entryrdn.db", "cachehit": "16", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "aci.db", "cachehit": "3", "cachemiss": "0", "pagein": "0", "pageout": "0" }, { "name": "id2entry.db", "cachehit": "8", "cachemiss": "0", "pagein": "0", "pageout": "0" } ] } } } INFO  tests.suites.clu.dbmon_test:dbmon_test.py:133 Clear the log
Passed suites/clu/dbverify_test.py::test_dsctl_dbverify 2.65
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
[30/Oct/2020:20:39:49.327178660 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dbverify_test:dbverify_test.py:63 Run dbverify
INFO  tests.suites.clu.dbverify_test:dbtasks.py:88 dbverify successful
INFO  tests.suites.clu.dbverify_test:dbverify_test.py:67 Check dbverify was successful
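dbverify works on the database files directly rather than through a running server. A hypothetical driver is sketched below; the dsctl subcommand spellings are assumptions, not taken from this run.

import subprocess

subprocess.run(['dsctl', 'standalone1', 'stop'], check=True)
subprocess.run(['dsctl', 'standalone1', 'dbverify'], check=True)  # assumed subcommand
subprocess.run(['dsctl', 'standalone1', 'start'], check=True)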
Passed suites/clu/dsidm_config_test.py::test_dsidm_config_sssd 4.53
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
# # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307 ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. # ldap_access_filter = (memberOf=<dn>) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home # # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307bis ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. 
ldap_access_filter = (memberOf=cn=new_group,ou=groups,dc=example,dc=com) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:101 Create sssd.conf content DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:114 # # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307 ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. # ldap_access_filter = (memberOf=<dn>) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:104 Check if config creation was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:51 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:56 Check if value is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:59 Reset log file for next test INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:107 Now we test allowed_group argument INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:108 Enable MemberOf plugin INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:113 Create test group INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:118 Create sssd.conf content with allowed group DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:114 # # sssd.conf # Generated by 389 Directory Server - dsidm # # For more details see man sssd.conf and man sssd-ldap # Be sure to review the content of this file to ensure it is secure and correct # in your environment. [domain/ldap] # Uncomment this for more verbose logging. # debug_level=3 # Cache hashes of user authentication for offline auth. 
cache_credentials = True id_provider = ldap auth_provider = ldap access_provider = ldap chpass_provider = ldap ldap_schema = rfc2307bis ldap_search_base = dc=example,dc=com ldap_uri = ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records, you can use the following instead. This derives # from your ldap_search_base. # ldap_uri = _srv_ ldap_tls_reqcert = demand # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs ldap_tls_cacertdir = /etc/openldap/certs # Path to the cacert # ldap_tls_cacert = /etc/openldap/certs/ca.crt # Only users who match this filter can login and authorise to this machine. Note # that users who do NOT match, will still have their uid/gid resolve, but they # can't login. ldap_access_filter = (memberOf=cn=new_group,ou=groups,dc=example,dc=com) enumerate = false access_provider = ldap ldap_user_member_of = memberof ldap_user_gecos = cn ldap_user_uuid = nsUniqueId ldap_group_uuid = nsUniqueId # This is really important as it allows SSSD to respect nsAccountLock ldap_account_expire_policy = rhds ldap_access_order = filter, expire # Setup for ssh keys # Inside /etc/ssh/sshd_config add the lines: # AuthorizedKeysCommand /usr/bin/sss_ssh_authorizedkeys # AuthorizedKeysCommandUser nobody # You can test with the command: sss_ssh_authorizedkeys <username> ldap_user_ssh_public_key = nsSshPublicKey # This prevents an issue where the Directory is recursively walked on group # and user look ups. It makes the client faster and more responsive in almost # every scenario. ignore_group_members = False [sssd] services = nss, pam, ssh, sudo config_file_version = 2 domains = ldap [nss] homedir_substring = /home INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:123 Check if config creation was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:47 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:59 Reset log file for next test
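A generated sssd.conf like the one above can be sanity-checked with the stdlib configparser. This is an illustrative sketch: the path is a placeholder, strict=False is needed because the generated [domain/ldap] section repeats access_provider, and the group DN is the one from this run's output.

import configparser

cfg = configparser.ConfigParser(strict=False)  # generated file repeats access_provider
cfg.read('/etc/sssd/sssd.conf')                # placeholder path
domain = cfg['domain/ldap']
assert domain['ldap_search_base'] == 'dc=example,dc=com'
assert domain['ldap_account_expire_policy'] == 'rhds'
# With the allowed-group variant, logins are pinned to the group's members:
print(domain.get('ldap_access_filter'))  # (memberOf=cn=new_group,ou=groups,dc=example,dc=com)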
Passed suites/clu/dsidm_config_test.py::test_dsidm_config_ldap 0.00
------------------------------Captured stdout call------------------------------
# # OpenLDAP client configuration # Generated by 389 Directory Server - dsidm # # See ldap.conf(5) for details # This file should be world readable but not world writable. BASE dc=example,dc=com # Remember to check this: you can have multiple uris on this line. You may have # multiple servers or load balancers in your environment. URI ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records you can use: # URI ldaps:///dc%3Dexample%2Cdc%3Dcom DEREF never # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs TLS_CACERTDIR /etc/openldap/certs # TLS_CACERT /etc/openldap/certs/ca.crt
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:151 Create ldap.conf content DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:155 # # OpenLDAP client configuration # Generated by 389 Directory Server - dsidm # # See ldap.conf(5) for details # This file should be world readable but not world writable. BASE dc=example,dc=com # Remember to check this: you can have multiple uris on this line. You may have # multiple servers or load balancers in your environment. URI ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 # If you have DNS SRV records you can use: # URI ldaps:///dc%3Dexample%2Cdc%3Dcom DEREF never # To use cacert dir, place *.crt files in this path then run: # /usr/bin/openssl rehash /etc/openldap/certs # or (for older versions of openssl) # /usr/bin/c_rehash /etc/openldap/certs TLS_CACERTDIR /etc/openldap/certs # TLS_CACERT /etc/openldap/certs/ca.crt INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:154 Check if config creation was successful INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:51 Check if content is present in output INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:59 Reset log file for next test
Passed suites/clu/dsidm_config_test.py::test_dsidm_config_display 4.41
------------------------------Captured stdout call------------------------------
# This is a generic list of LDAP client configuration parameters you may require
# for connecting a client to this server. Some of them may or may not apply
# to your application, so consult your application documentation for further
# assistance.
#
# This program makes a number of assumptions about your data and configuration
# which may not be correct. Be sure to check these values for your situation.

; ldap uri
; This is the uri of the server you will connect to and authenticate to. It
; must be a valid subjectAltName in the presented TLS certificate. Note that this
; is not an exhaustive list of your LDAP servers, and other applications in your
; network like load balancers may affect this. This is just what we derive from
; your current connection.
ldap_uri = ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901

; ldap dns discovery uri
; In some environments, you may have DNS SRV records such as
; "_ldap._tcp.<domain name>". If these are present in your dns server, you can
; use the following uri.
ldap_uri = ldaps:///dc%3Dexample%2Cdc%3Dcom

; ca_cert
; To correctly use TLS, you require the valid CA cert that issued your LDAP TLS
; certificates. Sometimes a copy of this may be in your server instance as
ca_cert = /etc/dirsrv/slapd-<instance>/ca.crt
; However that's not guaranteed. You can show the certs from the LDAP server
; by sshing to the server and running:
certutil -L -d /etc/dirsrv/slapd-<instance>/
; If you can identify the CA certificate name, you can then view it with:
certutil -L -n <ca cert name> -a -d /etc/dirsrv/slapd-<instance>/
; This should be a pem file you can use in your application's CA.
; Some applications don't require a ca certificate parameter, and will use the
; ca certificate from /etc/openldap/ldap.conf. You should configure ldap.conf
; in these cases. See the 'client_config ldap.conf' command in dsidm.

; basedn
; The basedn is the root suffix where all searches will originate from for
; LDAP objects.
basedn = dc=example,dc=com

; schema_type
; LDAP servers have different ways to structure their objects and group
; relationships. Legacy servers will use rfc2307, whereas modern servers will
; use rfc2307bis (requires MemberOf plugin to be enabled). This is the schema
; setting of your directory based on your running configuration (if we can
; detect it).
schema_type = rfc2307bis

; user/account basedn
; Some applications may optionally use a user/account basedn to limit searches
; in the directory. This can be for performance or security reasons. Generally
; you shouldn't need this, preferring to use groups and filters for access
; control.
user_basedn = ou=people,dc=example,dc=com

; user filter
; This is an ldap filter that will return only user objects. Additionally some
; applications will template into the filter (similar to sql statements) or they
; will generate the filter based on attributes. We list a number of possible
; filters you might use, but you should customise this for your application.
;
; If you are using rfc2307bis, you can use this filter to provide authorisation
; support by adding filters such as: (memberOf=<groupdn>)
user_filter = (&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount))
user_filter = (&(&(objectclass=nsPerson)(objectclass=nsAccount)(objectclass=nsOrgPerson)(objectclass=posixAccount))(|(uid=<PARAM>)(displayName=<PARAM>)(cn=<PARAM>)))

; group basedn
; Some applications may optionally use a group basedn to limit searches in the
; directory. This can be for performance or security reasons. Generally you
; shouldn't need this, preferring to use groups and filters for access control.
group_basedn = ou=Groups,dc=example,dc=com

; group filter
; This is an ldap filter that will return only group objects. Additionally
; some applications will template into the filter (similar to sql statements)
; or they will generate the filter based on attributes. We list a number of
; possible filters you might use, but you should customise this for your
; application.
group_filter = (&(objectclass=groupOfNames))
group_filter = (&(&(objectclass=groupOfNames))(|(cn=<PARAM>)))

; attribute mappings
; Due to the variety of schemas and attribute mappings in LDAP, there are
; different representations of attributes and values. This is a guess at
; the mappings that exist in your server, and what attributes you should
; configure and use.
unique id = nsUniqueId
user rdn = uid
user identifier = uid
group rdn = cn
group member attribute = member

[The identical configuration block is printed a second time by the test, after the MemberOf plugin is enabled.]
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:195 Test dsidm display option
DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:290 [client configuration block, identical to the captured stdout above]
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:198 Check if display option was successful
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:51 Check if content is present in output
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:56 Check if value is present in output
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:59 Reset log file for next test
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:201 Enable MemberOf plugin
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:206 Test dsidm display option with MemberOf plugin
DEBUG  tests.suites.clu.dsidm_config_test:client_config.py:290 [client configuration block, identical to the captured stdout above]
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:209 Check if display option was successful with MemberOf plugin enabled
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:51 Check if content is present in output
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:56 Check if value is present in output
INFO  tests.suites.clu.dsidm_config_test:dsidm_config_test.py:59 Reset log file for next test
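The parameters displayed above map directly onto client code. A minimal sketch using python-ldap with the ldap_uri, basedn and user_filter from the generated block; the bind DN and password are hypothetical placeholders:

import ldap

# Values taken from the generated client configuration above.
LDAP_URI = "ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901"
BASEDN = "dc=example,dc=com"
USER_FILTER = ("(&(objectclass=nsPerson)(objectclass=nsAccount)"
               "(objectclass=nsOrgPerson)(objectclass=posixAccount))")

conn = ldap.initialize(LDAP_URI)
# Hypothetical credentials - substitute an account from your directory.
conn.simple_bind_s("uid=demo_user,ou=people,dc=example,dc=com", "password")

# The user_filter is designed to match only user objects.
for dn, attrs in conn.search_s(BASEDN, ldap.SCOPE_SUBTREE, USER_FILTER, ["uid", "cn"]):
    print(dn, attrs)
conn.unbind_s()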
Passed suites/clu/fixup_test.py::test_posix_winsync_fixup 8.74
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.clu.fixup_test:fixup_test.py:73 Enable POSIXWinsyncPlugin
INFO  tests.suites.clu.fixup_test:fixup_test.py:77 Stopping the server and importing posix accounts
INFO  tests.suites.clu.fixup_test:fixup_test.py:87 Run Fixup task
INFO  tests.suites.clu.fixup_test:posix_winsync.py:29 Attempting to add task entry...
INFO  tests.suites.clu.fixup_test:posix_winsync.py:39 Successfully added task entry
INFO  tests.suites.clu.fixup_test:fixup_test.py:90 Check log if fixup task was successful
Passed suites/config/autotuning_test.py::test_threads_basic 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:39 Set nsslapd-threadnumber: -1 to enable autotuning
INFO  tests.suites.config.autotuning_test:autotuning_test.py:42 Assert nsslapd-threadnumber is equal to the documented expected value
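The autotuning toggle exercised here can be driven through lib389. A minimal sketch, assuming `inst` is an already-connected DirSrv instance object:

# Sketch only: `inst` is assumed to be a connected lib389 DirSrv instance.
inst.config.set('nsslapd-threadnumber', '-1')  # -1 requests autotuning

# The server replaces -1 with the computed thread count, so the value
# read back should be a positive integer derived from the host's CPUs.
autotuned = inst.config.get_attr_val_utf8('nsslapd-threadnumber')
assert int(autotuned) > 0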
Passed suites/config/autotuning_test.py::test_threads_warning 1.25
No log output captured.
Passed suites/config/autotuning_test.py::test_threads_invalid_value[-2] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:87 Set nsslapd-threadnumber: -2. Operation should fail
Passed suites/config/autotuning_test.py::test_threads_invalid_value[0] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:87 Set nsslapd-threadnumber: 0. Operation should fail
Passed suites/config/autotuning_test.py::test_threads_invalid_value[invalid] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:87 Set nsslapd-threadnumber: invalid. Operation should fail
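Each of these parametrised cases asserts that an invalid value is rejected at set time. A minimal sketch of that assertion, assuming `inst` as above; the exception class is an assumption, since the log only records that the operation must fail:

import ldap
import pytest

# Sketch: each invalid nsslapd-threadnumber value must be refused.
for bad_value in ('-2', '0', 'invalid'):
    with pytest.raises(ldap.UNWILLING_TO_PERFORM):
        inst.config.set('nsslapd-threadnumber', bad_value)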
Passed suites/config/autotuning_test.py::test_threads_back_from_manual_value 0.08
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:109 Set nsslapd-threadnumber: -1 to enable autotuning and save the new value
INFO  tests.suites.config.autotuning_test:autotuning_test.py:113 Set nsslapd-threadnumber to the autotuned value decreased by 2
INFO  tests.suites.config.autotuning_test:autotuning_test.py:118 Set nsslapd-threadnumber: -1 to enable autotuning
INFO  tests.suites.config.autotuning_test:autotuning_test.py:121 Assert nsslapd-threadnumber is back to the autotuned value
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[-] 3.41
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:180 Delete nsslapd-cache-autosize
INFO  tests.suites.config.autotuning_test:autotuning_test.py:190 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'25'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[-0] 4.52
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:180 Delete nsslapd-cache-autosize
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'0'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-400] 4.42
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'201326592'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'261517082'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'40'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[-40] 5.12
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'261517082'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:180 Delete nsslapd-cache-autosize
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'817240883'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'134217728'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'40'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-] 4.44
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'817240883'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'134217728'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:190 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'163448176'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'25'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-401] 4.62
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'163448176'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'261517082'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'40'
Passed suites/config/autotuning_test.py::test_cache_autosize_non_zero[10-0] 4.74
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:169 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:170 nsslapd-dbcachesize == b'261517082'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:171 nsslapd-cachememsize == b'469762048'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:172 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:173 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:174 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:177 Set nsslapd-cache-autosize to 10
INFO  tests.suites.config.autotuning_test:autotuning_test.py:187 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:196 Trying to set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:199 Trying to set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:216 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:217 nsslapd-dbcachesize == b'163448176'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:218 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:219 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:220 nsslapd-cache-autosize == b'10'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:221 nsslapd-cache-autosize-split == b'0'
Passed suites/config/autotuning_test.py::test_cache_autosize_basic_sane[0] 9.46
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'163448176'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'603979776'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'0'
Passed suites/config/autotuning_test.py::test_cache_autosize_basic_sane[] 9.60
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:283 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'408620441'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'1409286144'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:283 Delete nsslapd-cache-autosize-split
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'25'
Passed suites/config/autotuning_test.py::test_cache_autosize_basic_sane[40] 9.99
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'25'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 0
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'817240883'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:273 Check nsslapd-dbcachesize and nsslapd-cachememsize before the test
INFO  tests.suites.config.autotuning_test:autotuning_test.py:274 nsslapd-dbcachesize == b'817240883'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:275 nsslapd-cachememsize == b'1140850688'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:276 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:277 nsslapd-cache-autosize-split == b'40'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:280 Set nsslapd-cache-autosize-split to 40
INFO  tests.suites.config.autotuning_test:autotuning_test.py:289 Set nsslapd-dbcachesize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:291 Set nsslapd-cachememsize to 33333333
INFO  tests.suites.config.autotuning_test:autotuning_test.py:307 Check nsslapd-dbcachesize and nsslapd-cachememsize in the appropriate range.
INFO  tests.suites.config.autotuning_test:autotuning_test.py:308 nsslapd-dbcachesize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:309 nsslapd-cachememsize == b'33333333'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:310 nsslapd-dncachememsize == b'67108864'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:311 nsslapd-cache-autosize == b'0'
INFO  tests.suites.config.autotuning_test:autotuning_test.py:312 nsslapd-cache-autosize-split == b'40'
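The cache attributes in these cases live on the ldbm database configuration entry rather than directly on cn=config. A minimal sketch of setting them, assuming `inst` as above and lib389's LDBMConfig wrapper:

from lib389.config import LDBMConfig

# cn=config,cn=ldbm database,cn=plugins,cn=config
ldbmconfig = LDBMConfig(inst)

# Autosize 10% of system memory; the split then divides that memory
# between the database cache and the backend entry caches.
ldbmconfig.set('nsslapd-cache-autosize', '10')
ldbmconfig.set('nsslapd-cache-autosize-split', '40')

# The computed nsslapd-dbcachesize / nsslapd-cachememsize values are
# only recalculated at startup, hence the restarts in the tests above.
inst.restart()
print(ldbmconfig.get_attr_val_utf8('nsslapd-dbcachesize'))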
Passed suites/config/autotuning_test.py::test_cache_autosize_invalid_values[-2] 0.28
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:346 Set nsslapd-cache-autosize-split to -2 INFO  tests.suites.config.autotuning_test:autotuning_test.py:352 Set nsslapd-cache-autosize to -2
Passed suites/config/autotuning_test.py::test_cache_autosize_invalid_values[102] 0.29
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:346 Set nsslapd-cache-autosize-split to 102 INFO  tests.suites.config.autotuning_test:autotuning_test.py:352 Set nsslapd-cache-autosize to 102
Passed suites/config/autotuning_test.py::test_cache_autosize_invalid_values[invalid] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.autotuning_test:autotuning_test.py:346 Set nsslapd-cache-autosize-split to invalid INFO  tests.suites.config.autotuning_test:autotuning_test.py:352 Set nsslapd-cache-autosize to invalid
Passed suites/config/config_test.py::test_maxbersize_repl 16.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d8b3466a-3b20-4f84-b9a8-dc58e28a78dd / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b416ea4a-57d9-4e8f-b587-7a86573a30f4 / got description=d8b3466a-3b20-4f84-b9a8-dc58e28a78dd)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:69 Set nsslapd-maxbersize: 20K to master2
INFO  tests.suites.config.config_test:config_test.py:74 Try to add attribute with a big value to master2 - expect to FAIL
INFO  tests.suites.config.config_test:config_test.py:81 Try to add attribute with a big value to master1 - expect to PASS
INFO  tests.suites.config.config_test:config_test.py:86 Check if a big value was successfully added to master1
INFO  tests.suites.config.config_test:config_test.py:90 Check if a big value was successfully replicated to master2
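For reference, a condensed sketch of the scenario this test walks through, assuming `master1` and `master2` are the connected DirSrv instances from the topology above; the exception the client sees when the BER limit is exceeded is an assumption:

import ldap
from lib389.idm.user import UserAccounts
from lib389._constants import DEFAULT_SUFFIX

master2.config.set('nsslapd-maxbersize', '20480')  # cap incoming requests at 20K

# create_test_user() creates uid=test_user_1000 by default.
user = UserAccounts(master1, DEFAULT_SUFFIX).create_test_user()
big_value = b'x' * 30000

# A direct write to master2 exceeds the limit and is refused (assumed
# here to surface in python-ldap as SERVERDOWN when the server drops
# the connection).
try:
    UserAccounts(master2, DEFAULT_SUFFIX).get('test_user_1000').add('description', big_value)
except ldap.SERVERDOWN:
    pass

# The same write against master1 succeeds, and the oversized value is
# then accepted on master2 over the replication connection.
user.add('description', big_value)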
Passed suites/config/config_test.py::test_config_listen_backport_size 0.27
No log output captured.
Passed suites/config/config_test.py::test_config_deadlock_policy 0.07
No log output captured.
Passed suites/config/config_test.py::test_defaultnamingcontext 1.86
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:220 Check the attribute nsslapd-defaultnamingcontext is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:223 Delete nsslapd-defaultnamingcontext attribute
INFO  tests.suites.config.config_test:config_test.py:230 Modify nsslapd-defaultnamingcontext with new suffix
INFO  tests.suites.config.config_test:config_test.py:233 Add new invalid value at runtime to nsslapd-defaultnamingcontext
INFO  tests.suites.config.config_test:config_test.py:237 Modify nsslapd-defaultnamingcontext with blank value
INFO  tests.suites.config.config_test:config_test.py:240 Add new suffix when nsslapd-defaultnamingcontext is empty
INFO  tests.suites.config.config_test:config_test.py:244 Check that the value of nsslapd-defaultnamingcontext automatically picks up the new suffix
INFO  tests.suites.config.config_test:config_test.py:247 Add new suffix when nsslapd-defaultnamingcontext is not empty
INFO  tests.suites.config.config_test:config_test.py:251 Check that the value of nsslapd-defaultnamingcontext has not changed
INFO  tests.suites.config.config_test:config_test.py:254 Remove the newly added suffix and check that the value of the attribute has not changed
INFO  tests.suites.config.config_test:config_test.py:258 Remove all the suffixes at the end
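The attribute under test can be inspected and updated like any other cn=config value. A minimal sketch, assuming `inst` as above; the replacement suffix is a placeholder:

# nsslapd-defaultnamingcontext advertises the default suffix in the
# root DSE; as the test verifies, it follows suffix additions and
# removals automatically when it is empty or its suffix disappears.
ctx = inst.config.get_attr_val_utf8('nsslapd-defaultnamingcontext')
print(ctx)  # e.g. dc=example,dc=com

# Point it at another existing suffix at runtime (placeholder DN).
inst.config.set('nsslapd-defaultnamingcontext', 'dc=newexample,dc=com')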
Passed suites/config/config_test.py::test_allow_add_delete_config_attributes 5.12
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:308 Add a new valid attribute at runtime to cn=config
INFO  tests.suites.config.config_test:config_test.py:312 Delete nsslapd-listenhost to restore the default value
INFO  tests.suites.config.config_test:config_test.py:317 Add new invalid attribute at runtime to cn=config
INFO  tests.suites.config.config_test:config_test.py:321 Make sure the invalid attribute is not added
Passed suites/config/config_test.py::test_ignore_virtual_attrs 0.74
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:354 Check the attribute nsslapd-ignore-virtual-attrs is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:357 Check the default value of attribute nsslapd-ignore-virtual-attrs is OFF
INFO  tests.suites.config.config_test:config_test.py:360 Set the valid values i.e. on/ON and off/OFF for nsslapd-ignore-virtual-attrs
INFO  tests.suites.config.config_test:config_test.py:365 Set invalid value for attribute nsslapd-ignore-virtual-attrs
INFO  tests.suites.config.config_test:config_test.py:376 Add cosPointer, cosTemplate and test entry to default suffix, where the virtual attribute is postal code
INFO  tests.suites.config.config_test:config_test.py:389 Test that the virtual attribute (postal code) is shown in the test entry while nsslapd-ignore-virtual-attrs: off
INFO  tests.suites.config.config_test:config_test.py:392 Set nsslapd-ignore-virtual-attrs=on
INFO  tests.suites.config.config_test:config_test.py:395 Test that the virtual attribute (postal code) is not shown while nsslapd-ignore-virtual-attrs: on
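A minimal sketch of the toggle this test exercises, assuming `inst` as above:

# With CoS entries like those in the test in place, postalCode is a
# virtual attribute generated at read time; turning the switch on
# suppresses virtual attribute processing.
inst.config.set('nsslapd-ignore-virtual-attrs', 'on')
assert inst.config.get_attr_val_utf8('nsslapd-ignore-virtual-attrs') == 'on'

# Setting it back to 'off' restores virtual attribute processing.
inst.config.set('nsslapd-ignore-virtual-attrs', 'off')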
Passed suites/config/config_test.py::test_ndn_cache_enabled 8.52
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.config_test:config_test.py:423 Check the attribute nsslapd-ndn-cache-enabled is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:426 Check the attribute nsslapd-ndn-cache-enabled has the default value set as ON
INFO  tests.suites.config.config_test:config_test.py:429 Check the attribute nsslapd-ndn-cache-max-size is present in cn=config
INFO  tests.suites.config.config_test:config_test.py:435 Ticket#49593 : NDN cache stats should be under the global stats - Implemented in 1.4
INFO  tests.suites.config.config_test:config_test.py:436 Fetch the monitor value according to the ds version
INFO  tests.suites.config.config_test:config_test.py:442 Check the backend monitor output for Normalized DN cache statistics, while nsslapd-ndn-cache-enabled is off
INFO  tests.suites.config.config_test:config_test.py:448 Check the backend monitor output for Normalized DN cache statistics, while nsslapd-ndn-cache-enabled is on
INFO  tests.suites.config.config_test:config_test.py:454 Set invalid value for nsslapd-ndn-cache-enabled
INFO  tests.suites.config.config_test:config_test.py:458 Set invalid value for nsslapd-ndn-cache-max-size
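On 1.4-series servers the Normalized DN cache statistics sit under the global ldbm monitor, per the Ticket#49593 note above. A minimal sketch of reading them, assuming `inst` as above; the statistic attribute name is an assumption based on the monitor entry layout:

from lib389.monitor import MonitorLDBM

monitor = MonitorLDBM(inst)
# Assumed statistic name; inspect the monitor entry for the full set.
print(monitor.get_attr_val_utf8('normalizeddncachehits'))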
Passed suites/config/config_test.py::test_require_index 1.45
No log output captured.
Passed suites/config/config_test.py::test_require_internal_index 5.44
No log output captured.
Passed suites/config/regression_test.py::test_maxbersize_repl 7.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.regression_test:regression_test.py:100 Set nsslapd-errorlog-maxlogsize before nsslapd-errorlog-logmaxdiskspace
INFO  tests.suites.config.regression_test:regression_test.py:104 Assert no init_dse_file errors in the error log
INFO  tests.suites.config.regression_test:regression_test.py:108 Set nsslapd-errorlog-maxlogsize after nsslapd-errorlog-logmaxdiskspace
INFO  tests.suites.config.regression_test:regression_test.py:112 Assert no init_dse_file errors in the error log
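The regression covered here concerned the relative order of two log-sizing writes. A minimal sketch of the attributes involved, assuming `inst` as above; the sizes are placeholders:

# Both attributes are expressed in megabytes. Whichever is written
# first, neither order should leave init_dse_file errors in the log.
inst.config.set('nsslapd-errorlog-maxlogsize', '300')
inst.config.set('nsslapd-errorlog-logmaxdiskspace', '500')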
Passed suites/config/removed_config_49298_test.py::test_restore_config 3.56
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.removed_config_49298_test:removed_config_49298_test.py:43 /etc/dirsrv/slapd-standalone1
Passed suites/config/removed_config_49298_test.py::test_removed_config 2.46
-------------------------------Captured log call--------------------------------
INFO  tests.suites.config.removed_config_49298_test:removed_config_49298_test.py:72 /etc/dirsrv/slapd-standalone1
Passed suites/cos/cos_test.py::test_positive 0.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/cos/indirect_cos_test.py::test_indirect_cos 1.55
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:113 Add custom schema...
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:126 Add test user...
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:143 Setup indirect COS...
------------------------------Captured stdout call------------------------------
Successfully created subtree password policy
-------------------------------Captured log call--------------------------------
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:163 Checking user...
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:60 Create password policy for subtree ou=people,dc=example,dc=com
INFO  tests.suites.cos.indirect_cos_test:indirect_cos_test.py:170 Checking user...
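An indirect CoS follows a DN-valued attribute on the target entry to find the entry supplying the generated value. A minimal sketch of such a definition with python-ldap; the entry name, the 'manager' specifier and the credentials are illustrative, not taken from the suite:

import ldap
import ldap.modlist

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # placeholder credentials

# cosIndirectSpecifier names the attribute on each user entry whose DN
# value points at the entry that supplies the generated cosAttribute.
cos_def = {
    'objectClass': [b'top', b'cosSuperDefinition', b'cosIndirectDefinition'],
    'cn': [b'indirectCosExample'],
    'cosIndirectSpecifier': [b'manager'],
    'cosAttribute': [b'postalCode'],
}
conn.add_s("cn=indirectCosExample,ou=people,dc=example,dc=com",
           ldap.modlist.addModlist(cos_def))
conn.unbind_s()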
Passed suites/disk_monitoring/disk_monitoring_test.py::test_verify_operation_when_disk_monitoring_is_off 4.59
-----------------------------Captured stdout setup------------------------------
Relabeled /var/log/dirsrv/slapd-standalone1 from unconfined_u:object_r:user_tmp_t:s0 to system_u:object_r:dirsrv_var_log_t:s0
-----------------------------Captured stderr setup------------------------------
chown: cannot access '/var/log/dirsrv/slapd-standalone1/*': No such file or directory
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
25+0 records in
25+0 records out
26214400 bytes (26 MB, 25 MiB) copied, 0.0106956 s, 2.5 GB/s
dd: error writing '/var/log/dirsrv/slapd-standalone1/foo1': No space left on device
10+0 records in
9+0 records out
10465280 bytes (10 MB, 10 MiB) copied, 0.00470652 s, 2.2 GB/s
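The dd runs in the captured stderr are the suite deliberately filling the log partition until writes fail. The behaviour under test is governed by a handful of cn=config attributes; a minimal sketch of enabling disk monitoring, assuming `inst` as above and a placeholder threshold:

# Sketch only: `inst` is a connected lib389 DirSrv instance.
inst.config.set('nsslapd-disk-monitoring', 'on')
# Threshold is in bytes. When free space drops below it the server
# starts reducing logging; below half of the threshold it will
# eventually shut down - the scenarios exercised in the tests below.
inst.config.set('nsslapd-disk-monitoring-threshold', str(2 * 1024 * 1024))
inst.config.set('nsslapd-disk-monitoring-logging-critical', 'off')
inst.restart()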
Passed suites/disk_monitoring/disk_monitoring_test.py::test_free_up_the_disk_space_and_change_ds_config 4.47
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_verify_operation_with_nsslapd_disk_monitoring_logging_critical_off 34.69
------------------------------Captured stderr call------------------------------
10+0 records in 10+0 records out 10485760 bytes (10 MB, 10 MiB) copied, 0.00477384 s, 2.2 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_operation_with_nsslapd_disk_monitoring_logging_critical_on_below_half_of_the_threshold 25.59
------------------------------Captured stderr call------------------------------
31+0 records in 31+0 records out 32505856 bytes (33 MB, 31 MiB) copied, 0.0517873 s, 628 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_setting_nsslapd_disk_monitoring_logging_critical_to_off 3.47
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_operation_with_nsslapd_disk_monitoring_logging_critical_off 72.35
------------------------------Captured stderr call------------------------------
10+0 records in 10+0 records out 10485760 bytes (10 MB, 10 MiB) copied, 0.0109736 s, 956 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_operation_with_nsslapd_disk_monitoring_logging_critical_off_below_half_of_the_threshold 160.73
------------------------------Captured stderr call------------------------------
30+0 records in 30+0 records out 31457280 bytes (31 MB, 30 MiB) copied, 0.0155188 s, 2.0 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_go_straight_below_half_of_the_threshold 107.70
------------------------------Captured stderr call------------------------------
31+0 records in 31+0 records out 32505856 bytes (33 MB, 31 MiB) copied, 0.0370784 s, 877 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_readonly_on_threshold 28.21
------------------------------Captured stderr call------------------------------
10+0 records in 10+0 records out 10485760 bytes (10 MB, 10 MiB) copied, 0.0118259 s, 887 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_readonly_on_threshold_below_half_of_the_threshold 49.93
------------------------------Captured stderr call------------------------------
31+0 records in 31+0 records out 32505856 bytes (33 MB, 31 MiB) copied, 0.0141035 s, 2.3 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_below_half_of_the_threshold_not_starting_after_shutdown 112.08
------------------------------Captured stderr call------------------------------
31+0 records in 31+0 records out 32505856 bytes (33 MB, 31 MiB) copied, 0.0174398 s, 1.9 GB/s
-------------------------------Captured log call--------------------------------
INFO  lib389:disk_monitoring_test.py:582 Instance startup failed as expected
Passed suites/disk_monitoring/disk_monitoring_test.py::test_go_straight_below_4kb 18.13
------------------------------Captured stderr call------------------------------
25+0 records in 25+0 records out 26214400 bytes (26 MB, 25 MiB) copied, 0.022006 s, 1.2 GB/s dd: error writing '/var/log/dirsrv/slapd-standalone1/foo1': No space left on device 10+0 records in 9+0 records out 10174464 bytes (10 MB, 9.7 MiB) copied, 0.0108522 s, 938 MB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_threshold_to_overflow_value 0.18
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_threshold_is_reached_to_half 14.35
------------------------------Captured stderr call------------------------------
10+0 records in 10+0 records out 10485760 bytes (10 MB, 10 MiB) copied, 0.00458919 s, 2.3 GB/s
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold--2] 0.01
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-9223372036854775808] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-2047] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-0] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold--1294967296] 0.01
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-threshold-invalid] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-invalid] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-1] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period-00] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period-525 948] 0.01
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period--10] 0.00
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-logging-critical-oninvalid] 0.18
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period--11] 0.01
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_negagtive_parameterize[nsslapd-disk-monitoring-grace-period-01] 0.15
No log output captured.
Passed suites/disk_monitoring/disk_monitoring_test.py::test_valid_operations_are_permitted 3.86
No log output captured.
Passed suites/disk_monitoring/disk_space_test.py::test_basic 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:disk_space_test.py:37 Check that the words "partition", "size", "used", "available", "use%" are present in the string INFO  lib389:disk_space_test.py:41 Check that the sizes are numbers
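The check above reduces to validating the header tokens and the numeric size fields of the disk-space listing. A minimal sketch of that assertion logic in Python, with a hypothetical one-row listing standing in for the real tool output:

# Minimal sketch of the disk-space output check described above.
# The sample listing is a hypothetical stand-in for the real output.
output = "partition size used available use%\n/dev/vda1 104857600 52428800 52428800 50%"

header, row = output.splitlines()
for word in ("partition", "size", "used", "available", "use%"):
    assert word in header          # all expected column names are present

_, size, used, available, _ = row.split()
assert size.isdigit() and used.isdigit() and available.isdigit()  # sizes are numbers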
Passed suites/ds_logs/ds_logs_test.py::test_check_default 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
DEBUG  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:232 on
Passed suites/ds_logs/ds_logs_test.py::test_plugin_set_invalid 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:251 test_plugin_set_invalid - Expect to fail with junk value
Passed suites/ds_logs/ds_logs_test.py::test_log_plugin_on 4.39
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:278 Bug 1273549 - Check access logs for milliseconds when attribute is ON INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:279 perform any ldap operation, which will trigger the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:35 Adding 10 users INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:283 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:286 parse the access logs
Passed suites/ds_logs/ds_logs_test.py::test_log_plugin_off 12.79
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:318 Bug 1273549 - Check access logs for missing milliseconds when attribute is OFF INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:320 test_log_plugin_off - set the configuration attribute to OFF INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:323 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:326 test_log_plugin_off - delete the previous access logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:35 Adding 10 users INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:333 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:336 check that microseconds are not present in the access log
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_server_level_0 4.39
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:364 Set nsslapd-plugin-logging to on INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:367 Configure access log level to 0 INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:371 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:375 Check if access log does not contain internal log of MOD operation INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:381 Check if the other internal operations are not present
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_server_level_4 7.00
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:413 Set nsslapd-plugin-logging to on INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:416 Configure access log level to 4 INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:420 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:425 Check if access log contains internal MOD operation in correct format INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:431 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_level_260 6.61
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:101 Enable automember plugin INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:105 Enable Referential Integrity plugin INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:109 Set nsslapd-plugin-logging to on INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:112 Restart the server INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:119 Configure access log level to 260 INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:85 Renaming user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:88 Delete the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:91 Delete automember entry, org. unit and group for the next test INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:471 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:475 Check the access logs for ADD operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:491 Check the access logs for MOD operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:505 Check the access logs for DEL operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:517 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_level_131076 5.87
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:119 Configure access log level to 131076 INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:85 Renaming user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:88 Delete the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:91 Delete automember entry, org. unit and group for the next test INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:556 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:560 Check the access logs for ADD operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:576 Check the access logs for MOD operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:590 Check the access logs for DEL operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:602 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_internal_log_level_516 7.33
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:119 Configure access log level to 516 INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:85 Renaming user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:88 Delete the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:91 Delete automember entry, org. unit and group for the next test INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:641 Restart the server to flush the logs INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:645 Check the access logs for ADD operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:662 Check the access logs for MOD operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:679 Check the access logs for DEL operation of the user INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:694 Check if the other internal operations have the correct format
Passed suites/ds_logs/ds_logs_test.py::test_access_log_truncated_search_message 4.60
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:721 Make a search INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:724 Restart the server to flush the logs
Passed suites/ds_logs/ds_logs_test.py::test_etime_order_of_magnitude 1.78
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:810 add_users INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:35 Adding 30 users INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:813 search users INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:816 parse the access logs to get the SRCH string INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:824 get the operation start time from the SRCH string INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:828 get the OP number from the SRCH string INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:832 get the RESULT string matching the SRCH OP number INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:840 get the operation end time from the RESULT string INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:844 get the logged etime for the operation from the RESULT string INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:848 Calculate the ratio between logged etime for the operation and elapsed time from its start time to its end time - should be around 1
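The steps above boil down to comparing the logged etime against the wall-clock delta between a SRCH line and its matching RESULT line. A rough sketch of that parsing in Python, assuming the standard 389-ds access-log layout (the two sample lines are hypothetical):

import re
from datetime import datetime

# Hypothetical access-log lines in the standard 389-ds format.
srch = '[30/Oct/2020:21:15:46.100415957 -0400] conn=1 op=2 SRCH base="dc=example,dc=com" scope=2 filter="(uid=*)"'
result = '[30/Oct/2020:21:15:46.150415957 -0400] conn=1 op=2 RESULT err=0 tag=101 nentries=30 etime=0.050000000'

def op_time(line):
    # Timestamps carry nanoseconds; truncate to microseconds for datetime.
    stamp = re.search(r'\[(.+?) [-+]\d{4}\]', line).group(1)
    day, nanos = stamp.split('.')
    return datetime.strptime(day + '.' + nanos[:6], '%d/%b/%Y:%H:%M:%S.%f')

elapsed = (op_time(result) - op_time(srch)).total_seconds()
etime = float(re.search(r'etime=([\d.]+)', result).group(1))
assert 0.5 < etime / elapsed < 2   # logged etime tracks elapsed time, ratio ~1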
Passed suites/ds_logs/ds_logs_test.py::test_log_base_dn_when_invalid_attr_request 3.37
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:199 Disable access log buffering
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:880 delete the previous access logs to get a fresh new one INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:883 Search the default suffix, with invalid '"" ""' attribute request INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:884 A Protocol error exception should be raised, see https://github.com/389ds/389-ds-base/issues/3028 INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:890 Check the access logs for correct messages
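What this test exercises can be reproduced with a plain python-ldap search carrying a malformed attribute request; a sketch under hypothetical connection details (the attribute list below only approximates the invalid '"" ""' request from the log):

import ldap

conn = ldap.initialize('ldap://localhost:389')   # hypothetical instance
conn.simple_bind_s('cn=Directory Manager', 'password')

try:
    # An invalid attribute list such as '""' should be rejected outright.
    conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                  '(objectclass=*)', ['""', '""'])
except ldap.PROTOCOL_ERROR:
    pass   # expected, see https://github.com/389ds/389-ds-base/issues/3028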
Passed suites/ds_logs/ds_logs_test.py::test_audit_log_rotate_and_check_string 12.34
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:936 Doing modifications to rotate audit log INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:941 Doing one more modification just in case INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:948 Check that DS string is present on first line INFO  tests.suites.ds_logs.ds_logs_test:ds_logs_test.py:952 Check that DS string is present only once
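The final two assertions amount to locating the server title string in the rotated audit log and counting its occurrences. A minimal sketch, with the path and the exact title string as illustrative assumptions:

# Path and title string are illustrative; the test derives both itself.
audit_log = '/var/log/dirsrv/slapd-standalone1/audit'
title = '389 Directory Server'

with open(audit_log) as f:
    lines = f.readlines()

assert title in lines[0]                             # present on the first line
assert sum(title in line for line in lines) == 1     # present exactly once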
Passed suites/ds_logs/regression_test.py::test_default_loglevel_stripped[24576] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/ds_logs/regression_test.py::test_default_loglevel_stripped[16512] 0.03
No log output captured.
Passed suites/ds_logs/regression_test.py::test_default_loglevel_stripped[16385] 1.08
No log output captured.
Passed suites/ds_logs/regression_test.py::test_dse_config_loglevel_error 14.21
No log output captured.
Passed suites/ds_tools/logpipe_test.py::test_user_permissions 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.ds_tools.logpipe_test:logpipe_test.py:32 Add system test user - dirsrv_testuser
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_tools.logpipe_test:logpipe_test.py:68 Try to create a logpipe in the log directory with "-u" option specifying the user
Passed suites/ds_tools/replcheck_test.py::test_state 0.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 16d2e9d6-1d36-49fd-9ba1-790f03b03d03 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b70c10dd-6466-4b24-84ad-50216524621c / got description=16d2e9d6-1d36-49fd-9ba1-790f03b03d03) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 3b96e60d-58d0-48fc-a05a-f5d895616593 / got description=b70c10dd-6466-4b24-84ad-50216524621c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  tests.suites.ds_tools.replcheck_test:replcheck_test.py:101 Export LDAPTLS_CACERTDIR env variable for ds-replcheck INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 9e6f1e23-9ccb-46f8-bf10-4d3f1dddec26 / got description=3b96e60d-58d0-48fc-a05a-f5d895616593) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect a2e49222-1b82-4210-b258-1a7d7f76ac76 / got description=9e6f1e23-9ccb-46f8-bf10-4d3f1dddec26) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working
Passed suites/ds_tools/replcheck_test.py::test_check_ruv 18.42
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/export_master1.ldif ldiffile: /tmp/export_master2.ldif
Passed suites/ds_tools/replcheck_test.py::test_missing_entries 19.71
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/export_master1.ldif ldiffile: /tmp/export_master2.ldif
Passed suites/ds_tools/replcheck_test.py::test_tombstones 20.55
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/export_master1.ldif ldiffile: /tmp/export_master2.ldif
Passed suites/ds_tools/replcheck_test.py::test_conflict_entries 29.07
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/export_master1.ldif ldiffile: /tmp/export_master2.ldif
Passed suites/ds_tools/replcheck_test.py::test_inconsistencies 23.94
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/export_master1.ldif ldiffile: /tmp/export_master2.ldif
Passed suites/ds_tools/replcheck_test.py::test_suffix_exists 0.36
No log output captured.
Passed suites/ds_tools/replcheck_test.py::test_check_missing_tombstones 17.42
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/export_master1.ldif ldiffile: /tmp/export_master2.ldif
Passed suites/ds_tools/replcheck_test.py::test_dsreplcheck_with_password_file 0.01
No log output captured.
Passed suites/ds_tools/replcheck_test.py::test_dsreplcheck_timeout_connection_mechanisms 1.29
-------------------------------Captured log call--------------------------------
INFO  tests.suites.ds_tools.replcheck_test:replcheck_test.py:535 Run ds-replcheck with -t option
Passed suites/dynamic_plugins/dynamic_plugins_test.py::test_acceptance 39.83
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5c65671e-5c3f-455c-b415-7bd0c2d885c3 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 93aaf02b-27fe-47bb-8a29-2259d1b3e195 / got description=5c65671e-5c3f-455c-b415-7bd0c2d885c3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4f7583bd-69b9-4303-a029-5a19e5b70790 / got description=93aaf02b-27fe-47bb-8a29-2259d1b3e195) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7e8d3445-e3b0-49d2-a17b-ef565df627be / got description=4f7583bd-69b9-4303-a029-5a19e5b70790) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/dynamic_plugins/dynamic_plugins_test.py::test_memory_corruption 43.75
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 500409e3-f115-4c7c-9cdf-f7a4444729ad / got description=7e8d3445-e3b0-49d2-a17b-ef565df627be) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7f51879c-1c75-4004-ae02-f572e05c9562 / got description=500409e3-f115-4c7c-9cdf-f7a4444729ad) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/dynamic_plugins/dynamic_plugins_test.py::test_stress 686.93
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 7332d741-cc7b-4864-be34-7c094783b268 / got description=7f51879c-1c75-4004-ae02-f572e05c9562) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect fa0be60f-450a-483d-8124-d6b123d768cc / got description=7332d741-cc7b-4864-be34-7c094783b268) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/dynamic_plugins/notice_for_restart_test.py::test_notice_when_dynamic_not_enabled 4.74
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/export/export_test.py::test_dbtasks_db2ldif_with_non_accessible_ldif_file_path 4.76
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/nonexistent/export.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:export_test.py:63 Stopping the instance... INFO  lib389.utils:export_test.py:66 Performing an offline export to a non-accessible ldif file path - should fail properly CRITICAL LogCapture:dbtasks.py:40 db2ldif failed INFO  lib389.utils:export_test.py:33 checking output msg INFO  lib389.utils:export_test.py:38 Clear the log INFO  lib389.utils:export_test.py:70 parsing the errors log to search for the error reported INFO  lib389.utils:export_test.py:79 Restarting the instance...
Passed suites/export/export_test.py::test_db2ldif_cli_with_non_accessible_ldif_file_path 5.24
------------------------------Captured stdout call------------------------------
db2ldif failed
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/nonexistent/export.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:export_test.py:104 Stopping the instance... INFO  lib389.utils:export_test.py:107 Performing an offline export to a non-accessible ldif file path - should fail properly INFO  lib389.utils:export_test.py:115 db2ldif failed properly: error (1) INFO  lib389.utils:export_test.py:118 parsing the errors log to search for the error reported INFO  lib389.utils:export_test.py:121 error string : '[30/Oct/2020:21:15:46.100415957 -0400] - ERR - bdb_db2ldif - db2ldif: userRoot: can\'t open /tmp/nonexistent/export.ldif: 2 (No such file or directory) while running as user "dirsrv"\n', '[30/Oct/2020:21:15:51.416205048 -0400] - ERR - bdb_db2ldif - db2ldif: userRoot: can\'t open /tmp/nonexistent/export.ldif: 2 (No such file or directory) while running as user "dirsrv"\n' INFO  lib389.utils:export_test.py:123 Restarting the instance...
Passed suites/filter/basic_filter_test.py::test_search_attr 0.34
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/bitw_filter_test.py::test_bitwise_plugin_status 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/bitw_filter_test.py::test_search_disabled_accounts 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_plugin_can_be_disabled 5.10
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_plugin_is_disabled 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_enabling_works_fine 4.55
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=513))-1] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=16777216))-1] 0.16
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=8388608))-1] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=5))-3] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=8))-3] 0.13
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=7))-5] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testerperson) (testUserAccountControl:1.2.840.113556.1.4.804:=7))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (&(testUserAccountControl:1.2.840.113556.1.4.803:=98536)(testUserAccountControl:1.2.840.113556.1.4.803:=912)))-0] 0.15
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (&(testUserAccountControl:1.2.840.113556.1.4.804:=87)(testUserAccountControl:1.2.840.113556.1.4.804:=91)))-8] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (&(testUserAccountControl:1.2.840.113556.1.4.803:=89536)(testUserAccountControl:1.2.840.113556.1.4.804:=79)))-1] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (|(testUserAccountControl:1.2.840.113556.1.4.803:=89536)(testUserAccountControl:1.2.840.113556.1.4.804:=79)))-8] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (|(testUserAccountControl:1.2.840.113556.1.4.803:=89)(testUserAccountControl:1.2.840.113556.1.4.803:=536)))-0] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=x))-13] 0.15
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=&\*#$%))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=-65536))-0] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=-1))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=-))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=\*))-13] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.804:=\*))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=6552))-0] 0.01
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson\))(testUserAccountControl:1.2.840.113556.1.4.804:=6552))-0] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_all_together[(& (objectclass=testperson) (testUserAccountControl:1.2.840.113556.1.4.803:=65536))-5] 0.00
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries 0.22
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries1 0.21
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries3 0.02
No log output captured.
Passed suites/filter/bitw_filter_test.py::test_5_entries4 0.02
No log output captured.
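The matching rules exercised throughout this bitwise-filter block are the standard OIDs 1.2.840.113556.1.4.803 (all listed bits must be set) and 1.2.840.113556.1.4.804 (any listed bit may be set). A sketch of issuing two such filters with python-ldap, with hypothetical connection details:

import ldap

conn = ldap.initialize('ldap://localhost:389')   # hypothetical instance
conn.simple_bind_s('cn=Directory Manager', 'password')

# AND rule: entries whose testUserAccountControl has every bit of 513 set.
and_filter = '(&(objectclass=testperson)(testUserAccountControl:1.2.840.113556.1.4.803:=513))'
# OR rule: entries whose testUserAccountControl has any bit of 5 set.
or_filter = '(&(objectclass=testperson)(testUserAccountControl:1.2.840.113556.1.4.804:=5))'

for flt in (and_filter, or_filter):
    entries = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, flt, ['uid'])
    print(flt, '->', len(entries), 'entries')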
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(sn=last1)(givenname=first1))-1] 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(sn=last1)(givenname=first1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(sn=last1)(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(&(sn=last1)(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last1))(&(givenname=first1))))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(&(&(sn=last1))(&(givenname=first1))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(sn=last3)(givenname=*))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=*)(sn=last3)(givenname=*))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(&(sn=last3)(givenname=*)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=*)(&(sn=last3)(givenname=*)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid5)(&(&(sn=*))(&(givenname=*))))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid5)(&(&(sn=*))(&(givenname=*))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(objectclass=*)(uid=*)(sn=last*))-5] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(objectclass=*)(uid=*)(sn=last*))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(objectclass=*)(uid=*)(sn=last1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(objectclass=*)(uid=*)(sn=last1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=uid1)(sn=last1)(givenname=first1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=uid1)(sn=last1)(givenname=first1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=uid1)(|(sn=last1)(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=uid1)(|(sn=last1)(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=uid1)(|(|(sn=last1))(|(givenname=first1))))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=uid1)(|(|(sn=last1))(|(givenname=first1))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(objectclass=*)(sn=last1)(|(givenname=first1)))-18] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(objectclass=*)(sn=last1)(|(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(objectclass=*)(sn=last1))(|(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(&(objectclass=*)(sn=last1))(|(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(objectclass=*)(sn=last))(|(givenname=first1)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(&(objectclass=*)(sn=last))(|(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(!(cn=NULL)))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(!(cn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(!(cn=NULL))(uid=uid1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(!(cn=NULL))(uid=uid1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(&(!(uid=1))(!(givenname=first1))))-5] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=*)(&(!(uid=1))(!(givenname=first1))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(|(uid=uid1)(uid=NULL))(sn=last1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(|(uid=uid1)(uid=NULL))(sn=last1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(|(uid=uid1)(uid=NULL))(!(sn=NULL)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(|(uid=uid1)(uid=NULL))(!(sn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(|(uid=uid1)(sn=last2))(givenname=first1))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(|(uid=uid1)(sn=last2))(givenname=first1))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(uid=uid1)(!(uid=NULL)))(sn=last2))-2] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(&(uid=uid1)(!(uid=NULL)))(sn=last2))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(uid=uid1)(uid=NULL))(sn=last2))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(&(uid=uid1)(uid=NULL))(sn=last2))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid5)(sn=*)(cn=*)(givenname=*)(uid=u*)(sn=la*)(cn=full*)(givenname=f*)(uid>=u)(!(givenname=NULL)))-1] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid5)(sn=*)(cn=*)(givenname=*)(uid=u*)(sn=la*)(cn=full*)(givenname=f*)(uid>=u)(!(givenname=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(objectclass=*)(sn=last))(&(givenname=first1)))-1] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(&(objectclass=*)(sn=last))(&(givenname=first1)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(sn=last1)(givenname=NULL))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(sn=last1)(givenname=NULL))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(sn=last1)(givenname=NULL)))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(&(sn=last1)(givenname=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL))))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL)(sn=*)))(|(sn=NULL)))-0] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(&(&(sn=last1))(&(givenname=NULL)(sn=*)))(|(sn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=uid1)(&(&(sn=last*))(&(givenname=first*)))(&(sn=NULL)))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=uid1)(&(&(sn=last*))(&(givenname=first*)))(&(sn=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=NULL)(sn=NULL)(givenname=NULL))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=NULL)(sn=NULL)(givenname=NULL))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=NULL)(|(sn=NULL)(givenname=NULL)))-0] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=NULL)(|(sn=NULL)(givenname=NULL)))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=NULL)(|(|(sn=NULL))(|(givenname=NULL))))-0] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=NULL)(|(|(sn=NULL))(|(givenname=NULL))))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*))-6] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*)(uid=*))"...
Passed suites/filter/complex_filters_test.py::test_filters[(uid>=uid3)-3] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(uid>=uid3)"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid=*)(uid>=uid3))-3] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid=*)(uid>=uid3))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(uid>=uid3)(uid<=uid5))-6] 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(uid>=uid3)(uid<=uid5))"...
Passed suites/filter/complex_filters_test.py::test_filters[(&(uid>=uid3)(uid<=uid5))-3] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(&(uid>=uid3)(uid<=uid5))"...
Passed suites/filter/complex_filters_test.py::test_filters[(|(&(uid>=uid3)(uid<=uid5))(uid=*))-6] 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.complex_filters_test:complex_filters_test.py:130 Testing filter "(|(&(uid>=uid3)(uid<=uid5))(uid=*))"...
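Each parametrized test id in this block pairs a filter with a trailing number that appears to be the expected entry count, e.g. (&(uid>=uid3)(uid<=uid5)) matching 3 entries. A sketch of that count assertion, with hypothetical connection details:

import ldap

conn = ldap.initialize('ldap://localhost:389')   # hypothetical instance
conn.simple_bind_s('cn=Directory Manager', 'password')

cases = [
    ('(&(uid>=uid3)(uid<=uid5))', 3),   # AND of the two range filters
    ('(|(uid>=uid3)(uid<=uid5))', 6),   # OR spans the whole uid range
]
for flt, expected in cases:
    entries = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, flt)
    assert len(entries) == expected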
Passed suites/filter/filter_cert_test.py::test_positive 26.38
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index0] 0.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index2] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index3] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index4] 0.02
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index5] 0.26
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index7] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index8] 0.02
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index9] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index10] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index11] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index12] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index13] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index14] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index15] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index16] 0.02
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_valid_invalid_attributes[index17] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod0] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod1] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod2] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod3] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod4] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod5] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod6] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod7] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod8] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod9] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod10] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod11] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod12] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod13] 0.06
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod14] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod15] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod16] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods[mod17] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode0] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode1] 0.05
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode2] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode3] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode4] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode5] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode6] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode7] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode8] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode9] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode10] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode11] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode12] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode13] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode14] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode15] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode16] 0.28
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_replace[mode17] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode0] 0.07
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode1] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode2] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode3] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode4] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode5] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode6] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode7] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode8] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode9] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode10] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode11] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode12] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode13] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode14] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode15] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode16] 0.04
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_mods_delete[mode17] 0.03
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_search_positive_negative 0.13
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrbitStringMatch:bitStringMatch:='0001'B)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match:caseExactIA5Match:=Sprain)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch:caseExactOrderingMatch:=ÇélIné Ändrè)-5] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeMatch:=20100218171300Z)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeOrderingMatch:=20100218171300Z)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrbooleanMatch:booleanMatch:=TRUE)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match:caseIgnoreIA5Match:=sprain1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch:caseIgnoreMatch:=ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch:caseIgnoreOrderingMatch:=ÇélIné Ändrè1)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch:caseIgnoreListMatch:=foo1$bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrobjectIdentifierMatch:objectIdentifierMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrdirectoryStringFirstComponentMatch:directoryStringFirstComponentMatch:=ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrobjectIdentifierFirstComponentMatch:objectIdentifierFirstComponentMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.14
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrdistinguishedNameMatch:distinguishedNameMatch:=cn=foo1,cn=bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrintegerMatch:integerMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrintegerMatch:integerOrderingMatch:=-2)-6] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrintegerFirstComponentMatch:integerFirstComponentMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attruniqueMemberMatch:uniqueMemberMatch:=cn=foo1,cn=bar#'0001'B)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-10] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-11] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch:telephoneNumberMatch:=+1 408 555 4798)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attroctetStringMatch:octetStringMatch:=AAAAAAAAAAAAAAE=)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attroctetStringMatch:octetStringOrderingMatch:=AAAAAAAAAAAAAAE=)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=ÇélIné Ändrè*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactMatch=*é Ä*)-5] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=*Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=*Sprain)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseExactIA5Match=*rai*)-3] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=*sprain1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=sprain1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=*sprain1)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreIA5Match=*rai*)-6] 0.15
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=ÇélIné Ändrè1*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreMatch=*é Ä*)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=foo1$bar*)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrcaseIgnoreListMatch=*1$b*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=*00001*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=00001*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=*00001)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrnumericStringMatch=*000*)-6] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=+1 408 555 4798*)-1] 0.00
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798)-1] 0.01
No log output captured.
Passed suites/filter/filter_index_match_test.py::test_do_extensible_search[(attrtelephoneNumberMatch=* 55*)-6] 0.01
No log output captured.
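The "(attr...:rule:=value)" ids above use RFC 4515 extensible-match syntax: the token between the colons names the matching rule the server applies. A sketch under the same assumptions as before (hypothetical connection, populated test entries):

import ldap

conn = ldap.initialize('ldap://localhost:389')  # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')

# Equality rule: exactly one entry carries attrintegerMatch = -2.
eq = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                   '(attrintegerMatch:integerMatch:=-2)')
# Ordering rule: the same assertion value matched through
# integerOrderingMatch selects a range of entries, which is why the
# parametrized id above expects 6 results instead of 1.
ord_res = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                        '(attrintegerMatch:integerOrderingMatch:=-2)')
assert len(eq) == 1 and len(ord_res) == 6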
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=nothing1)(ou=people))(|(ou=nothing2)(ou=nothing3)))] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=people)(ou=nothing1))(|(ou=nothing2)(ou=nothing3)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=nothing1)(ou=nothing2))(|(ou=people)(ou=nothing3)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(|(|(ou=nothing1)(ou=nothing2))(|(ou=nothing3)(ou=people)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(&(sn<=0000000000000000)(givenname>=FFFFFFFFFFFFFFFF))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(&(sn>=0000000000000000)(sn<=1111111111111111))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_positive[(&(sn>=0000000000000000)(givenname<=FFFFFFFFFFFFFFFF))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_schema 3.58
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(uidNumber=18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(gidNumber=18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(MYINTATTR=18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(uidNumber=*)(!(uidNumber=18446744073709551617)))] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(gidNumber=*)(!(gidNumber=18446744073709551617)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(uidNumber=*)(!(gidNumber=18446744073709551617)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(&(myintattr=*)(!(myintattr=18446744073709551617)))] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(uidNumber>=-18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(gidNumber>=-18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(uidNumber<=18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(gidNumber<=18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing[(myintattr<=18446744073709551617)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(gidNumber=54321)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(uidNumber=54321)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(myintattr=54321)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(gidNumber<=-999999999999999999999999999999)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(uidNumber<=-999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(myintattr<=-999999999999999999999999999999)] 0.01
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(gidNumber>=999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(uidNumber>=999999999999999999999999999999)] 0.00
No log output captured.
Passed suites/filter/filter_indexing_test.py::test_indexing_negative[(myintattr>=999999999999999999999999999999)] 0.01
No log output captured.
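The values in these indexing ids are deliberately outside the 64-bit range (18446744073709551617 is 2**64 + 1), so the cases verify that indexed integer lookups neither overflow nor drop entries. A rough consistency check, again with a hypothetical connection:

import ldap

conn = ldap.initialize('ldap://localhost:389')  # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')

huge = 2**64 + 1  # 18446744073709551617, as in the ids above
bounded = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                        '(uidNumber<=%d)' % huge)
present = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                        '(uidNumber=*)')
# Every entry with a uidNumber fits under the out-of-range bound.
assert len(bounded) == len(present)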
Passed suites/filter/filter_logic_test.py::test_eq 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_logic_test.py::test_sub 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_not_eq 0.00
No log output captured.
Passed suites/filter/filter_logic_test.py::test_ranges 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_eq 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_range 0.00
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_allid_shortcut 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_eq 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_not_eq 0.14
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_not_eq 0.00
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_range 0.03
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_range 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_and_eq 0.01
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_or_eq 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_and_or_eq 0.02
No log output captured.
Passed suites/filter/filter_logic_test.py::test_or_and_eq 0.01
No log output captured.
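filter_logic_test walks the boolean connectives (AND, OR, NOT, ranges, and their nestings). One property implied by these combinations, sketched against a hypothetical instance, is that logically equivalent filters must return identical result sets:

import ldap

conn = ldap.initialize('ldap://localhost:389')  # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')

def dns(filt):
    # Collect the sorted DNs matched by a filter.
    return sorted(dn for dn, _ in conn.search_s(
        'dc=example,dc=com', ldap.SCOPE_SUBTREE, filt, ['dn']))

# De Morgan: NOT (a OR b) must equal (NOT a) AND (NOT b).
assert dns('(!(|(uid=user1)(uid=user2)))') == \
       dns('(&(!(uid=user1))(!(uid=user2)))')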
Passed suites/filter/filter_match_test.py::test_matching_rules 0.12
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_match_test.py::test_add_attribute_types 1.32
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule0] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule1] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule2] 0.26
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule3] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule4] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule5] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule6] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule7] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule8] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule9] 0.26
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule10] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule11] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule12] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule13] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule14] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule15] 0.26
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule16] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_attributes[rule17] 0.02
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode0] 0.04
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode1] 0.04
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode2] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode3] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode4] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode5] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode6] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode7] 0.27
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode8] 0.04
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode9] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode10] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode11] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode12] 0.04
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode13] 0.27
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode14] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode15] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode16] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_modes[mode17] 0.03
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode0] 0.20
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode1] 0.07
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode2] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode3] 0.30
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode4] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode5] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode6] 0.29
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode7] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode8] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode9] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode10] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode11] 0.07
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode12] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode13] 0.30
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode14] 0.29
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode15] 0.05
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode16] 0.06
No log output captured.
Passed suites/filter/filter_match_test.py::test_valid_invalid_mode_replace[mode17] 0.07
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrbitStringMatch='0001'B)-1-(attrbitStringMatch:bitStringMatch:='000100000'B)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrgeneralizedTimeMatch=20100218171300Z)-1-(attrcaseExactIA5Match=SPRAIN)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseExactMatch>=ÇélIné Ändrè)-5-(attrcaseExactMatch=ÇéLINé ÄNDRè)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)-1-(attrcaseExactMatch>=çéliné ändrè)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseExactIA5Match=Sprain)-1-(attrgeneralizedTimeMatch=20300218171300Z)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrbooleanMatch=TRUE)-1-(attrgeneralizedTimeMatch>=20300218171300Z)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreIA5Match=sprain1)-1-(attrcaseIgnoreIA5Match=sprain9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreMatch=ÇélIné Ändrè1)-1-(attrcaseIgnoreMatch=ÇélIné Ändrè9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreMatch>=ÇélIné Ändrè1)-6-(attrcaseIgnoreMatch>=ÇélIné Ändrè9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrcaseIgnoreListMatch=foo1$bar)-1-(attrcaseIgnoreListMatch=foo1$bar$baz$biff)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrobjectIdentifierMatch=1.3.6.1.4.1.1466.115.121.1.15)-1-(attrobjectIdentifierMatch=1.3.6.1.4.1.1466.115.121.1.15.99999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrgeneralizedTimeMatch>=20100218171300Z)-6-(attroctetStringMatch>=AAAAAAAAAAABAQQ=)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrdirectoryStringFirstComponentMatch=ÇélIné Ändrè1)-1-(attrdirectoryStringFirstComponentMatch=ÇélIné Ändrè9999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrobjectIdentifierFirstComponentMatch=1.3.6.1.4.1.1466.115.121.1.15)-1-(attrobjectIdentifierFirstComponentMatch=1.3.6.1.4.1.1466.115.121.1.15.99999)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrdistinguishedNameMatch=cn=foo1,cn=bar)-1-(attrdistinguishedNameMatch=cn=foo1,cn=bar,cn=baz)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrintegerMatch=-2)-1-(attrintegerMatch=-20)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrintegerMatch>=-2)-6-(attrintegerMatch>=20)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrintegerFirstComponentMatch=-2)-1-(attrintegerFirstComponentMatch=-20)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attruniqueMemberMatch=cn=foo1,cn=bar#'0001'B)-1-(attruniqueMemberMatch=cn=foo1,cn=bar#'00010000'B)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrnumericStringMatch=00001)-1-(attrnumericStringMatch=000000001)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrnumericStringMatch>=00001)-6-(attrnumericStringMatch>=01)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attrtelephoneNumberMatch=+1 408 555 4798)-1-(attrtelephoneNumberMatch=+2 408 555 4798)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attroctetStringMatch=AAAAAAAAAAAAAAE=)-1-(attroctetStringMatch=AAAAAAAAAAAAAAEB)] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_match_count[(attroctetStringMatch>=AAAAAAAAAAAAAAE=)-6-(attroctetStringMatch>=AAAAAAAAAAABAQE=)] 0.01
No log output captured.
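Each test_match_count id bundles a filter, its expected count, and a second filter built from a near-miss value (a longer bit string, an extra DN component, a different integer) that must match nothing. A sketch of that pairing, same hypothetical connection:

import ldap

conn = ldap.initialize('ldap://localhost:389')  # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')

def count(filt):
    return len(conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE, filt, ['dn']))

# The positive filter matches the expected number of entries, while the
# perturbed value from the same id matches none.
assert count('(attrintegerMatch=-2)') == 1
assert count('(attrintegerMatch=-20)') == 0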
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrbitStringMatch:bitStringMatch:='0001'B)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match:caseExactIA5Match:=Sprain)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch:caseExactMatch:=ÇélIné Ändrè)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch:caseExactOrderingMatch:=ÇélIné Ändrè)-5] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeMatch:=20100218171300Z)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrgeneralizedTimeMatch:generalizedTimeOrderingMatch:=20100218171300Z)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrbooleanMatch:booleanMatch:=TRUE)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match:caseIgnoreIA5Match:=sprain1)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch:caseIgnoreMatch:=ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch:caseIgnoreOrderingMatch:=ÇélIné Ändrè1)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch:caseIgnoreListMatch:=foo1$bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrobjectIdentifierMatch:objectIdentifierMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrdirectoryStringFirstComponentMatch:directoryStringFirstComponentMatch:=ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrobjectIdentifierFirstComponentMatch:objectIdentifierFirstComponentMatch:=1.3.6.1.4.1.1466.115.121.1.15)-1] 0.15
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrdistinguishedNameMatch:distinguishedNameMatch:=cn=foo1,cn=bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrintegerMatch:integerMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrintegerMatch:integerOrderingMatch:=-2)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrintegerFirstComponentMatch:integerFirstComponentMatch:=-2)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attruniqueMemberMatch:uniqueMemberMatch:=cn=foo1,cn=bar#'0001'B)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-10] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch:numericStringMatch:=00001)-11] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch:telephoneNumberMatch:=+1 408 555 4798)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attroctetStringMatch:octetStringMatch:=AAAAAAAAAAAAAAE=)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attroctetStringMatch:octetStringOrderingMatch:=AAAAAAAAAAAAAAE=)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=ÇélIné Ändrè*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=*ÇélIné Ändrè)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactMatch=*é Ä*)-5] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=*Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=Sprain*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=*Sprain)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseExactIA5Match=*rai*)-3] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=*sprain1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=sprain1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=*sprain1)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreIA5Match=*rai*)-6] 0.15
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=ÇélIné Ändrè1*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=*ÇélIné Ändrè1)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreMatch=*é Ä*)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=foo1$bar*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=*foo1$bar)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrcaseIgnoreListMatch=*1$b*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=*00001*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=00001*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=*00001)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrnumericStringMatch=*000*)-6] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798*)-1] 0.01
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=+1 408 555 4798*)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=*+1 408 555 4798)-1] 0.00
No log output captured.
Passed suites/filter/filter_match_test.py::test_extensible_search[(attrtelephoneNumberMatch=* 55*)-6] 0.01
No log output captured.
Passed suites/filter/filter_test.py::test_filter_escaped 0.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:42 Running test_filter_escaped...
INFO  tests.suites.filter.filter_test:filter_test.py:78 test_filter_escaped: PASSED
Passed suites/filter/filter_test.py::test_filter_search_original_attrs 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:95 Running test_filter_search_original_attrs...
INFO  tests.suites.filter.filter_test:filter_test.py:108 test_filter_search_original_attrs: PASSED
Passed suites/filter/filter_test.py::test_filter_scope_one 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:125 Search user using ldapsearch with scope one
INFO  tests.suites.filter.filter_test:filter_test.py:127 [dn: ou=services,dc=example,dc=com ou: services ]
INFO  tests.suites.filter.filter_test:filter_test.py:129 Search should only have one entry
Passed suites/filter/filter_test.py::test_filter_with_attribute_subtype 0.21
-------------------------------Captured log call--------------------------------
INFO  lib389:filter_test.py:154 Bind as cn=Directory Manager
INFO  lib389:filter_test.py:161 ######################### ADD ######################
INFO  lib389:filter_test.py:184 Try to add Add cn=test_entry both, dc=example,dc=com: dn: cn=test_entry both, dc=example,dc=com cn: test_entry both cn;en: test_entry en cn;fr: test_entry fr objectclass: top objectclass: person sn: test_entry both
INFO  lib389:filter_test.py:187 Try to add Add cn=test_entry en only, dc=example,dc=com: dn: cn=test_entry en only, dc=example,dc=com cn: test_entry en only cn;en: test_entry en objectclass: top objectclass: person sn: test_entry en only
INFO  lib389:filter_test.py:190 ######################### SEARCH ######################
INFO  lib389:filter_test.py:194 Try to search with filter (&(sn=test_entry en only)(!(cn=test_entry fr)))
INFO  lib389:filter_test.py:198 Found cn=test_entry en only,dc=example,dc=com
INFO  lib389:filter_test.py:202 Try to search with filter (&(sn=test_entry en only)(!(cn;fr=test_entry fr)))
INFO  lib389:filter_test.py:206 Found cn=test_entry en only,dc=example,dc=com
INFO  lib389:filter_test.py:210 Try to search with filter (&(sn=test_entry en only)(!(cn;en=test_entry en)))
INFO  lib389:filter_test.py:213 Found none
INFO  lib389:filter_test.py:215 ######################### DELETE ######################
INFO  lib389:filter_test.py:217 Try to delete cn=test_entry both, dc=example,dc=com
INFO  lib389:filter_test.py:220 Try to delete cn=test_entry en only, dc=example,dc=com
INFO  tests.suites.filter.filter_test:filter_test.py:223 Testcase PASSED
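The captured log above walks through attribute subtypes: values tagged cn;en or cn;fr are still found by a plain (cn=...) filter, while a tagged filter only sees values carrying that tag. A sketch of the same behaviour in python-ldap (connection details hypothetical):

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:389')  # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')

entry = {
    'objectClass': [b'top', b'person'],
    'cn': [b'test_entry both'],
    'cn;en': [b'test_entry en'],
    'cn;fr': [b'test_entry fr'],
    'sn': [b'test_entry both'],
}
conn.add_s('cn=test_entry both,dc=example,dc=com',
           ldap.modlist.addModlist(entry))

# The untagged filter matches the cn;fr value...
assert conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                     '(cn=test_entry fr)')
# ...but a filter tagged ;en does not see values tagged ;fr.
assert not conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                         '(cn;en=test_entry fr)')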
Passed suites/filter/filter_test.py::test_extended_search 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.filter.filter_test:filter_test.py:250 Running test_filter_escaped...
INFO  lib389:filter_test.py:267 Try to search with filter (cn:de:=ext-test-entry)
INFO  lib389:filter_test.py:273 Try to search with filter (cn:caseIgnoreIA5Match:=EXT-TEST-ENTRY)
INFO  lib389:filter_test.py:279 Try to search with filter (cn:caseIgnoreMatch:=EXT-TEST-ENTRY)
INFO  lib389:filter_test.py:285 Try to search with filter (cn:caseExactMatch:=EXT-TEST-ENTRY)
INFO  lib389:filter_test.py:291 Try to search with filter (cn:caseExactMatch:=ext-test-entry)
INFO  lib389:filter_test.py:297 Try to search with filter (cn:caseExactIA5Match:=EXT-TEST-ENTRY)
INFO  lib389:filter_test.py:303 Try to search with filter (cn:caseExactIA5Match:=ext-test-entry)
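test_extended_search exercises the same extensible-match syntax with a locale shorthand ("de") and with named matching rules in place of OIDs. A compact sketch, hypothetical connection as before:

import ldap

conn = ldap.initialize('ldap://localhost:389')  # hypothetical
conn.simple_bind_s('cn=Directory Manager', 'password')

# Both rules are tried with an upper-case assertion value; only the
# case-insensitive rule should match if the entry is stored as
# lower-case ext-test-entry (an assumption based on the filters above).
ci = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                   '(cn:caseIgnoreMatch:=EXT-TEST-ENTRY)')
ce = conn.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                   '(cn:caseExactMatch:=EXT-TEST-ENTRY)')
assert ci and not ce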
Passed suites/filter/filter_with_non_root_user_test.py::test_telephone[(telephonenumber=*7393)] 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filter_with_non_root_user_test.py::test_telephone[(telephonenumber=*408*3)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=mward)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=sunnyvale)0] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(mail=jreu*)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(mail=*exam*)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=*)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1:=>AAA)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:es:=>AAA)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.5:=AAA)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1:=>user100)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:es:=>user100)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.5:=user100)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.1:=user1)] 0.28
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.15.1.1:=z)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=user1)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid<=Z)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid>=1)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid>=A)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid>=user20)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.2:=user20)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.2:=z)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1:=>=A)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:fr:=>=A)] 0.35
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.4:=A)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1:=>=user20)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:fr:=>=user20)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1.4:=user20)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:2.16.840.1.113730.3.3.2.18.1:=>=z)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid:fr:=>=z)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(description=This is the special * attribute value)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(description=*x*)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=ptyler)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=*wal*)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=0312)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(uid=mw*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=2295)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=Cupertino)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(objectclass=inetorgperson)] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=sunnyvale)1] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=200)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=201)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(roomNumber=202)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(l=*)] 0.29
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(sn~=tiller))(!(uid=ptyler)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(sn~=tiller)) (uid=ptyler))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*) (roomNumber=0312))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=*wal*))(!(roomNumber=0312)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=*wal*))(roomNumber=0312))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*)(!(roomNumber=0312)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(uid=*wal*)(|(sn~=tiller) (roomNumber=2295)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(&(uid=*wal*) (roomNumber=2295))(&(uid=*wal*) (sn~=tiller)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))0] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(|(uid=*wal*) (sn~=tiller))(|(uid=*wal*) (roomNumber=2295)))0] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(uid=*wal*) (roomNumber=2295))0] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(roomNumber=2295) (uid=*wal*))0] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(roomNumber=2295) (uid=*wal*))0] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))1] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*) (roomNumber=2295))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(uid=*wal*) (l=*))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=Cupertino) (|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=Cupertino)(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino) (|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295))))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino)(!(|(uid=*wal*)(&(sn~=tiller) (roomNumber=2295)))))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(uid=user1))(objectclass=inetorgperson))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=user1))(objectclass=inetorgperson))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(mail=cnewport@example.com))(l=sunnyvale))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(|(uid=*wal*) (sn~=tiller))(|(uid=*wal*) (roomNumber=2295)))1] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(uid=*wal*) (roomNumber=2295))1] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(roomNumber=2295) (uid=*wal*))1] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(roomNumber=2295) (uid=*wal*))1] 0.13
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(roomNumber=200))(!(roomNumber=201))(!(roomNumber=202))(l=sunnyvale))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(roomNumber=2254))(&(!(roomNumber=200))(!(roomNumber=201))(!(roomNumber=202))(l=sunnyvale)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(roomNumber=2254))(&(!(roomNumber=200))(!(roomNumber=201))(!(roomNumber=202))(l=sunnyvale)))] 0.06
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.06
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(& (objectclass=inetorgperson)(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.06
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(| (objectclass=inetorgperson)(!(uid=user1))(!(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20))(!(uid=user30)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=sunnyvale)(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=sunnyvale)(!(roomNumber=4012))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(!(|(!(l=*))(!(l=sunnyvale))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=*))(!(l=sunnyvale)))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(!(l=Cupertino))(mail=*exam*) (|(uid=*wal*) (l=*)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino) (mail=*exam*) (|(uid=*wal*) (l=*)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(!(l=Cupertino))(mail=*exam*) (|(uid=*wal*) (l=*)))] 0.05
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino)(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino)(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(|(l=Cupertino) (mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.04
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_positive[(&(l=Cupertino)(!(mail=*exam*))(|(uid=*wal*) (l=*)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(userpassword=*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(fred=*)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1:=<1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:es:=<1)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1.1:=1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1:=<user1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:es:=<user1)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.15.1:=<z)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:es:=<z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid<=1)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid<=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid>=Z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1:=<=A)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:fr:=<=A)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1.2:=A)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:fr:=<=user20)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1:=<=z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:fr:=<=z)] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid:2.16.840.1.113730.3.3.2.18.1.4:=z)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(sn~=tiller)] 0.27
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(givenName~=pricella)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(mail=cnewport@example.com)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid=user20)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid=user30)] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(uid=user40)] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(sn~=tiller) (givenName~=pricella))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(sn~=tiller)(!(uid=ptyler)))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(!(l=Cupertino))(mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.02
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(l=Cupertino)(!(mail=*exam*))(!(|(uid=*wal*) (l=*))))] 0.03
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(l=Cupertino) (mail=*exam*)(!(|(uid=*wal*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/filter_with_non_root_user_test.py::test_all_negative[(&(l=Cupertino) (mail=*exam*) (|(uid=*wal*) (l=*)))] 0.02
No log output captured.
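The test_all_negative cases above lean heavily on extensible-match syntax: (attr:OID:=value) selects an explicit matching rule, the 2.16.840.1.113730.3.3.2.* OIDs name 389-ds collation (ordering) rules, and forms such as uid:fr:= are the locale-tag shorthand (the paired cases above suggest the .18.1 OID corresponds to the fr collation). A minimal sketch of issuing such filters with python-ldap, assuming a local instance on port 38901 and placeholder bind credentials:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("uid=user1,ou=people,dc=example,dc=com", "password")

# ':2.16.840.1.113730.3.3.2.18.1:' selects a collation ordering rule by OID;
# ':fr:' expresses the same intent via a locale tag.
for flt in ("(uid:2.16.840.1.113730.3.3.2.18.1:=<=user20)", "(uid:fr:=<=user20)"):
    entries = conn.search_s("dc=example,dc=com", ldap.SCOPE_SUBTREE, flt, ["uid"])
    print(flt, "->", len(entries), "entries")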
Passed suites/filter/filterscanlimit_test.py::test_invalid_configuration 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/filterscanlimit_test.py::test_idlistscanlimit 9.67
No log output captured.
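filterscanlimit_test drives the backend's nsslapd-idlistscanlimit setting, which caps how many candidate entry IDs an index lookup may return before the server treats that filter term as unindexed. A minimal lib389 sketch of adjusting it, assuming a DirSrv handle named standalone (a placeholder) and the default userRoot backend:

from lib389.backend import Backends

# Cap index ID lists at 100 candidates; terms whose ID lists exceed the
# limit fall back to an unindexed evaluation for that search.
backend = Backends(standalone).get("userRoot")
backend.replace("nsslapd-idlistscanlimit", "100")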
Passed suites/filter/large_filter_test.py::test_large_filter[(&(objectClass=person)(|(manager=uid=fmcdonnagh,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_0,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_1,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_2,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_3,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_4,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_5,dc=anuj,dc=com)(manager=uid=jvedder, dc=anuj, dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_6,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_7,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_8,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_9,dc=anuj,dc=com)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_10,dc=anuj,dc=com)(manager=uid=cnewport, dc=anuj, dc=com)))] 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/large_filter_test.py::test_large_filter[(&(objectClass=person)(|(manager=uid=fmcdonnagh *)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_0,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_1,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_2,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_3,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_4,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_5,*)(manager=uid=jvedder,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_6,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_7,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_8,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_9,*)(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_10,*)(manager=uid=cnewport,*)))] 0.27
No log output captured.
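large_filter_test feeds the server an OR of a dozen long manager DNs to stress filter parsing. Filters of this size are normally generated rather than hand-written; a self-contained sketch that mirrors the parametrized values above:

# Build the stress filter programmatically (values echo the test's
# placeholder DNs under dc=anuj,dc=com).
terms = ["(manager=uid=fmcdonnagh,dc=anuj,dc=com)"]
terms += [
    "(manager=cn=no_such_entry_with_a_really_long_dn_component_to_stress_the_filter_handling_code_%d,dc=anuj,dc=com)" % i
    for i in range(11)
]
terms += ["(manager=uid=jvedder,dc=anuj,dc=com)", "(manager=uid=cnewport,dc=anuj,dc=com)"]
large_filter = "(&(objectClass=person)(|%s))" % "".join(terms)
print(len(large_filter), "bytes")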
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_supported_features 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-False-oper_attr_list0-] 0.19
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-False-oper_attr_list0-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-False-oper_attr_list0-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-True-oper_attr_list1-] 0.04
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-True-oper_attr_list1-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[-True-oper_attr_list1-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-False-oper_attr_list2-] 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-False-oper_attr_list2-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-False-oper_attr_list2-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-True-oper_attr_list3-] 0.04
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-True-oper_attr_list3-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[ou=people,dc=example,dc=com-True-oper_attr_list3-objectClass] 0.25
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-False-oper_attr_list4-] 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-False-oper_attr_list4-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-False-oper_attr_list4-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-True-oper_attr_list5-] 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-True-oper_attr_list5-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[uid=all_attrs_test,ou=people,dc=example,dc=com-True-oper_attr_list5-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:158 bound as: uid=all_attrs_test,ou=people,dc=example,dc=com
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[cn=config-False-oper_attr_list6-] 0.04
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[cn=config-False-oper_attr_list6-*] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
Passed suites/filter/rfc3673_all_oper_attrs_test.py::test_search_basic[cn=config-False-oper_attr_list6-objectClass] 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:rfc3673_all_oper_attrs_test.py:161 bound as: cn=Directory Manager
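rfc3673_all_oper_attrs_test exercises the '+' attribute selector from RFC 3673: operational attributes are omitted from search results unless requested, and '+' asks for all of them at once. A minimal python-ldap sketch, assuming a local instance and placeholder credentials:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")

# '+' (RFC 3673) requests every operational attribute; adding '*' keeps the
# ordinary user attributes in the result as well.
dn, attrs = conn.search_s("dc=example,dc=com", ldap.SCOPE_BASE,
                          "(objectClass=*)", ["*", "+"])[0]
print(sorted(attrs))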
Passed suites/filter/schema_validation_test.py::test_filter_validation_config 0.58
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/schema_validation_test.py::test_filter_validation_enabled 3.88
No log output captured.
Passed suites/filter/schema_validation_test.py::test_filter_validation_warn_safe 0.40
No log output captured.
Passed suites/filter/schema_validation_test.py::test_filter_validation_warn_unsafe 0.52
No log output captured.
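schema_validation_test toggles the server's filter schema checking, which can reject or merely log filters that reference attributes not defined in the schema (the warn_safe/warn_unsafe cases cover filters that can and cannot still be evaluated safely). A minimal lib389 sketch, assuming the cn=config attribute nsslapd-verify-filter-schema; the value names are taken from the 389-ds documentation and should be verified against your version:

# 'standalone' is a placeholder DirSrv handle; value names are assumptions
# from the nsslapd-verify-filter-schema documentation.
standalone.config.replace("nsslapd-verify-filter-schema", "warn-invalid")    # log only
standalone.config.replace("nsslapd-verify-filter-schema", "reject-invalid")  # hard error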
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition0-cn] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition1-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition2-cn] 0.02
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition3-cn] 0.26
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition4-modifiersName] 0.25
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition5-modifyTimestamp] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition6-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition7-modifyTimestamp] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition8-modifiersName] 0.02
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition9-modifyTimestamp] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition10-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition11-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition12-modifiersName] 0.25
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition13-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition14-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition15-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))-condition16-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition17-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition18-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition19-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition20-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(uid=rjense2)-condition21-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(nsRoleDN=cn=new managed *)-condition22-cn] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(nsRoleDN=cn=new managed *)-condition23-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition24-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition25-nsRoleDN] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition26-mailquota] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition27-modifiersName] 0.01
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition28-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition29-nsRoleDN] 0.00
No log output captured.
Passed suites/filter/vfilter_attribute_test.py::test_all_together_positive[(& (nsRoleDN=cn=new managed *) (uid=mtyler))-condition30-modifiersName] 0.01
No log output captured.
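vfilter_attribute_test asserts on attributes such as modifiersName, modifyTimestamp, and nsRoleDN. The first two are operational and nsRole is computed from role definitions, so none of them appear in results unless requested by name. A sketch with python-ldap, with the same placeholder connection details as before:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")

# Operational and computed attributes must be listed explicitly.
results = conn.search_s(
    "dc=example,dc=com", ldap.SCOPE_SUBTREE,
    "(& (| (nsRoleDN=cn=new managed role) (sn=Hall)) (l=sunnyvale))",
    ["cn", "modifiersName", "modifyTimestamp", "nsRole"])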
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalCode=99999)] 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalAddress=345 California Av., Mountain View, CA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalCode:2.16.840.1.113730.3.3.2.7.1:=88888)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalCode:2.16.840.1.113730.3.3.2.7.1.3:=66666)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass=vpe*)] 0.14
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass=*emai*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota=*00)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota=*6*0)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(nsRole=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(postalAddress=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:2.16.840.1.113730.3.3.2.15.1:=>AAA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:es:=>AAA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:2.16.840.1.113730.3.3.2.15.1.5:=AAA)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:2.16.840.1.113730.3.3.2.15.1:=>vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(emailclass:es:=>vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.15.1.1:=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota<=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota>=600)] 0.15
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(nsRole~=cn=new)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(uid=*wal*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(uid=mw*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(roomNumber=0312)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(l=Cupertino)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(uid=user1)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(objectclass=inetorgperson)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(l=sunnyvale)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(roomNumber=3924)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(l=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(objectclass=*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota<=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota>=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.2:=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.2:=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1:=>=900)] 0.14
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:fr:=>=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1:=>=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:fr:=>=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.4:=600)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1:=>=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:fr:=>=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(mailquota:2.16.840.1.113730.3.3.2.18.1.4:=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole~=cn=new managed))(!(nsRole=cn=new vaddr filtered role,dc=example,dc=com)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*) (nsRole=cn=*another*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=*wal*))(!(nsRole=cn=*another*)))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=*wal*))(nsRole=cn=*another*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*)(!(nsRole=cn=*another*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(uid=*wal*)(|(nsRole~=cn=new managed) (l=Cupertino)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(&(uid=*wal*) (l=Cupertino))(&(uid=*wal*) (nsRole~=cn=new managed)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*)(&(nsRole~=cn=new managed) (l=Cupertino)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(|(uid=*wal*) (nsRole~=cn=new managed))(|(uid=*wal*) (l=Cupertino)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(nsRole=cn=*vaddr*) (uid=*wal*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))] 0.14
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(uid=*wal*) (nsRole=cn=*vaddr*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(nsRole=cn=*vaddr*) (l=*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino) (|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino)(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino) (|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino)(!(|(uid=*wal*)(&(nsRole~=cn=new managed) (nsRole=cn=*vaddr*)))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user1))(objectclass=inetorgperson))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user1))(objectclass=inetorgperson))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole=cn=*vaddr*))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(nsRole=cn=*vaddr*))(l=sunnyvale))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson))] 0.14
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user40))(&(!(uid=user1))(!(uid=user20))(!(uid=user30))(objectclass=inetorgperson)))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(roomNumber=2254))(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(roomNumber=2254))(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508))(l=sunnyvale)))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.12
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(& (objectclass=inetorgperson)(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(| (objectclass=inetorgperson)(!(uid=user1))(!(uid=user20))(!(uid=user30)))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=sunnyvale)(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=sunnyvale)(!(nsRole=cn=another vaddr role,dc=example,dc=com))(!(roomNumber=3924))(!(roomNumber=4508)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(!(|(!(l=*))(!(l=sunnyvale))))] 0.14
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=*))(!(l=sunnyvale)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino) (emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(!(l=Cupertino))(emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(&(l=Cupertino)(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino) (emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.15
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(!(l=Cupertino))(emailclass=*emai*) (|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino)(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.02
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino)(!(emailclass=*emai*))(|(nsRole=cn=*vaddr*) (l=*)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_positive[(|(l=Cupertino) (emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(postalCode:de:==77777)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(fred=*)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1.5:=vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1:=<1)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:es:=<1)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1.1:=1)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1:=<vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:es:=<vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(emailclass:2.16.840.1.113730.3.3.2.15.1.1:=vpemail)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.15.1:=<900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:es:=<900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota<=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota>=900)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole~=cn=new managed)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=new vaddr filtered role,dc=example,dc=com)] 0.16
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=*another*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=*vaddr*)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(uid=user20)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(uid=user30)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(nsRole=cn=another vaddr role,dc=example,dc=com)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(roomNumber=4508)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(uid=user40)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(roomNumber=2254)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1:=<=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:fr:=<=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1.2:=100)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1:=<=600)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:fr:=<=600)] 0.00
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1:=<=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:fr:=<=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(mailquota:2.16.840.1.113730.3.3.2.18.1.4:=900)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(Description=This is the special \2a attribute value)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(Description=*\2a*)] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(nsRole~=cn=new managed) (nsRole=cn=new vaddr filtered role,dc=example,dc=com))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(!(nsRole~=cn=new managed)) (nsRole=cn=new vaddr filtered role,dc=example,dc=com))] 0.15
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(nsRole~=cn=new managed)(!(nsRole=cn=new vaddr filtered role,dc=example,dc=com)))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(uid=*wal*) (nsRole=cn=*vaddr*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(nsRole=cn=*vaddr*) (uid=*wal*))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(!(l=Cupertino))(emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(l=Cupertino)(!(emailclass=*emai*))(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
Passed suites/filter/vfilter_simple_test.py::test_param_negative[(&(l=Cupertino) (emailclass=*emai*)(!(|(nsRole=cn=*vaddr*) (l=*))))] 0.01
No log output captured.
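Several negative cases above embed \2a, the RFC 4515 escape for a literal '*' inside a filter value; without the escape, the asterisk would turn the equality test into a substring match. python-ldap can produce these escapes, a small sketch:

from ldap.filter import escape_filter_chars

value = "This is the special * attribute value"
# escape_filter_chars maps '*' to '\2a' (and '(', ')', '\', NUL similarly),
# so the value matches literally instead of acting as a wildcard.
flt = "(Description=%s)" % escape_filter_chars(value)
print(flt)  # (Description=This is the special \2a attribute value)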
Passed suites/fourwaymmr/fourwaymmr_test.py::test_verify_trees 3.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master3
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master4
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 812c2126-fa02-4179-901a-63f6062b585a / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7fee9752-b1fb-4b3b-8c8c-6ede9d753b57 / got description=812c2126-fa02-4179-901a-63f6062b585a)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 8056750e-cae9-4e81-aea5-38d1d03584c8 / got description=7fee9752-b1fb-4b3b-8c8c-6ede9d753b57)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6ea44f90-8fdc-4f14-8255-453c5cbd6d4e / got description=8056750e-cae9-4e81-aea5-38d1d03584c8)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  lib389.topologies:topologies.py:156 Joining master master4 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 93977762-9811-4966-a165-851401fd8011 / got description=6ea44f90-8fdc-4f14-8255-453c5cbd6d4e)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 4ec3f389-8aae-4702-ab3e-ca3c93cf8df9 / got description=93977762-9811-4966-a165-851401fd8011)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master4 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master4 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master4 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a92aa87b-b49b-440a-8578-b7007132c492 / got description=4ec3f389-8aae-4702-ab3e-ca3c93cf8df9)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 1b97fc9c-570f-4c97-87ad-bca287b0305f / got description=a92aa87b-b49b-440a-8578-b7007132c492)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 606ffb5f-d1f6-47fd-86f8-d4f745651f09 / got description=1b97fc9c-570f-4c97-87ad-bca287b0305f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
Passed suites/fourwaymmr/fourwaymmr_test.py::test_sync_through_to_all_4_masters 3.04
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 1b74a9ab-e7e7-4284-9f00-e3cfbdf74508 / got description=606ffb5f-d1f6-47fd-86f8-d4f745651f09)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect f60040c2-e4ff-4e07-ad7f-bbb6835e6157 / got description=1b74a9ab-e7e7-4284-9f00-e3cfbdf74508)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 84c71482-6113-4d9b-98c6-f8c20f5a15c2 / got description=f60040c2-e4ff-4e07-ad7f-bbb6835e6157)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
Passed suites/fourwaymmr/fourwaymmr_test.py::test_modify_some_data_in_m3 8.26
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 661aa664-a4e1-4b7b-b5cc-9eaefec8cf4b / got description=84c71482-6113-4d9b-98c6-f8c20f5a15c2) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect fa7d1802-4862-4ae4-8575-0eda06452411 / got description=661aa664-a4e1-4b7b-b5cc-9eaefec8cf4b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect a428524e-bfb3-4511-962b-9d6f693b2e65 / got description=fa7d1802-4862-4ae4-8575-0eda06452411) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
Passed suites/fourwaymmr/fourwaymmr_test.py::test_delete_a_few_entries_in_m4 4.15
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect dc377dde-c6d0-4f6f-811f-f45b22610746 / got description=a428524e-bfb3-4511-962b-9d6f693b2e65) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 44d1ed94-1b79-405a-a443-5368beb47750 / got description=dc377dde-c6d0-4f6f-811f-f45b22610746) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a2f612a4-f55d-4811-8970-f4c431736412 / got description=44d1ed94-1b79-405a-a443-5368beb47750) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 81b77402-b266-46e1-8aa5-0686cd5421db / got description=a2f612a4-f55d-4811-8970-f4c431736412) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
Passed suites/fourwaymmr/fourwaymmr_test.py::test_replicated_multivalued_entries 2.13
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 046a4509-f6a6-4e90-be33-c672da314473 / got description=81b77402-b266-46e1-8aa5-0686cd5421db) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 046a4509-f6a6-4e90-be33-c672da314473 / got description=81b77402-b266-46e1-8aa5-0686cd5421db) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/fourwaymmr/fourwaymmr_test.py::test_bad_replication_agreement 25.53
No log output captured.
Passed suites/fourwaymmr/fourwaymmr_test.py::test_nsds5replicaenabled_verify 69.38
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d55ecdae-e364-4d36-ae5f-7c301ee35f1d / got description=046a4509-f6a6-4e90-be33-c672da314473) [this Retry line repeats once per second until the replication timeout]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d55ecdae-e364-4d36-ae5f-7c301ee35f1d / got description=046a4509-f6a6-4e90-be33-c672da314473)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 76e1d827-84f9-40fe-835b-c22edee82bd7 / got description=046a4509-f6a6-4e90-be33-c672da314473)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 538c6d84-a317-4c9b-a1ba-8a1e9bf26c41 / got description=76e1d827-84f9-40fe-835b-c22edee82bd7) [this Retry line repeats once per second until the replication timeout]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 538c6d84-a317-4c9b-a1ba-8a1e9bf26c41 / got description=76e1d827-84f9-40fe-835b-c22edee82bd7)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6c8fb404-208a-4b41-86b8-bb8424442b5f / got description=76e1d827-84f9-40fe-835b-c22edee82bd7)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e334b394-12e5-48a4-85cf-eb4afa0750e0 / got description=6c8fb404-208a-4b41-86b8-bb8424442b5f) [this Retry line repeats once per second until the replication timeout]
INFO  lib389.replica:replica.py:2500 FAIL: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e334b394-12e5-48a4-85cf-eb4afa0750e0 / got description=6c8fb404-208a-4b41-86b8-bb8424442b5f)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f191d1af-64e2-48b6-bd13-93e36817f292 / got description=6c8fb404-208a-4b41-86b8-bb8424442b5f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
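The FAIL-then-SUCCESS shape above is the point of test_nsds5replicaenabled_verify: while the agreement is disabled the convergence check exhausts its retries, and it passes again once the agreement is re-enabled. A hedged sketch of that toggle; Agreements and replace() are lib389 names, while the master1 handle is an assumption taken from the suite's fixture:

from lib389.agreement import Agreements

# First agreement configured under master1's replica entry.
agmt = Agreements(master1).list()[0]
agmt.replace('nsds5ReplicaEnabled', 'off')  # checks now run out to FAIL
agmt.replace('nsds5ReplicaEnabled', 'on')   # the next check reaches SUCCESS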
Passed suites/fourwaymmr/fourwaymmr_test.py::test_create_an_entry_on_the_supplier 2.99
No log output captured.
Passed suites/fourwaymmr/fourwaymmr_test.py::test_bob_acceptance_tests 7.13
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c9274844-a5e9-4eb3-9a0d-59dcfa90ac06 / got description=f191d1af-64e2-48b6-bd13-93e36817f292) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/fourwaymmr/fourwaymmr_test.py::test_replica_backup_and_restore 31.98
------------------------------Captured stderr call------------------------------
ldiffile: /tmp/output_file [30/Oct/2020:21:25:19.231510459 -0400] - INFO - slapd_exemode_ldif2db - Backend Instance: userRoot
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6439747a-3586-46b5-b312-f1d4d399d279 / got description=c9274844-a5e9-4eb3-9a0d-59dcfa90ac06) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f1c5c0fd-9d9e-40fa-b94d-c868ce69520a / got description=6439747a-3586-46b5-b312-f1d4d399d279) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 83ea0bfb-6a88-414c-baa5-180c10fc863c / got description=f1c5c0fd-9d9e-40fa-b94d-c868ce69520a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b369be82-80d9-44f1-b96a-a2d3779c4ff1 / got description=83ea0bfb-6a88-414c-baa5-180c10fc863c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 25865336-19ca-484c-b1e8-7e7aad6a4d52 / got description=b369be82-80d9-44f1-b96a-a2d3779c4ff1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c4dcd83-4c7a-433c-bdd7-1f70ffce7139 / got description=25865336-19ca-484c-b1e8-7e7aad6a4d52) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
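The stderr line above ("slapd_exemode_ldif2db ... userRoot") is the offline import half of the backup/restore cycle. A hedged sketch of the export/import pair with lib389; db2ldif and ldif2db are real DirSrv methods, but treat the exact argument names here as assumptions:

from lib389._constants import DEFAULT_SUFFIX, DEFAULT_BENAME

def export_then_reimport(inst, ldif_file='/tmp/output_file'):
    inst.stop()  # the exemode tools run against a stopped server
    inst.db2ldif(bename=DEFAULT_BENAME, suffixes=[DEFAULT_SUFFIX],
                 excludeSuffixes=None, encrypt=False, repl_data=True,
                 outputfile=ldif_file)
    inst.ldif2db(bename=DEFAULT_BENAME, suffixes=None, excludeSuffixes=None,
                 encrypt=False, import_file=ldif_file)
    inst.start()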
Passed suites/fractional/fractional_test.py::test_fractional_agreements 3.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39202, 'ldap-secureport': 63902, 'server-id': 'consumer2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8e3ed97f-8c36-4081-a099-a102039b263b / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3215a747-f535-4710-8420-c0a980aa6d4c / got description=8e3ed97f-8c36-4081-a099-a102039b263b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect fd7e0fde-fa8f-4572-b00f-a97f949d0971 / got description=3215a747-f535-4710-8420-c0a980aa6d4c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:169 Joining consumer consumer2 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect d8da3c81-a4dc-41d3-8f35-71388d9e0293 / got description=fd7e0fde-fa8f-4572-b00f-a97f949d0971) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer2 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 already exists INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer2 from master2 ... 
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 was created
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3c54916e-d1f6-4751-b2aa-b225d6522f27 / got description=d8da3c81-a4dc-41d3-8f35-71388d9e0293) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 7ece08a0-9f1f-4bd4-b585-32c3bb0761be / got description=3c54916e-d1f6-4751-b2aa-b225d6522f27) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect b784a66f-ea91-4585-98a7-3e9af426d7f4 / got description=7ece08a0-9f1f-4bd4-b585-32c3bb0761be) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
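The setup log for this suite is the stock lib389 bring-up: install each instance, join the masters, then join and ensure the consumers. A sketch of requesting the same 2-master/2-consumer layout directly; create_topology and ReplicaRole are lib389 names, and the counts simply mirror the instances listed above:

from lib389._constants import ReplicaRole
from lib389.topologies import create_topology

# Produces master1/master2 plus consumer1/consumer2, wired as logged above.
topo = create_topology({ReplicaRole.MASTER: 2, ReplicaRole.CONSUMER: 2})
m1 = topo.ms["master1"]
c1 = topo.cs["consumer1"]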
Passed suites/fractional/fractional_test.py::test_read_only_consumer 0.09
No log output captured.
Passed suites/fractional/fractional_test.py::test_read_write_supplier 3.08
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 75fb2248-b423-44b9-aa99-99afdb59267b / got description=b784a66f-ea91-4585-98a7-3e9af426d7f4) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect ef9e817d-5193-437f-8fd3-c4fdd3200e88 / got description=75fb2248-b423-44b9-aa99-99afdb59267b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect 16a8037a-0ecd-4202-970c-deb131a6cecc / got description=ef9e817d-5193-437f-8fd3-c4fdd3200e88) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
Passed suites/fractional/fractional_test.py::test_filtered_attributes 3.08
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 63e7b72d-a71e-4ec6-9ca1-7a8d8d97f4a5 / got description=16a8037a-0ecd-4202-970c-deb131a6cecc) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 0f281eff-2d05-4fd4-822f-d661146a63ed / got description=63e7b72d-a71e-4ec6-9ca1-7a8d8d97f4a5) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect bd8b3129-b862-48ab-bdfa-e8f5f8b00508 / got description=0f281eff-2d05-4fd4-822f-d661146a63ed) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
Passed suites/fractional/fractional_test.py::test_fewer_changes_in_single_operation 9.22
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 029d9ddc-3f68-4822-9d65-72dd108fd5c9 / got description=bd8b3129-b862-48ab-bdfa-e8f5f8b00508) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 512c450b-a7c0-4e79-8a0b-9a6fc33fd568 / got description=029d9ddc-3f68-4822-9d65-72dd108fd5c9) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect 8ebff87b-fd05-4759-9024-5409e416f73d / got description=512c450b-a7c0-4e79-8a0b-9a6fc33fd568) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5452543e-f416-4f72-992c-fd7cd73e566d / got description=8ebff87b-fd05-4759-9024-5409e416f73d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect bb21d8c1-c064-4b80-a595-c8a714150364 / got description=5452543e-f416-4f72-992c-fd7cd73e566d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect dbac2767-693e-4141-88a8-74006a59ff2c / got description=bb21d8c1-c064-4b80-a595-c8a714150364) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ac725573-16c2-4f7f-abf1-24ac72da9602 / got description=dbac2767-693e-4141-88a8-74006a59ff2c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is 
working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect ea2d0f27-ca7d-44c4-b8dc-b9eb5b78674e / got description=ac725573-16c2-4f7f-abf1-24ac72da9602) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect 52693777-0912-48bb-a3b1-41550ed15d5f / got description=ea2d0f27-ca7d-44c4-b8dc-b9eb5b78674e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
Passed suites/fractional/fractional_test.py::test_newly_added_attribute_nsds5replicatedattributelisttotal 6.09
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5ddaa582-af49-47fa-8c1a-743c35db5c01 / got description=52693777-0912-48bb-a3b1-41550ed15d5f) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5ddaa582-af49-47fa-8c1a-743c35db5c01 / got description=52693777-0912-48bb-a3b1-41550ed15d5f) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5ddaa582-af49-47fa-8c1a-743c35db5c01 / got description=52693777-0912-48bb-a3b1-41550ed15d5f) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5ddaa582-af49-47fa-8c1a-743c35db5c01 / got description=52693777-0912-48bb-a3b1-41550ed15d5f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect e50958ca-de1b-429d-b43d-7078252cc062 / got description=5ddaa582-af49-47fa-8c1a-743c35db5c01) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect 7899c1cd-4a08-444c-8eb5-5ccdd1c05a6f / got description=e50958ca-de1b-429d-b43d-7078252cc062) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
Passed suites/fractional/fractional_test.py::test_attribute_nsds5replicatedattributelisttotal 21.54
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9281b89e-e4eb-4d2e-8d75-1d74464f2c6b / got description=7899c1cd-4a08-444c-8eb5-5ccdd1c05a6f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 152b47d4-9cc1-4d72-848c-fefb3e955776 / got description=9281b89e-e4eb-4d2e-8d75-1d74464f2c6b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect f1fe677d-61f2-4787-9f86-4180871f727e / got description=152b47d4-9cc1-4d72-848c-fefb3e955776) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
Passed suites/fractional/fractional_test.py::test_implicit_replication_of_password_policy 12.37
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e3515266-8402-45b3-8199-6a5f07427615 / got description=f1fe677d-61f2-4787-9f86-4180871f727e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 809ccabc-f317-4479-bcde-4c88a0c23818 / got description=e3515266-8402-45b3-8199-6a5f07427615) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect c3dee9b6-4e82-453f-90ed-d83301d5abf9 / got description=809ccabc-f317-4479-bcde-4c88a0c23818) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c4f8b65-2009-48bf-8708-b1a7f13a74f7 / got description=c3dee9b6-4e82-453f-90ed-d83301d5abf9) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c4f8b65-2009-48bf-8708-b1a7f13a74f7 / got description=c3dee9b6-4e82-453f-90ed-d83301d5abf9) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c4f8b65-2009-48bf-8708-b1a7f13a74f7 / got description=c3dee9b6-4e82-453f-90ed-d83301d5abf9) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8c4f8b65-2009-48bf-8708-b1a7f13a74f7 / got description=c3dee9b6-4e82-453f-90ed-d83301d5abf9) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect e50abe7a-a0ce-4600-a215-16f5ae76e777 / got description=8c4f8b65-2009-48bf-8708-b1a7f13a74f7) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect cdbca795-9fbb-4856-aaae-152b70a26b6f / got description=e50abe7a-a0ce-4600-a215-16f5ae76e777) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 20b1a5b6-a049-43f4-88b4-105b4bd3799a / got description=8c4f8b65-2009-48bf-8708-b1a7f13a74f7) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect e24cbaed-555c-4c50-83b6-fe27fb9a5208 / got description=20b1a5b6-a049-43f4-88b4-105b4bd3799a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect 8562be7b-621c-4c97-8fd9-ef23393a7517 / got description=e24cbaed-555c-4c50-83b6-fe27fb9a5208) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working
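The fractional tests above exercise nsDS5ReplicatedAttributeList and nsDS5ReplicatedAttributeListTotal on the agreement. A hedged sketch of setting them; the "(objectclass=*) $ EXCLUDE ..." value is 389-ds's fractional-replication syntax, while the excluded attributes and the master1 handle are illustrative assumptions:

from lib389.agreement import Agreements

agmt = Agreements(master1).list()[0]
# Attributes skipped during incremental replication sessions.
agmt.replace('nsDS5ReplicatedAttributeList',
             '(objectclass=*) $ EXCLUDE memberOf telephoneNumber')
# Attributes skipped during a total (re)initialization.
agmt.replace('nsDS5ReplicatedAttributeListTotal',
             '(objectclass=*) $ EXCLUDE telephoneNumber')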
Passed suites/get_effective_rights/acceptance_test.py::test_group_aci_entry_exists 0.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:30 Adding user testuser
-------------------------------Captured log call--------------------------------
INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:57 Adding group group1 INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:70 Add an ACI granting add access to a user matching the groupdn INFO  lib389:acceptance_test.py:79 dn: uid=testuser,dc=example,dc=com INFO  lib389:acceptance_test.py:81 ######## entryLevelRights: b'vadn'
Passed suites/get_effective_rights/acceptance_test.py::test_group_aci_template_entry 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.get_effective_rights.acceptance_test:acceptance_test.py:105 Add an ACI granting add access to a user matching the userdn INFO  lib389:acceptance_test.py:115 dn: cn=template_person_objectclass,dc=example,dc=com INFO  lib389:acceptance_test.py:117 ######## entryLevelRights: b'vadn' INFO  lib389:acceptance_test.py:120 dn: cn=template_groupofnames_objectclass,dc=example,dc=com INFO  lib389:acceptance_test.py:122 ######## entryLevelRights: b'none'
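Both results above read the entryLevelRights attribute returned by the Get Effective Rights control. python-ldap ships no dedicated class for this control, so this hypothetical sketch uses a generic LDAPControl with the GER OID; it assumes the control value is the raw "dn: <authzid>" form that 389-ds accepts, and the bind password is a placeholder:

import ldap
from ldap.controls import LDAPControl

GER_OID = '1.3.6.1.4.1.42.2.27.9.5.2'
# Ask for the rights uid=testuser would have on the matched entries.
ger = LDAPControl(GER_OID, True, b'dn: uid=testuser,dc=example,dc=com')

conn = ldap.initialize('ldap://localhost:38901')  # port taken from the setup log
conn.simple_bind_s('cn=Directory Manager', 'password')  # placeholder credentials
msgid = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE, '(uid=testuser)',
                        attrlist=['entryLevelRights', 'attributeLevelRights'],
                        serverctrls=[ger])
rtype, rdata = conn.result(msgid)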
Passed suites/gssapi/simple_gssapi_test.py::test_invalid_sasl_map 0.50
No log output captured.
Passed suites/gssapi/simple_gssapi_test.py::test_missing_user 1.17
------------------------------Captured stdout call------------------------------
Authenticating as principal testuser/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. Principal "doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM" created. Authenticating as principal testuser/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. K/M@HOSTED.UPSHIFT.RDU2.REDHAT.COM doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/changepw@HOSTED.UPSHIFT.RDU2.REDHAT.COM kadmin/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM kiprop/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM krbtgt/HOSTED.UPSHIFT.RDU2.REDHAT.COM@HOSTED.UPSHIFT.RDU2.REDHAT.COM ldap/ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com@HOSTED.UPSHIFT.RDU2.REDHAT.COM testuser@HOSTED.UPSHIFT.RDU2.REDHAT.COM Authenticating as principal testuser/admin@HOSTED.UPSHIFT.RDU2.REDHAT.COM with password. Entry for principal doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/doesnotexist.keytab. Entry for principal doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/tmp/doesnotexist.keytab.
------------------------------Captured stderr call------------------------------
No policy specified for doesnotexist@HOSTED.UPSHIFT.RDU2.REDHAT.COM; defaulting to no policy
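For context, the GSSAPI tests bind with a Kerberos ticket rather than a password. A hedged sketch of the client side with python-ldap (hostname and port are illustrative; kinit must already have obtained a ticket for the principal):

import ldap
import ldap.sasl

conn = ldap.initialize('ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901')
# Interactive SASL/GSSAPI bind; the server's SASL mapping decides which LDAP
# entry the Kerberos principal becomes (test_missing_user exercises a
# principal with no matching entry).
conn.sasl_interactive_bind_s('', ldap.sasl.gssapi())
print(conn.whoami_s())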
Passed suites/gssapi/simple_gssapi_test.py::test_rejected_mech 0.62
No log output captured.
Passed suites/gssapi_repl/gssapi_repl_test.py::test_gssapi_repl 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 1f5c7be2-125e-4642-a50a-2fe14a765c44 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect e7ffdd5e-bdbd-42d5-b784-efc0f81766ae / got description=1f5c7be2-125e-4642-a50a-2fe14a765c44) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
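The setup log above is the standard lib389 two-master bootstrap. A sketch of the same sequence, assuming the ReplicationManager API that emits these messages (m1/m2 are again placeholder DirSrv handles):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(m1)    # enable replication on the first master
repl.join_master(m1, m2)        # bootstrap m2, create agreements both ways
repl.ensure_agreement(m1, m2)   # "Ensuring master master1 to master2 ..."
repl.ensure_agreement(m2, m1)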
Passed suites/healthcheck/health_config_test.py::test_healthcheck_logging_format_should_be_revised 0.87
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSCLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: LOW INFO  LogCapture:health.py:49 Check: config:hr_timestamp INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- cn=config INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 nsslapd-logging-hr-timestamps-enabled changes the log format in directory server from [07/Jun/2017:17:15:58 +1000] to [07/Jun/2017:17:15:58.716117312 +1000] This actually provides a performance improvement. Additionally, this setting will be removed in a future release. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Set nsslapd-logging-hr-timestamps-enabled to on. You can use 'dsconf' to set this attribute. Here is an example: # dsconf slapd-standalone1 config replace nsslapd-logging-hr-timestamps-enabled=on INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSCLE0001", "severity": "LOW", "description": "Different log timestamp format.", "items": [ "cn=config" ], "detail": "nsslapd-logging-hr-timestamps-enabled changes the log format in directory server from\n\n[07/Jun/2017:17:15:58 +1000]\n\nto\n\n[07/Jun/2017:17:15:58.716117312 +1000]\n\nThis actually provides a performance improvement. Additionally, this setting will be\nremoved in a future release.\n", "fix": "Set nsslapd-logging-hr-timestamps-enabled to on.\nYou can use 'dsconf' to set this attribute. Here is an example:\n\n # dsconf slapd-standalone1 config replace nsslapd-logging-hr-timestamps-enabled=on", "check": "config:hr_timestamp" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. 
INFO  LogCapture:health.py:113 []
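Besides the dsconf command quoted in the report, the DSCLE0001 fix can be scripted with lib389; a minimal sketch, assuming inst is a connected DirSrv handle for slapd-standalone1:

# Switch the access log back to high-resolution timestamps.
inst.config.replace('nsslapd-logging-hr-timestamps-enabled', 'on')
assert inst.config.get_attr_val_utf8('nsslapd-logging-hr-timestamps-enabled') == 'on'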
Passed suites/healthcheck/health_config_test.py::test_healthcheck_RI_plugin_is_misconfigured 0.82
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSRILE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: LOW INFO  LogCapture:health.py:49 Check: refint:update_delay INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- cn=referential integrity postoperation,cn=plugins,cn=config INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The referential integrity plugin has an asynchronous processing mode. This is controlled by the update-delay flag. When this value is 0, referential integrity plugin processes these changes inside of the operation that modified the entry - ie these are synchronous. However, when this is > 0, these are performed asynchronously. This leads to only having referint enabled on one master in MMR to prevent replication conflicts and loops. Additionally, because these are performed in the background these updates may cause spurious update delays to your server by batching changes rather than smaller updates during sync processing. We advise that you set this value to 0, and enable referint on all masters as it provides a more predictable behaviour. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Set referint-update-delay to 0. You can use 'dsconf' to set this value. Here is an example: # dsconf slapd-standalone1 plugin referential-integrity set --update-delay 0 You must restart the Directory Server for this change to take effect. INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSRILE0001", "severity": "LOW", "description": "Referential integrity plugin may be slower.", "items": [ "cn=referential integrity postoperation,cn=plugins,cn=config" ], "detail": "The referential integrity plugin has an asynchronous processing mode.\nThis is controlled by the update-delay flag. 
When this value is 0, referential\nintegrity plugin processes these changes inside of the operation that modified\nthe entry - ie these are synchronous.\n\nHowever, when this is > 0, these are performed asynchronously.\n\nThis leads to only having referint enabled on one master in MMR to prevent replication conflicts and loops.\nAdditionally, because these are performed in the background these updates may cause spurious update\ndelays to your server by batching changes rather than smaller updates during sync processing.\n\nWe advise that you set this value to 0, and enable referint on all masters as it provides a more predictable behaviour.\n", "fix": "Set referint-update-delay to 0.\n\nYou can use 'dsconf' to set this value. Here is an example:\n\n # dsconf slapd-standalone1 plugin referential-integrity set --update-delay 0\n\nYou must restart the Directory Server for this change to take effect.", "check": "refint:update_delay" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
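The DSRILE0001 resolution can likewise be applied through lib389's plugin wrapper; a sketch, assuming inst is a DirSrv handle and that this lib389 version exposes set_update_delay:

from lib389.plugins import ReferentialIntegrityPlugin

refint = ReferentialIntegrityPlugin(inst)
refint.set_update_delay(0)  # process referint updates synchronously
inst.restart()              # the report notes a restart is required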
Passed suites/healthcheck/health_config_test.py::test_healthcheck_RI_plugin_missing_indexes 1.25
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSRILE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: refint:attr_indexes INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- cn=referential integrity postoperation,cn=plugins,cn=config INFO  LogCapture:health.py:52 -- dc=example,dc=com INFO  LogCapture:health.py:52 -- member INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The referential integrity plugin is configured to use an attribute (member) that does not have an "equality" index in backend (dc=example,dc=com). Failure to have the proper indexing will lead to unindexed searches which cause high CPU and can significantly slow the server down. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Check the attributes set in "referint-membership-attr" to make sure they have an index defined that has at least the equality "eq" index type. You will need to reindex the database after adding the missing index type. Here is an example using dsconf: # dsconf slapd-standalone1 backend index add --attr=member --reindex --index-type=eq dc=example,dc=com INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSRILE0002", "severity": "HIGH", "description": "Referential integrity plugin configured with unindexed attribute.", "items": [ "cn=referential integrity postoperation,cn=plugins,cn=config", "dc=example,dc=com", "member" ], "detail": "The referential integrity plugin is configured to use an attribute (member)\nthat does not have an \"equality\" index in backend (dc=example,dc=com).\nFailure to have the proper indexing will lead to unindexed searches which\ncause high CPU and can significantly slow the server down.", "fix": "Check the attributes set in \"referint-membership-attr\" to make sure they have\nan index defined that has at least the equality \"eq\" index type. You will\nneed to reindex the database after adding the missing index type. Here is an\nexample using dsconf:\n\n # dsconf slapd-standalone1 backend index add --attr=member --reindex --index-type=eq dc=example,dc=com\n", "check": "refint:attr_indexes" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... 
INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
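For DSRILE0002, the missing equality index can be created with lib389's Indexes collection; a sketch assuming the userRoot backend's index container DN shown below (a reindex, e.g. the dsconf command quoted in the report, is still needed afterwards):

from lib389.index import Indexes

indexes = Indexes(inst, basedn='cn=index,cn=userRoot,cn=ldbm database,cn=plugins,cn=config')
indexes.create(properties={
    'cn': 'member',
    'nsSystemIndex': 'false',
    'nsIndexType': 'eq',   # at least the equality index type
})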
Passed suites/healthcheck/health_config_test.py::test_healthcheck_virtual_attr_incorrectly_indexed 0.82
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSVIRTLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: backends:userroot:virt_attrs INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Virtual Attributes INFO  LogCapture:health.py:52 -- dc=example,dc=com INFO  LogCapture:health.py:52 -- Class Of Service (COS) INFO  LogCapture:health.py:52 -- cosAttribute: postalcode INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 You should not index virtual attributes, as this will break searches that use the attribute in a filter. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Remove the index for this attribute from the backend configuration. Here is an example using 'dsconf' to remove an index: # dsconf slapd-standalone1 backend index delete --attr postalcode dc=example,dc=com INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSVIRTLE0001", "severity": "HIGH", "description": "Virtual attribute indexed.", "items": [ "Virtual Attributes", "dc=example,dc=com", "Class Of Service (COS)", "cosAttribute: postalcode" ], "detail": "You should not index virtual attributes, as this will break searches that\nuse the attribute in a filter.", "fix": "Remove the index for this attribute from the backend configuration.\nHere is an example using 'dsconf' to remove an index:\n\n # dsconf slapd-standalone1 backend index delete --attr postalcode dc=example,dc=com", "check": "backends:userroot:virt_attrs" } ]
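The inverse operation covers DSVIRTLE0001: delete the offending index instead of creating one (same assumed userRoot index container DN as in the sketch above):

from lib389.index import Indexes

indexes = Indexes(inst, basedn='cn=index,cn=userRoot,cn=ldbm database,cn=plugins,cn=config')
indexes.get('postalcode').delete()  # stop indexing the COS-generated attribute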
Passed suites/healthcheck/health_config_test.py::test_healthcheck_low_disk_space 0.48
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 2 Issues found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSVIRTLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: backends:userroot:virt_attrs INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Virtual Attributes INFO  LogCapture:health.py:52 -- dc=example,dc=com INFO  LogCapture:health.py:52 -- Class Of Service (COS) INFO  LogCapture:health.py:52 -- cosAttribute: postalcode INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 You should not index virtual attributes, as this will break searches that use the attribute in a filter. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Remove the index for this attribute from the backend configuration. Here is an example using 'dsconf' to remove an index: # dsconf slapd-standalone1 backend index delete --attr postalcode dc=example,dc=com INFO  LogCapture:health.py:45 [2] DS Lint Error: DSDSLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: monitor-disk-space:disk_space INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Server INFO  LogCapture:health.py:52 -- cn=config INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The disk partition used by the server (/), either for the database, the configuration files, or the logs is over 90% full. If the partition becomes completely filled serious problems can occur with the database or the server's stability. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Attempt to free up disk space. Also try removing old rotated logs, or disable any verbose logging levels that might have been set. You might consider enabling the "Disk Monitoring" feature in cn=config to help prevent a disorderly shutdown of the server: nsslapd-disk-monitoring: on You can use 'dsconf' to set this value. Here is an example: # dsconf slapd-standalone1 config replace nsslapd-disk-monitoring=on You must restart the Directory Server for this change to take effect. 
Please see the Administration guide for more information: https://access.redhat.com/documentation/en-us/red_hat_directory_server/10/html/administration_guide/diskmonitoring INFO  LogCapture:health.py:124 ===== End Of Report (2 Issues found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSVIRTLE0001", "severity": "HIGH", "description": "Virtual attribute indexed.", "items": [ "Virtual Attributes", "dc=example,dc=com", "Class Of Service (COS)", "cosAttribute: postalcode" ], "detail": "You should not index virtual attributes, as this will break searches that\nuse the attribute in a filter.", "fix": "Remove the index for this attribute from the backend configuration.\nHere is an example using 'dsconf' to remove an index:\n\n # dsconf slapd-standalone1 backend index delete --attr postalcode dc=example,dc=com", "check": "backends:userroot:virt_attrs" }, { "dsle": "DSDSLE0001", "severity": "HIGH", "description": "Low disk space.", "items": [ "Server", "cn=config" ], "detail": "The disk partition used by the server (/), either for the database, the\nconfiguration files, or the logs is over 90% full. If the partition becomes\ncompletely filled serious problems can occur with the database or the server's\nstability.", "fix": "Attempt to free up disk space. Also try removing old rotated logs, or disable any\nverbose logging levels that might have been set. You might consider enabling\nthe \"Disk Monitoring\" feature in cn=config to help prevent a disorderly shutdown\nof the server:\n\n nsslapd-disk-monitoring: on\n\nYou can use 'dsconf' to set this value. Here is an example:\n\n # dsconf slapd-standalone1 config replace nsslapd-disk-monitoring=on\n\nYou must restart the Directory Server for this change to take effect.\n\nPlease see the Administration guide for more information:\n\n https://access.redhat.com/documentation/en-us/red_hat_directory_server/10/html/administration_guide/diskmonitoring\n", "check": "monitor-disk-space:disk_space" } ]
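The DSDSLE0001 advice about disk monitoring amounts to one attribute flip; a minimal sketch with lib389 (inst again a DirSrv handle; the report notes a restart is required):

inst.config.replace('nsslapd-disk-monitoring', 'on')
inst.restart()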
Passed suites/healthcheck/health_config_test.py::test_healthcheck_notes_unindexed_search 12.89
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSLOGNOTES0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: Medium INFO  LogCapture:health.py:49 Check: logs:notes INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Performance INFO  LogCapture:health.py:52 -- /var/log/dirsrv/slapd-standalone1/access INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 Found 1 fully unindexed search in the current access log. Unindexed searches can cause high CPU and slow down the entire server's performance. [1] Unindexed Search - date: . - conn/op: 1/2 - base: dc=example,dc=com - scope: 2 - filter: (&(|(objectClass=nsAccount)(objectClass=nsPerson)(objectClass=simpleSecurityObject)(objectClass=organization)(objectClass=person)(objectClass=account)(objectClass=organizationalUnit)(objectClass=netscapeServer)(objectClass=domain)(objectClass=posixAccount)(objectClass=shadowAccount)(objectClass=posixGroup)(objectClass=mailRecipient))(uid=test*)) - etime: 0.371460060 INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Examine the searches that are unindexed, and either properly index the attributes in the filter, increase the nsslapd-idlistscanlimit, or stop using that filter. INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSLOGNOTES0001", "severity": "Medium", "description": "Unindexed Search", "items": [ "Performance", "/var/log/dirsrv/slapd-standalone1/access" ], "detail": "Found 1 fully unindexed search in the current access log.\nUnindexed searches can cause high CPU and slow down the entire server's performance.\n\n [1] Unindexed Search\n - date: .\n - conn/op: 1/2\n - base: dc=example,dc=com\n - scope: 2\n - filter: (&(|(objectClass=nsAccount)(objectClass=nsPerson)(objectClass=simpleSecurityObject)(objectClass=organization)(objectClass=person)(objectClass=account)(objectClass=organizationalUnit)(objectClass=netscapeServer)(objectClass=domain)(objectClass=posixAccount)(objectClass=shadowAccount)(objectClass=posixGroup)(objectClass=mailRecipient))(uid=test*))\n - etime: 0.371460060\n", "fix": "Examine the searches that are unindexed, and either properly index the attributes\nin the filter, increase the nsslapd-idlistscanlimit, or stop using that filter.", "check": "logs:notes" } ]
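This check parses the instance's access log for notes=A (fully unindexed) entries. A test can do the same through lib389's log helper; a sketch assuming the DirsrvAccessLog match API:

# Collect the access-log lines flagged as fully unindexed searches.
unindexed = inst.ds_access_log.match('.*notes=A.*')
for line in unindexed:
    print(line)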
Passed suites/healthcheck/health_config_test.py::test_healthcheck_notes_unknown_attribute 11.31
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSLOGNOTES0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: Medium INFO  LogCapture:health.py:49 Check: logs:notes INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Possible Performance Impact INFO  LogCapture:health.py:52 -- /var/log/dirsrv/slapd-standalone1/access INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 Found 1 search in the current access log that is using an unknown attribute in the search filter. [1] Invalid Attribute in Filter - date: . - conn/op: 1/2 - filter: (&(|(objectClass=nsAccount)(objectClass=nsPerson)(objectClass=simpleSecurityObject)(objectClass=organization)(objectClass=person)(objectClass=account)(objectClass=organizationalUnit)(objectClass=netscapeServer)(objectClass=domain)(objectClass=posixAccount)(objectClass=shadowAccount)(objectClass=posixGroup)(objectClass=mailRecipient))(unknown=test)) INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Stop using these unknown attributes in the filter, or add the schema to the server and make sure it's properly indexed. INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSLOGNOTES0002", "severity": "Medium", "description": "Unknown Attribute In Filter", "items": [ "Possible Performance Impact", "/var/log/dirsrv/slapd-standalone1/access" ], "detail": "Found 1 search in the current access log that is using an\nunknown attribute in the search filter.\n\n [1] Invalid Attribute in Filter\n - date: .\n - conn/op: 1/2\n - filter: (&(|(objectClass=nsAccount)(objectClass=nsPerson)(objectClass=simpleSecurityObject)(objectClass=organization)(objectClass=person)(objectClass=account)(objectClass=organizationalUnit)(objectClass=netscapeServer)(objectClass=domain)(objectClass=posixAccount)(objectClass=shadowAccount)(objectClass=posixGroup)(objectClass=mailRecipient))(unknown=test))\n", "fix": "Stop using these unknown attributes in the filter, or add the schema\nto the server and make sure it's properly indexed.", "check": "logs:notes" } ]
Passed suites/healthcheck/health_repl_test.py::test_healthcheck_replication_replica_not_reachable 1.91
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5e65d634-755d-49b8-9c39-783d53ea2225 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 68e2195c-a671-4e15-89d5-7727e76c8de3 / got description=5e65d634-755d-49b8-9c39-783d53ea2225) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 7f7af088-803a-4eba-84ae-753f1b386baf / got description=68e2195c-a671-4e15-89d5-7727e76c8de3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSREPLLE0005 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: replication:agmts_status INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Agreement INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The replication agreement (002) under "dc=example,dc=com" is not in synchronization, because the consumer server is not reachable. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Check if the consumer is running, and also check the errors log for more information. INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSREPLLE0005", "severity": "MEDIUM", "description": "Replication consumer not reachable.", "items": [ "Replication", "Agreement" ], "detail": "The replication agreement (002) under \"dc=example,dc=com\" is not in synchronization,\nbecause the consumer server is not reachable.", "fix": "Check if the consumer is running, and also check the errors log for more information.", "check": "replication:agmts_status" } ] INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
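DSREPLLE0005 is derived from the agreement's own status. The same information is reachable from lib389, sketched here under the assumption that the Replicas collection, get_agreements, and Agreement.status exist with these names in this lib389 version:

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import Replicas

replica = Replicas(inst).get(DEFAULT_SUFFIX)
for agmt in replica.get_agreements().list():
    print(agmt.status())  # shows whether the consumer is reachable / in sync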
Passed suites/healthcheck/health_repl_test.py::test_healthcheck_changelog_trimming_not_configured 3.76
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSCLLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: LOW INFO  LogCapture:health.py:49 Check: backends:userroot::cl_trimming INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Changelog INFO  LogCapture:health.py:52 -- Backends INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The replication changelog does not have any kind of trimming configured. This will lead to the changelog size growing indefinitely. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Configure changelog trimming, preferably by setting the maximum age of a changelog record. Here is an example: # dsconf slapd-master1 replication set-changelog --suffix YOUR_SUFFIX --max-age 30d INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSCLLE0001", "severity": "LOW", "description": "Changelog trimming not configured.", "items": [ "Replication", "Changelog", "Backends" ], "detail": "The replication changelog does not have any kind of trimming configured. This will\nlead to the changelog size growing indefinitely.", "fix": "Configure changelog trimming, preferably by setting the maximum age of a changelog\nrecord. Here is an example:\n\n # dsconf slapd-master1 replication set-changelog --suffix YOUR_SUFFIX --max-age 30d", "check": "backends:userroot::cl_trimming" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
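Changelog trimming (DSCLLE0001) is a per-backend setting in this server version, which matches the check id backends:userroot:cl_trimming. A sketch, assuming the changelog entry lives under the userRoot backend config at the DN shown below:

from lib389._mapped_object import DSLdapObject

cl = DSLdapObject(inst, 'cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config')
cl.replace('nsslapd-changelogmaxage', '30d')  # trim records older than 30 days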
Passed suites/healthcheck/health_repl_test.py::test_healthcheck_replication_presence_of_conflict_entries 3.73
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect abeb3915-efc4-4010-9919-6f2f96598b68 / got description=3152cc8b-afe5-4d05-b87d-ea9a4ed84ad1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3b7b6290-1eee-4ba0-895d-fd7aa303d55d / got description=abeb3915-efc4-4010-9919-6f2f96598b68) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 402503ad-b4a1-4e07-b6d9-2619141b16d9 / got description=3b7b6290-1eee-4ba0-895d-fd7aa303d55d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSREPLLE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: LOW INFO  LogCapture:health.py:49 Check: replication:conflicts INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Conflict Entries INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 There were 1 conflict entries found under the replication suffix "dc=example,dc=com". INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 While conflict entries are expected to occur in an MMR environment, they should be resolved. In regards to conflict entries there is always the original/counterpart entry that has a normal DN, and then the conflict version of that entry. Technically both entries are valid, you, as the administrator, need to decide which entry you want to keep. First examine/compare both entries to determine which one you want to keep or remove. You can use the CLI tool "dsconf" to resolve the conflict. 
Here is an example: List the conflict entries: # dsconf slapd-master1 repl-conflict list dc=example,dc=com Examine conflict entry and its counterpart entry: # dsconf slapd-master1 repl-conflict compare <DN of conflict entry> Remove conflict entry and keep only the original/counterpart entry: # dsconf slapd-master1 repl-conflict delete <DN of conflict entry> Replace the original/counterpart entry with the conflict entry: # dsconf slapd-master1 repl-conflict swap <DN of conflict entry> INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSREPLLE0002", "severity": "LOW", "description": "Replication conflict entries found.", "items": [ "Replication", "Conflict Entries" ], "detail": "There were 1 conflict entries found under the replication suffix \"dc=example,dc=com\".", "fix": "While conflict entries are expected to occur in an MMR environment, they\nshould be resolved. In regards to conflict entries there is always the original/counterpart\nentry that has a normal DN, and then the conflict version of that entry. Technically both\nentries are valid, you, as the administrator, need to decide which entry you want to keep.\nFirst examine/compare both entries to determine which one you want to keep or remove. You\ncan use the CLI tool \"dsconf\" to resolve the conflict. Here is an example:\n\n List the conflict entries:\n\n # dsconf slapd-master1 repl-conflict list dc=example,dc=com\n\n Examine conflict entry and its counterpart entry:\n\n # dsconf slapd-master1 repl-conflict compare <DN of conflict entry>\n\n Remove conflict entry and keep only the original/counterpart entry:\n\n # dsconf slapd-master1 repl-conflict delete <DN of conflict entry>\n\n Replace the original/counterpart entry with the conflict entry:\n\n # dsconf slapd-master1 repl-conflict swap <DN of conflict entry>\n", "check": "replication:conflicts" } ]
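Conflict entries (DSREPLLE0002) carry the nsds5ReplConflict operational attribute, so they can be listed with a plain subtree search before deciding which copy to keep; a sketch (a lib389 DirSrv handle doubles as a python-ldap connection):

import ldap

conflicts = inst.search_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                          '(nsds5ReplConflict=*)', ['nsds5ReplConflict'])
for dn, attrs in conflicts:
    print(dn, attrs['nsds5ReplConflict'])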
Passed suites/healthcheck/health_repl_test.py::test_healthcheck_replication_out_of_sync_broken 0.96
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 58a347ec-fd84-4d86-85c3-833f56234d80 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect c3c10350-8c5a-43be-8ca9-e00d545baf26 / got description=58a347ec-fd84-4d86-85c3-833f56234d80) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect b92d07c7-6664-4f37-9020-29e9384d0dad / got description=c3c10350-8c5a-43be-8ca9-e00d545baf26) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 402e7feb-be47-493c-a15f-c3d8b81a4b55 / got description=b92d07c7-6664-4f37-9020-29e9384d0dad) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 3 Issues found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSREPLLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: replication:agmts_status INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Agreement INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The replication agreement (002) under "dc=example,dc=com" is not in synchronization. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 You may need to reinitialize this replication agreement. Please check the errors log for more information. If you do need to reinitialize the agreement you can do so using dsconf. Here is an example: # dsconf slapd-master1 repl-agmt init "002" --suffix dc=example,dc=com INFO  LogCapture:health.py:45 [2] DS Lint Error: DSREPLLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: replication:agmts_status INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Agreement INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The replication agreement (003) under "dc=example,dc=com" is not in synchronization. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 You may need to reinitialize this replication agreement. Please check the errors log for more information. If you do need to reinitialize the agreement you can do so using dsconf. Here is an example: # dsconf slapd-master1 repl-agmt init "003" --suffix dc=example,dc=com INFO  LogCapture:health.py:45 [3] DS Lint Error: DSCLLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: LOW INFO  LogCapture:health.py:49 Check: backends:userroot::cl_trimming INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Changelog INFO  LogCapture:health.py:52 -- Backends INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The replication changelog does have any kind of trimming configured. This will lead to the changelog size growing indefinitely. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Configure changelog trimming, preferably by setting the maximum age of a changelog record. 
Here is an example: # dsconf slapd-master1 replication set-changelog --suffix YOUR_SUFFIX --max-age 30d INFO  LogCapture:health.py:124 ===== End Of Report (3 Issues found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSREPLLE0001", "severity": "HIGH", "description": "Replication agreement not set to be synchronized.", "items": [ "Replication", "Agreement" ], "detail": "The replication agreement (002) under \"dc=example,dc=com\" is not in synchronization.", "fix": "You may need to reinitialize this replication agreement. Please check the errors\nlog for more information. If you do need to reinitialize the agreement you can do so\nusing dsconf. Here is an example:\n\n # dsconf slapd-master1 repl-agmt init \"002\" --suffix dc=example,dc=com", "check": "replication:agmts_status" }, { "dsle": "DSREPLLE0001", "severity": "HIGH", "description": "Replication agreement not set to be synchronized.", "items": [ "Replication", "Agreement" ], "detail": "The replication agreement (003) under \"dc=example,dc=com\" is not in synchronization.", "fix": "You may need to reinitialize this replication agreement. Please check the errors\nlog for more information. If you do need to reinitialize the agreement you can do so\nusing dsconf. Here is an example:\n\n # dsconf slapd-master1 repl-agmt init \"003\" --suffix dc=example,dc=com", "check": "replication:agmts_status" }, { "dsle": "DSCLLE0001", "severity": "LOW", "description": "Changelog trimming not configured.", "items": [ "Replication", "Changelog", "Backends" ], "detail": "The replication changelog does not have any kind of trimming configured. This will\nlead to the changelog size growing indefinitely.", "fix": "Configure changelog trimming, preferably by setting the maximum age of a changelog\nrecord. Here is an example:\n\n # dsconf slapd-master1 replication set-changelog --suffix YOUR_SUFFIX --max-age 30d", "check": "backends:userroot::cl_trimming" } ]
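Note: the DSCLLE0001 resolution above can also be scripted. A minimal Python sketch of configuring changelog trimming, assuming a python-ldap connection, the classic cn=changelog5,cn=config entry (newer per-backend changelogs live under a different DN), and placeholder credentials:

    import ldap

    # Bind as a placeholder administrative user (adjust URI/DN/password).
    conn = ldap.initialize("ldap://localhost:389")
    conn.simple_bind_s("cn=Directory Manager", "password")

    # Keep at most 30 days of changelog records, as the report suggests.
    conn.modify_s("cn=changelog5,cn=config",
                  [(ldap.MOD_REPLACE, "nsslapd-changelogmaxage", [b"30d"])])
    conn.unbind_s()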
Passed suites/healthcheck/health_security_test.py::test_healthcheck_insecure_pwd_hash_configured 0.68
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSCLE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: config:passwordscheme INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- cn=config INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 Password storage schemes in Directory Server define how passwords are hashed via a one-way mathematical function for storage. Knowing the hash it is difficult to gain the input, but knowing the input you can easily compare the hash. Many hashes are well known for cryptographic verification properties, but are designed to be *fast* to validate. This is the opposite of what we desire for password storage. In the unlikely event of a disclosure, you want hashes to be *difficult* to verify, as this adds a cost of work to an attacker. In Directory Server, we offer one hash suitable for this (PBKDF2_SHA256) and one hash for "legacy" support (SSHA512). Your configuration does not use these for password storage or the root password storage scheme. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Perform a configuration reset of the values: passwordStorageScheme nsslapd-rootpwstoragescheme IE, stop Directory Server, and in dse.ldif delete these two lines. When Directory Server is started, they will use the server provided defaults that are secure. You can also use 'dsconf' to replace these values. Here is an example: # dsconf slapd-standalone1 config replace passwordStorageScheme=PBKDF2_SHA256 nsslapd-rootpwstoragescheme=PBKDF2_SHA256 INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSCLE0002", "severity": "HIGH", "description": "Weak passwordStorageScheme.", "items": [ "cn=config" ], "detail": "Password storage schemes in Directory Server define how passwords are hashed via a\none-way mathematical function for storage. Knowing the hash it is difficult to gain\nthe input, but knowing the input you can easily compare the hash.\n\nMany hashes are well known for cryptographic verification properties, but are\ndesigned to be *fast* to validate. This is the opposite of what we desire for password\nstorage. In the unlikely event of a disclosure, you want hashes to be *difficult* to\nverify, as this adds a cost of work to an attacker.\n\nIn Directory Server, we offer one hash suitable for this (PBKDF2_SHA256) and one hash\nfor \"legacy\" support (SSHA512).\n\nYour configuration does not use these for password storage or the root password storage\nscheme.\n", "fix": "Perform a configuration reset of the values:\n\npasswordStorageScheme\nnsslapd-rootpwstoragescheme\n\nIE, stop Directory Server, and in dse.ldif delete these two lines. 
When Directory Server\nis started, they will use the server provided defaults that are secure.\n\nYou can also use 'dsconf' to replace these values. Here is an example:\n\n # dsconf slapd-standalone1 config replace passwordStorageScheme=PBKDF2_SHA256 nsslapd-rootpwstoragescheme=PBKDF2_SHA256", "check": "config:passwordscheme" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
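Note: the DSCLE0002 fix can be applied from Python as well. A minimal sketch, assuming `inst` is an open lib389 DirSrv handle (like the standalone fixture these tests use), mirroring the dsconf command above:

    # Reset both storage schemes to the recommended PBKDF2_SHA256.
    inst.config.replace('passwordStorageScheme', 'PBKDF2_SHA256')
    inst.config.replace('nsslapd-rootpwstoragescheme', 'PBKDF2_SHA256')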
Passed suites/healthcheck/health_security_test.py::test_healthcheck_min_allowed_tls_version_too_low 23.58
------------------------------Captured stdout call------------------------------
Setting system policy to LEGACY
Note: System-wide crypto policies are applied on application start-up. It is recommended to restart the system for the change of policies to fully take place.
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSELE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: encryption:check_tls_version INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- cn=encryption,cn=config INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 This Directory Server may not be using strong TLS protocol versions. TLS1.0 is known to have a number of issues with the protocol. Please see: https://tools.ietf.org/html/rfc7457 It is advised you set this value to the maximum possible. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 There are two options for setting the TLS minimum version allowed. You can set "sslVersionMin" in "cn=encryption,cn=config" to a version greater than "TLS1.0" You can also use 'dsconf' to set this value. Here is an example: # dsconf slapd-standalone1 security set --tls-protocol-min=TLS1.2 You must restart the Directory Server for this change to take effect. Or, you can set the system wide crypto policy to FUTURE which will use a higher TLS minimum version, but doing this affects the entire system: # update-crypto-policies --set FUTURE INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSELE0001", "severity": "MEDIUM", "description": "Weak TLS protocol version.", "items": [ "cn=encryption,cn=config" ], "detail": "This Directory Server may not be using strong TLS protocol versions. TLS1.0 is known to\nhave a number of issues with the protocol. Please see:\n\nhttps://tools.ietf.org/html/rfc7457\n\nIt is advised you set this value to the maximum possible.", "fix": "There are two options for setting the TLS minimum version allowed. You\ncan set \"sslVersionMin\" in \"cn=encryption,cn=config\" to a version greater than \"TLS1.0\"\nYou can also use 'dsconf' to set this value. Here is an example:\n\n # dsconf slapd-standalone1 security set --tls-protocol-min=TLS1.2\n\nYou must restart the Directory Server for this change to take effect.\n\nOr, you can set the system wide crypto policy to FUTURE which will use a higher TLS\nminimum version, but doing this affects the entire system:\n\n # update-crypto-policies --set FUTURE", "check": "encryption:check_tls_version" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. 
INFO  LogCapture:health.py:113 []
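Note: the DSELE0001 fix translates to a few lines of lib389. A minimal sketch, assuming `inst` is an open lib389 DirSrv handle; the restart matters because sslVersionMin is only read at startup:

    from lib389.config import Encryption

    # Raise the minimum accepted TLS version, then restart to apply it.
    Encryption(inst).replace('sslVersionMin', 'TLS1.2')
    inst.restart()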
Passed suites/healthcheck/health_security_test.py::test_healthcheck_resolvconf_bad_file_perm 1.27
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSPERMLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: fschecks:file_perms INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- File Permissions INFO  LogCapture:health.py:52 -- /etc/resolv.conf INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The file "/etc/resolv.conf" does not have the expected permissions (644). This can cause issues with replication and chaining. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Change the file permissions: # chmod 644 /etc/resolv.conf INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSPERMLE0001", "severity": "MEDIUM", "description": "Incorrect file permissions.", "items": [ "File Permissions", "/etc/resolv.conf" ], "detail": "The file \"/etc/resolv.conf\" does not have the expected permissions (644). This\ncan cause issues with replication and chaining.", "fix": "Change the file permissions:\n\n # chmod 644 /etc/resolv.conf", "check": "fschecks:file_perms" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
Passed suites/healthcheck/health_security_test.py::test_healthcheck_pwdfile_bad_file_perm 1.57
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSPERMLE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: fschecks:file_perms INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- File Permissions INFO  LogCapture:health.py:52 -- /etc/dirsrv/slapd-standalone1/pwdfile.txt INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The file "/etc/dirsrv/slapd-standalone1/pwdfile.txt" does not have the expected permissions (400). The security database pin/password files should only be readable by the Directory Server user. INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Change the file permissions: # chmod 400 /etc/dirsrv/slapd-standalone1/pwdfile.txt INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSPERMLE0002", "severity": "HIGH", "description": "Incorrect security database file permissions.", "items": [ "File Permissions", "/etc/dirsrv/slapd-standalone1/pwdfile.txt" ], "detail": "The file \"/etc/dirsrv/slapd-standalone1/pwdfile.txt\" does not have the expected permissions (400). The\nsecurity database pin/password files should only be readable by the Directory Server user.", "fix": "Change the file permissions:\n\n # chmod 400 /etc/dirsrv/slapd-standalone1/pwdfile.txt", "check": "fschecks:file_perms" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
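Note: both DSPERMLE fixes reduce to setting the POSIX mode the report expects. A minimal sketch (paths and modes taken from the two reports above; run with sufficient privileges):

    import os

    os.chmod('/etc/resolv.conf', 0o644)                           # DSPERMLE0001
    os.chmod('/etc/dirsrv/slapd-standalone1/pwdfile.txt', 0o400)  # DSPERMLE0002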
Passed suites/healthcheck/health_sync_test.py::test_healthcheck_replication_out_of_sync_not_broken 38.22
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b307c292-5026-4d55-a7d8-a7a53aaedf56 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 1787238d-6df4-47a3-93dc-2b490e759be1 / got description=b307c292-5026-4d55-a7d8-a7a53aaedf56) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect cd02c1d2-302b-495c-950a-7664d4edeb9f / got description=1787238d-6df4-47a3-93dc-2b490e759be1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6a5da66b-2df0-4725-9963-8a2fdd5c6721 / got description=cd02c1d2-302b-495c-950a-7664d4edeb9f) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6a5da66b-2df0-4725-9963-8a2fdd5c6721 / got description=cd02c1d2-302b-495c-950a-7664d4edeb9f) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6a5da66b-2df0-4725-9963-8a2fdd5c6721 / got description=cd02c1d2-302b-495c-950a-7664d4edeb9f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... 
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 1 Issue found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSREPLLE0003 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: replication:agmts_status INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- Replication INFO  LogCapture:health.py:52 -- Agreement INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The replication agreement (001) under "dc=example,dc=com" is not in synchronization. Status message: error (1) can't acquire busy replica (unable to acquire replica: the replica is currently being updated by another supplier.) INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Replication is not in synchronization but it may recover. Continue to monitor this agreement. INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSREPLLE0003", "severity": "MEDIUM", "description": "Unsynchronized replication agreement.", "items": [ "Replication", "Agreement" ], "detail": "The replication agreement (001) under \"dc=example,dc=com\" is not in synchronization.\nStatus message: error (1) can't acquire busy replica (unable to acquire replica: the replica is currently being updated by another supplier.)", "fix": "Replication is not in synchronization but it may recover. Continue to\nmonitor this agreement.", "check": "replication:agmts_status" } ]
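Note: "continue to monitor this agreement" can be automated. A minimal sketch, assuming `inst` is an open lib389 DirSrv handle; it polls the standard nsds5replicaLastUpdateStatus attribute until the busy-replica condition clears:

    import time
    from lib389.agreement import Agreements

    for agmt in Agreements(inst).list():
        for _ in range(10):
            status = agmt.get_attr_val_utf8('nsds5replicaLastUpdateStatus') or ''
            if 'busy' not in status.lower():
                break          # agreement recovered (or failed differently)
            time.sleep(5)      # replica busy; retry shortly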
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_disabled_suffix 0.90
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 2 Issues found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSBLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: backends:userroot:mappingtree INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- userroot INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 This backend may be missing the correct mapping tree references. Mapping Trees allow the directory server to determine which backend an operation is routed to in the absence of other information. This is extremely important for correct functioning of LDAP ADD for example. A correct Mapping tree for this backend must contain the suffix name, the database name and be a backend type. IE: cn=o\3Dexample,cn=mapping tree,cn=config cn: o=example nsslapd-backend: userRoot nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Either you need to create the mapping tree, or you need to repair the related mapping tree. You will need to do this by hand by editing cn=config, or stopping the instance and editing dse.ldif. INFO  LogCapture:health.py:45 [2] DS Lint Error: DSBLE0002 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: HIGH INFO  LogCapture:health.py:49 Check: backends:userroot:search INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- dc=example,dc=com INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 Unable to query the backend. LDAP error ({'msgtype': 101, 'msgid': 26, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Warning: Operation attempted on a disabled node : dc=example,dc=com\n'}) INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Check the server's error and access logs for more information. 
INFO  LogCapture:health.py:124 ===== End Of Report (2 Issues found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSBLE0001", "severity": "MEDIUM", "description": "Possibly incorrect mapping tree.", "items": [ "userroot" ], "detail": "This backend may be missing the correct mapping tree references. Mapping Trees allow\nthe directory server to determine which backend an operation is routed to in the\nabsence of other information. This is extremely important for correct functioning\nof LDAP ADD for example.\n\nA correct Mapping tree for this backend must contain the suffix name, the database name\nand be a backend type. IE:\n\ncn=o\\3Dexample,cn=mapping tree,cn=config\ncn: o=example\nnsslapd-backend: userRoot\nnsslapd-state: backend\nobjectClass: top\nobjectClass: extensibleObject\nobjectClass: nsMappingTree\n\n", "fix": "Either you need to create the mapping tree, or you need to repair the related\nmapping tree. You will need to do this by hand by editing cn=config, or stopping\nthe instance and editing dse.ldif.\n", "check": "backends:userroot:mappingtree" }, { "dsle": "DSBLE0002", "severity": "HIGH", "description": "Unable to query backend.", "items": [ "dc=example,dc=com" ], "detail": "Unable to query the backend. LDAP error ({'msgtype': 101, 'msgid': 26, 'result': 1, 'desc': 'Operations error', 'ctrls': [], 'info': 'Warning: Operation attempted on a disabled node : dc=example,dc=com\\n'})", "fix": "Check the server's error and access logs for more information.", "check": "backends:userroot:search" } ]
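Note: recreating the missing mapping tree by hand is error-prone; a minimal lib389 sketch, assuming `inst` is an open DirSrv handle, with properties mirroring the LDIF in the report:

    from lib389.mappingTree import MappingTrees

    mts = MappingTrees(inst)
    mts.create(properties={
        'cn': 'dc=example,dc=com',      # the suffix name
        'nsslapd-state': 'backend',
        'nsslapd-backend': 'userRoot',  # the database name
    })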
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_standalone 0.23
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_list_checks 0.04
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:90 config:hr_timestamp INFO  LogCapture:health.py:90 config:passwordscheme INFO  LogCapture:health.py:90 backends:userroot:cl_trimming INFO  LogCapture:health.py:90 backends:userroot:mappingtree INFO  LogCapture:health.py:90 backends:userroot:search INFO  LogCapture:health.py:90 backends:userroot:virt_attrs INFO  LogCapture:health.py:90 encryption:check_tls_version INFO  LogCapture:health.py:90 fschecks:file_perms INFO  LogCapture:health.py:90 refint:attr_indexes INFO  LogCapture:health.py:90 refint:update_delay INFO  LogCapture:health.py:90 monitor-disk-space:disk_space INFO  LogCapture:health.py:90 replication:agmts_status INFO  LogCapture:health.py:90 replication:conflicts INFO  LogCapture:health.py:90 dseldif:nsstate INFO  LogCapture:health.py:90 tls:certificate_expiration INFO  LogCapture:health.py:90 logs:notes
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_list_errors 0.00
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:71 DSBLE0001 :: Possibly incorrect mapping tree. INFO  LogCapture:health.py:71 DSBLE0002 :: Unable to query backend. INFO  LogCapture:health.py:71 DSBLE0003 :: Uninitialized backend database. INFO  LogCapture:health.py:71 DSCERTLE0001 :: Certificate about to expire. INFO  LogCapture:health.py:71 DSCERTLE0002 :: Certificate expired. INFO  LogCapture:health.py:71 DSCLE0001 :: Different log timestamp format. INFO  LogCapture:health.py:71 DSCLE0002 :: Weak passwordStorageScheme. INFO  LogCapture:health.py:71 DSCLLE0001 :: Changelog trimming not configured. INFO  LogCapture:health.py:71 DSDSLE0001 :: Low disk space. INFO  LogCapture:health.py:71 DSELE0001 :: Weak TLS protocol version. INFO  LogCapture:health.py:71 DSLOGNOTES0001 :: Unindexed Search INFO  LogCapture:health.py:71 DSLOGNOTES0002 :: Unknown Attribute In Filter INFO  LogCapture:health.py:71 DSPERMLE0001 :: Incorrect file permissions. INFO  LogCapture:health.py:71 DSPERMLE0002 :: Incorrect security database file permissions. INFO  LogCapture:health.py:71 DSREPLLE0001 :: Replication agreement not set to be synchronized. INFO  LogCapture:health.py:71 DSREPLLE0002 :: Replication conflict entries found. INFO  LogCapture:health.py:71 DSREPLLE0003 :: Unsynchronized replication agreement. INFO  LogCapture:health.py:71 DSREPLLE0004 :: Unable to get replication agreement status. INFO  LogCapture:health.py:71 DSREPLLE0005 :: Replication consumer not reachable. INFO  LogCapture:health.py:71 DSRILE0001 :: Referential integrity plugin may be slower. INFO  LogCapture:health.py:71 DSRILE0002 :: Referential integrity plugin configured with unindexed attribute. INFO  LogCapture:health.py:71 DSSKEWLE0001 :: Medium time skew. INFO  LogCapture:health.py:71 DSSKEWLE0002 :: Major time skew. INFO  LogCapture:health.py:71 DSSKEWLE0003 :: Extensive time skew. INFO  LogCapture:health.py:71 DSVIRTLE0001 :: Virtual attribute indexed.
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_check_option 2.55
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... 
INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
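Note: what this test exercises is the per-check invocation of the healthcheck CLI. A minimal sketch of running a single check from Python, assuming the dsctl healthcheck subcommand and its --check option as shipped with this 389-ds-base build:

    import subprocess

    subprocess.run(['dsctl', 'slapd-standalone1', 'healthcheck',
                    '--check', 'config:hr_timestamp'], check=True)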
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_standalone_tls 10.21
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_replication 4.89
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect acf58bf1-2639-4188-ab4a-aeb76d07a2fd / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 4cafa5d6-227f-4fdd-8492-1f58b3a6cd52 / got description=acf58bf1-2639-4188-ab4a-aeb76d07a2fd) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_replication_tls 21.23
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 [] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_backend_missing_mapping_tree 1.78
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:119 2 Issues found! Generating report ... INFO  LogCapture:health.py:45 [1] DS Lint Error: DSBLE0001 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: MEDIUM INFO  LogCapture:health.py:49 Check: backends:userroot:mappingtree INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- userroot INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 This backend may be missing the correct mapping tree references. Mapping Trees allow the directory server to determine which backend an operation is routed to in the absence of other information. This is extremely important for correct functioning of LDAP ADD for example. A correct Mapping tree for this backend must contain the suffix name, the database name and be a backend type. IE: cn=o\3Dexample,cn=mapping tree,cn=config cn: o=example nsslapd-backend: userRoot nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 Either you need to create the mapping tree, or you need to repair the related mapping tree. You will need to do this by hand by editing cn=config, or stopping the instance and editing dse.ldif. INFO  LogCapture:health.py:45 [2] DS Lint Error: DSBLE0003 INFO  LogCapture:health.py:46 -------------------------------------------------------------------------------- INFO  LogCapture:health.py:47 Severity: LOW INFO  LogCapture:health.py:49 Check: backends:userroot:search INFO  LogCapture:health.py:50 Affects: INFO  LogCapture:health.py:52 -- dc=example,dc=com INFO  LogCapture:health.py:53 Details: INFO  LogCapture:health.py:54 ----------- INFO  LogCapture:health.py:55 The backend database has not been initialized yet INFO  LogCapture:health.py:56 Resolution: INFO  LogCapture:health.py:57 ----------- INFO  LogCapture:health.py:58 You need to import an LDIF file, or create the suffix entry, in order to initialize the database. 
INFO  LogCapture:health.py:124 ===== End Of Report (2 Issues found) ===== INFO  LogCapture:health.py:126 [ { "dsle": "DSBLE0001", "severity": "MEDIUM", "description": "Possibly incorrect mapping tree.", "items": [ "userroot" ], "detail": "This backend may be missing the correct mapping tree references. Mapping Trees allow\nthe directory server to determine which backend an operation is routed to in the\nabsence of other information. This is extremely important for correct functioning\nof LDAP ADD for example.\n\nA correct Mapping tree for this backend must contain the suffix name, the database name\nand be a backend type. IE:\n\ncn=o\\3Dexample,cn=mapping tree,cn=config\ncn: o=example\nnsslapd-backend: userRoot\nnsslapd-state: backend\nobjectClass: top\nobjectClass: extensibleObject\nobjectClass: nsMappingTree\n\n", "fix": "Either you need to create the mapping tree, or you need to repair the related\nmapping tree. You will need to do this by hand by editing cn=config, or stopping\nthe instance and editing dse.ldif.\n", "check": "backends:userroot:mappingtree" }, { "dsle": "DSBLE0003", "severity": "LOW", "description": "Uninitialized backend database.", "items": [ "dc=example,dc=com" ], "detail": "The backend database has not been initialized yet", "fix": "You need to import an LDIF file, or create the suffix entry, in order to initialize the database.", "check": "backends:userroot:search" } ] INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ... INFO  LogCapture:health.py:99 Checking config:hr_timestamp ... INFO  LogCapture:health.py:99 Checking config:passwordscheme ... INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ... INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ... INFO  LogCapture:health.py:99 Checking backends:userroot:search ... INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ... INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ... INFO  LogCapture:health.py:99 Checking fschecks:file_perms ... INFO  LogCapture:health.py:99 Checking refint:attr_indexes ... INFO  LogCapture:health.py:99 Checking refint:update_delay ... INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ... INFO  LogCapture:health.py:99 Checking replication:agmts_status ... INFO  LogCapture:health.py:99 Checking replication:conflicts ... INFO  LogCapture:health.py:99 Checking dseldif:nsstate ... INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ... INFO  LogCapture:health.py:99 Checking logs:notes ... INFO  LogCapture:health.py:106 Healthcheck complete. INFO  LogCapture:health.py:111 No issues found. INFO  LogCapture:health.py:113 []
Passed suites/healthcheck/healthcheck_test.py::test_healthcheck_database_not_initialized 0.47
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1'} was created.
-------------------------------Captured log call--------------------------------
INFO  LogCapture:health.py:94 Beginning lint report, this could take a while ...
INFO  LogCapture:health.py:99 Checking config:hr_timestamp ...
INFO  LogCapture:health.py:99 Checking config:passwordscheme ...
INFO  LogCapture:health.py:99 Checking backends:userroot:cl_trimming ...
INFO  LogCapture:health.py:99 Checking backends:userroot:mappingtree ...
INFO  LogCapture:health.py:99 Checking backends:userroot:search ...
INFO  LogCapture:health.py:99 Checking backends:userroot:virt_attrs ...
INFO  LogCapture:health.py:99 Checking encryption:check_tls_version ...
INFO  LogCapture:health.py:99 Checking fschecks:file_perms ...
INFO  LogCapture:health.py:99 Checking refint:attr_indexes ...
INFO  LogCapture:health.py:99 Checking refint:update_delay ...
INFO  LogCapture:health.py:99 Checking monitor-disk-space:disk_space ...
INFO  LogCapture:health.py:99 Checking replication:agmts_status ...
INFO  LogCapture:health.py:99 Checking replication:conflicts ...
INFO  LogCapture:health.py:99 Checking dseldif:nsstate ...
INFO  LogCapture:health.py:99 Checking tls:certificate_expiration ...
INFO  LogCapture:health.py:99 Checking logs:notes ...
INFO  LogCapture:health.py:106 Healthcheck complete.
INFO  LogCapture:health.py:119 1 Issue found! Generating report ...
INFO  LogCapture:health.py:45 [1] DS Lint Error: DSBLE0003
INFO  LogCapture:health.py:46 --------------------------------------------------------------------------------
INFO  LogCapture:health.py:47 Severity: LOW
INFO  LogCapture:health.py:49 Check: backends:userroot:search
INFO  LogCapture:health.py:50 Affects:
INFO  LogCapture:health.py:52 -- dc=example,dc=com
INFO  LogCapture:health.py:53 Details:
INFO  LogCapture:health.py:54 -----------
INFO  LogCapture:health.py:55 The backend database has not been initialized yet
INFO  LogCapture:health.py:56 Resolution:
INFO  LogCapture:health.py:57 -----------
INFO  LogCapture:health.py:58 You need to import an LDIF file, or create the suffix entry, in order to initialize the database.
INFO  LogCapture:health.py:124 ===== End Of Report (1 Issue found) =====
INFO  LogCapture:health.py:126 [
    {
        "dsle": "DSBLE0003",
        "severity": "LOW",
        "description": "Uninitialized backend database.",
        "items": [
            "dc=example,dc=com"
        ],
        "detail": "The backend database has not been initialized yet",
        "fix": "You need to import an LDIF file, or create the suffix entry, in order to initialize the database.",
        "check": "backends:userroot:search"
    }
]
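The DSBLE0003 resolution above can also be done programmatically. A minimal sketch of the "create the suffix entry" option, again assuming a connected DirSrv `inst` and the dc=example,dc=com suffix from the report:

from lib389.idm.domain import Domain

# Creating the root suffix entry initializes the empty backend,
# which clears DSBLE0003 on the next healthcheck run.
domain = Domain(inst, 'dc=example,dc=com')
domain.create(properties={'dc': 'example', 'description': 'dc=example,dc=com'})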
Passed suites/import/import_test.py::test_import_with_index 8.36
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/import/import_test.py::test_crash_on_ldif2db 4.98
No log output captured.
Passed suites/import/import_test.py::test_ldif2db_allows_entries_without_a_parent_to_be_imported 5.50
-------------------------------Captured log call--------------------------------
ERROR  lib389:__init__.py:2647 ldif2db: Can't find file: /var/lib/dirsrv/slapd-standalone1/ldif/bogus.ldif
Passed suites/import/import_test.py::test_issue_a_warning_if_the_cache_size_is_smaller 7.36
No log output captured.
Passed suites/import/import_test.py::test_entry_with_escaped_characters_fails_to_import_and_index 12.88
------------------------------Captured stderr call------------------------------
[30/Oct/2020:21:36:58.524080353 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 [30/Oct/2020:21:36:58.534872979 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7504166912, process usage 22921216 [30/Oct/2020:21:36:58.539200561 -0400] - INFO - check_and_set_import_cache - Import allocates 16384KB import cache. [30/Oct/2020:21:36:58.544563372 -0400] - INFO - bdb_copy_directory - Backing up file 0 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/givenName.db) [30/Oct/2020:21:36:58.548885984 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/givenName.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/givenName.db [30/Oct/2020:21:36:58.553627915 -0400] - INFO - bdb_copy_directory - Backing up file 1 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/aci.db) [30/Oct/2020:21:36:58.558899484 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/aci.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/aci.db [30/Oct/2020:21:36:58.563091246 -0400] - INFO - bdb_copy_directory - Backing up file 2 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/sn.db) [30/Oct/2020:21:36:58.567057553 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/sn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/sn.db [30/Oct/2020:21:36:58.570884447 -0400] - INFO - bdb_copy_directory - Backing up file 3 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/numsubordinates.db) [30/Oct/2020:21:36:58.574282737 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/numsubordinates.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/numsubordinates.db [30/Oct/2020:21:36:58.578190063 -0400] - INFO - bdb_copy_directory - Backing up file 4 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/nsuniqueid.db) [30/Oct/2020:21:36:58.581545795 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/nsuniqueid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/nsuniqueid.db [30/Oct/2020:21:36:58.585109243 -0400] - INFO - bdb_copy_directory - Backing up file 5 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/ancestorid.db) [30/Oct/2020:21:36:58.589257990 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/ancestorid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/ancestorid.db [30/Oct/2020:21:36:58.592619110 -0400] - INFO - bdb_copy_directory - Backing up file 6 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/parentid.db) [30/Oct/2020:21:36:58.596070140 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/parentid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/parentid.db [30/Oct/2020:21:36:58.599792658 -0400] - INFO - bdb_copy_directory - Backing up file 7 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/mail.db) [30/Oct/2020:21:36:58.603912482 -0400] - INFO - dblayer_copyfile - Copying 
/var/lib/dirsrv/slapd-standalone1/db/userRoot/mail.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/mail.db [30/Oct/2020:21:36:58.607903302 -0400] - INFO - bdb_copy_directory - Backing up file 8 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/telephoneNumber.db) [30/Oct/2020:21:36:58.611415484 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/telephoneNumber.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/telephoneNumber.db [30/Oct/2020:21:36:58.614986935 -0400] - INFO - bdb_copy_directory - Backing up file 9 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/cn.db) [30/Oct/2020:21:36:58.620532307 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/cn.db [30/Oct/2020:21:36:58.624234774 -0400] - INFO - bdb_copy_directory - Backing up file 10 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/roomNumber.db) [30/Oct/2020:21:36:58.628137779 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/roomNumber.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/roomNumber.db [30/Oct/2020:21:36:58.636853681 -0400] - INFO - bdb_copy_directory - Backing up file 11 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/DBVERSION) [30/Oct/2020:21:36:58.640244593 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/DBVERSION [30/Oct/2020:21:36:58.644519651 -0400] - INFO - bdb_copy_directory - Backing up file 12 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/uid.db) [30/Oct/2020:21:36:58.648585716 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/uid.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/uid.db [30/Oct/2020:21:36:58.652619382 -0400] - INFO - bdb_copy_directory - Backing up file 13 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/entryrdn.db) [30/Oct/2020:21:36:58.655946013 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/entryrdn.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/entryrdn.db [30/Oct/2020:21:36:58.659421907 -0400] - INFO - bdb_copy_directory - Backing up file 14 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/id2entry.db) [30/Oct/2020:21:36:58.662767515 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/id2entry.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/id2entry.db [30/Oct/2020:21:36:58.666415022 -0400] - INFO - bdb_copy_directory - Backing up file 15 (/var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/objectclass.db) [30/Oct/2020:21:36:58.670146604 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/userRoot/objectclass.db to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/userRoot/objectclass.db [30/Oct/2020:21:36:58.676494792 -0400] - INFO - upgradedb_core - userRoot: Start upgradedb. 
[30/Oct/2020:21:36:58.681338321 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7503970304, process usage 23879680 [30/Oct/2020:21:36:58.692460987 -0400] - INFO - check_and_set_import_cache - Import allocates 16384KB import cache. [30/Oct/2020:21:36:58.737646414 -0400] - INFO - bdb_import_main - reindex userRoot: Index buffering enabled with bucket size 17 [30/Oct/2020:21:36:59.444014541 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers finished; cleaning up... [30/Oct/2020:21:36:59.648634139 -0400] - INFO - import_monitor_threads - reindex userRoot: Workers cleaned up. [30/Oct/2020:21:36:59.693601169 -0400] - INFO - bdb_import_main - reindex userRoot: Cleaning up producer thread... [30/Oct/2020:21:36:59.698098771 -0400] - INFO - bdb_import_main - reindex userRoot: Indexing complete. Post-processing... [30/Oct/2020:21:36:59.701668422 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numsubordinates (this may take several minutes to complete)... [30/Oct/2020:21:36:59.705629685 -0400] - INFO - bdb_import_main - reindex userRoot: Generating numSubordinates complete. [30/Oct/2020:21:36:59.710055075 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Gathering ancestorid non-leaf IDs... [30/Oct/2020:21:36:59.713733229 -0400] - INFO - bdb_get_nonleaf_ids - reindex userRoot: Finished gathering ancestorid non-leaf IDs. [30/Oct/2020:21:36:59.716933061 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Starting sort of ancestorid non-leaf IDs... [30/Oct/2020:21:36:59.720363583 -0400] - INFO - ldbm_get_nonleaf_ids - reindex userRoot: Finished sort of ancestorid non-leaf IDs. [30/Oct/2020:21:36:59.729032766 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Creating ancestorid index (new idl)... [30/Oct/2020:21:36:59.732550527 -0400] - INFO - bdb_ancestorid_new_idl_create_index - reindex userRoot: Created ancestorid index (new idl). [30/Oct/2020:21:36:59.736037919 -0400] - INFO - bdb_import_main - reindex userRoot: Flushing caches... [30/Oct/2020:21:36:59.739147114 -0400] - INFO - bdb_import_main - reindex userRoot: Closing files... [30/Oct/2020:21:36:59.765783343 -0400] - INFO - bdb_import_main - reindex userRoot: Reindexing complete. Processed 15 entries in 1 seconds. (15.00 entries/sec) [30/Oct/2020:21:36:59.770913881 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/log.0000000001 to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/log.0000000001 [30/Oct/2020:21:36:59.789163217 -0400] - INFO - dblayer_copyfile - Copying /var/lib/dirsrv/slapd-standalone1/db/DBVERSION to /var/lib/dirsrv/slapd-standalone1/bak/reindex_2020-10-30T21:36:58.383021/DBVERSION [30/Oct/2020:21:36:59.793478669 -0400] - INFO - bdb_pre_close - All database threads now stopped
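The stderr above is an offline reindex: bdb backs up every index file, rebuilds them, then regenerates numSubordinates and the ancestorid index. For reference, a sketch of requesting the equivalent online reindex task through lib389 (assuming a connected DirSrv `inst`; the attribute name is illustrative):

from lib389._constants import DEFAULT_SUFFIX
from lib389.properties import TASK_WAIT

# Reindex a single attribute under the suffix and block until the task finishes.
inst.tasks.reindex(suffix=DEFAULT_SUFFIX, attrname='cn', args={TASK_WAIT: True})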
Passed suites/import/regression_test.py::test_replay_import_operation 30.41
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:94 Exporting LDIF online... INFO  tests.suites.import.regression_test:regression_test.py:53 Run. INFO  tests.suites.import.regression_test:regression_test.py:104 Importing LDIF online, should raise operation error. INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 1 times so far. INFO  tests.suites.import.regression_test:regression_test.py:115 Looping. Tried 2 times so far. INFO  tests.suites.import.regression_test:regression_test.py:57 Adding users. INFO  tests.suites.import.regression_test:regression_test.py:119 Importing LDIF online
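The replay test above runs an online export, then retries an online import until the server accepts it while users are being added concurrently. A sketch of the underlying lib389 task calls it loops over (the LDIF path is illustrative; `inst` is an assumed DirSrv):

from lib389._constants import DEFAULT_SUFFIX
from lib389.properties import TASK_WAIT

# Online export, then an online import of the same LDIF, waiting on each task.
inst.tasks.exportLDIF(suffix=DEFAULT_SUFFIX, output_file='/tmp/export.ldif', args={TASK_WAIT: True})
inst.tasks.importLDIF(suffix=DEFAULT_SUFFIX, input_file='/tmp/export.ldif', args={TASK_WAIT: True})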
Passed suites/import/regression_test.py::test_import_be_default 10.84
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:143 Adding suffix:dc=default,dc=com and backend: default... INFO  tests.suites.import.regression_test:regression_test.py:149 Create LDIF file and import it... INFO  tests.suites.import.regression_test:regression_test.py:154 Stopping the server and running offline import... INFO  tests.suites.import.regression_test:regression_test.py:160 Verifying entry count after import... INFO  tests.suites.import.regression_test:regression_test.py:166 Test PASSED
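The log above stops the server and imports offline into the "default" backend. A sketch of that stop/ldif2db/start sequence with lib389 — the positional ldif2db(bename, suffixes, excludeSuffixes, encrypt, import_file) signature shown here is an assumption, and the LDIF path is illustrative:

# Offline import: the server must be down while ldif2db rewrites the backend.
inst.stop()
assert inst.ldif2db('default', None, None, None, '/tmp/default.ldif')
inst.start()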
Passed suites/import/regression_test.py::test_del_suffix_import 7.00
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:183 Adding suffix:dc=importest1,dc=com and backend: importest1 INFO  tests.suites.import.regression_test:regression_test.py:188 Create LDIF file and import it INFO  tests.suites.import.regression_test:regression_test.py:194 Stopping the server and running offline import INFO  tests.suites.import.regression_test:regression_test.py:199 Deleting suffix-dc=importest2,dc=com INFO  tests.suites.import.regression_test:regression_test.py:202 Adding the same database-importest1 after deleting it
Passed suites/import/regression_test.py::test_del_suffix_backend 9.66
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:221 Adding suffix:dc=importest2,dc=com and backend: importest2 INFO  tests.suites.import.regression_test:regression_test.py:226 Create LDIF file and import it INFO  lib389:tasks.py:498 Import task import_10302020_213805 for file /var/lib/dirsrv/slapd-standalone1/ldif/suffix_del2.ldif completed successfully INFO  tests.suites.import.regression_test:regression_test.py:234 Deleting suffix-dc=importest2,dc=com INFO  tests.suites.import.regression_test:regression_test.py:237 Adding the same database-importest2 after deleting it INFO  tests.suites.import.regression_test:regression_test.py:240 Checking if server can be restarted after re-adding the same database
Passed suites/import/regression_test.py::test_import_duplicate_dn 15.30
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:266 Delete the previous error logs INFO  tests.suites.import.regression_test:regression_test.py:269 Create import file INFO  tests.suites.import.regression_test:regression_test.py:293 Import ldif with duplicate entry ERROR  lib389:tasks.py:495 Error: import task import_10302020_213817 for file /var/lib/dirsrv/slapd-standalone1/ldif/data.ldif exited with -23 INFO  tests.suites.import.regression_test:regression_test.py:296 Restart the server to flush the logs INFO  tests.suites.import.regression_test:regression_test.py:299 Error log should not have "unable to flush" message INFO  tests.suites.import.regression_test:regression_test.py:302 Error log should have "Duplicated DN detected" message
Passed suites/import/regression_test.py::test_large_ldif2db_ancestorid_index_creation 630.75
-------------------------------Captured log call--------------------------------
INFO  tests.suites.import.regression_test:regression_test.py:355 Delete the previous error logs INFO  tests.suites.import.regression_test:regression_test.py:358 Add suffix:o=test and backend: test... INFO  tests.suites.import.regression_test:regression_test.py:371 Create a large nested ldif file using dbgen : /var/lib/dirsrv/slapd-standalone1/ldif/large_nested.ldif INFO  tests.suites.import.regression_test:regression_test.py:374 Stop the server and run offline import... INFO  tests.suites.import.regression_test:regression_test.py:379 Starting the server INFO  tests.suites.import.regression_test:regression_test.py:382 parse the error logs to check lines with "Starting sort of ancestorid" are present INFO  tests.suites.import.regression_test:regression_test.py:386 parse the error logs to check lines with "Finished sort of ancestorid" are present INFO  tests.suites.import.regression_test:regression_test.py:390 parse the error logs for the line with "Gathering ancestorid non-leaf IDs" INFO  tests.suites.import.regression_test:regression_test.py:394 parse the error logs for the line with "Created ancestorid index" INFO  tests.suites.import.regression_test:regression_test.py:398 get the ancestorid non-leaf IDs indexing start and end time from the collected strings INFO  tests.suites.import.regression_test:regression_test.py:404 Calculate the elapsed time for the ancestorid non-leaf IDs index creation
Passed suites/indexes/regression_test.py::test_reindex_task_creates_abandoned_index_file 12.86
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:798 Index task index_all_10302020_214914 completed successfully INFO  lib389:tasks.py:798 Index task index_all_10302020_214918 completed successfully INFO  lib389:tasks.py:798 Index task index_all_10302020_214924 completed successfully
Passed suites/lib389/config_compare_test.py::test_config_compare 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_mul_explicit_rdn 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_mul_derive_single_dn 0.02
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_mul_derive_mult_dn 0.15
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_explicit_dn 0.02
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_explicit_rdn 0.02
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_derive_single_dn 0.02
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_derive_mult_dn 0.02
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_invalid_no_basedn 0.00
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_invalid_no_rdn 0.00
No log output captured.
Passed suites/lib389/dsldapobject/dn_construct_test.py::test_sin_non_present_rdn 0.27
No log output captured.
Passed suites/lib389/idm/user_compare_i2_test.py::test_user_compare_i2 0.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/lib389/idm/user_compare_m2Repl_test.py::test_user_compare_m2Repl 1.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 628e48c5-8719-4c05-b133-cf566a3973b9 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d1dd0ad0-b623-477a-8224-10e5e6a342a9 / got description=628e48c5-8719-4c05-b133-cf566a3973b9) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e4bf4793-b81e-4f8d-b037-b85502e45dc5 / got description=d1dd0ad0-b623-477a-8224-10e5e6a342a9) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/lib389/idm/user_compare_st_test.py::test_user_compare 0.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logexpirationtime-invalid_vals0-valid_vals0] 1.10
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[maxlogsize-invalid_vals1-valid_vals1] 0.12
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logmaxdiskspace-invalid_vals2-valid_vals2] 0.11
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logminfreediskspace-invalid_vals3-valid_vals3] 0.13
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[mode-invalid_vals4-valid_vals4] 0.20
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[maxlogsperdir-invalid_vals5-valid_vals5] 0.13
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logrotationsynchour-invalid_vals6-valid_vals6] 0.36
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logrotationsyncmin-invalid_vals7-valid_vals7] 0.11
No log output captured.
Passed suites/logging/logging_config_test.py::test_logging_digit_config[logrotationtime-invalid_vals8-valid_vals8] 0.12
No log output captured.
Passed suites/mapping_tree/acceptance_test.py::test_invalid_mt 0.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/mapping_tree/be_del_and_default_naming_attr_test.py::test_be_delete 1.16
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology.
Passed suites/mapping_tree/referral_during_tot_init_test.py::test_referral_during_tot 7.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6bf3ba97-5872-4ff6-a11b-d7f34d69f0c8 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect a6534bd9-2891-4895-bb0b-215253a12d64 / got description=6bf3ba97-5872-4ff6-a11b-d7f34d69f0c8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
------------------------------Captured stderr call------------------------------
[30/Oct/2020:21:52:27.568749074 -0400] - INFO - slapd_exemode_ldif2db - Backend Instance: userRoot
Passed suites/memberof_plugin/regression_test.py::test_memberof_with_repl 77.89
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for hub1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:524 Creating replication topology. INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect bb8a96d8-ec77-4b2e-b72a-92693b8a198a / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is working INFO  lib389.replica:replica.py:2211 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6fbc8f65-3244-4f3a-b1b8-d891f5337782 / got description=bb8a96d8-ec77-4b2e-b72a-92693b8a198a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working
-------------------------------Captured log call--------------------------------
INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:77 update cn=101,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:77 update cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal
Passed suites/memberof_plugin/regression_test.py::test_scheme_violation_errors_logged 5.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8213923c-706f-4544-8957-bd1dd3ba9ad5 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3f1f81f6-6faa-4a41-83f4-b5c1eae28a94 / got description=8213923c-706f-4544-8957-bd1dd3ba9ad5) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:319 memberOf attr value - cn=group1,ou=groups,dc=example,dc=com INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:322 pattern = .*oc_check_allowed_sv.*uid=user_,ou=people,dc=example,dc=com.*memberOf.*not allowed.*
Passed suites/memberof_plugin/regression_test.py::test_memberof_with_changelog_reset 57.29
-------------------------------Captured log call--------------------------------
INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:354 Configure memberof on M1 and M2 INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:365 On M1, add 999 test entries allowing memberof INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:51 Adding 999 users INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:368 On M1, add a group with these 999 entries as members INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:376 Adding the test group using async function INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:386 Check the log messages for error INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:390 Check that the replication is working fine both ways, M1 <-> M2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c21b6d2d-abc4-4adb-ad21-8a1091359dff / got description=3f1f81f6-6faa-4a41-83f4-b5c1eae28a94) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 64f71aa9-642c-4052-ae2c-cac26c60f3d7 / got description=c21b6d2d-abc4-4adb-ad21-8a1091359dff) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 64f71aa9-642c-4052-ae2c-cac26c60f3d7 / got description=c21b6d2d-abc4-4adb-ad21-8a1091359dff) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/memberof_plugin/regression_test.py::test_memberof_group 5.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.memberof_plugin.regression_test:regression_test.py:481 Enable memberof plugin and set the scope as cn=sub1,dc=example,dc=com INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:427 Renaming user (cn=g2,cn=sub2,dc=example,dc=com): new cn=g2-new INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m1,cn=sub1,dc=example,dc=com: memberof->b'cn=g2-new,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g1,cn=sub1,dc=example,dc=com' INFO  lib389:regression_test.py:440 !!!!!!! uid=test_m2,cn=sub1,dc=example,dc=com: memberof->b'cn=g2-new,cn=sub1,dc=example,dc=com'
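The call log above enables MemberOf and scopes it to cn=sub1,dc=example,dc=com before exercising group renames. A compact sketch of that configuration through lib389's plugin API (assuming a connected DirSrv `inst`):

from lib389.plugins import MemberOfPlugin

memberof = MemberOfPlugin(inst)
memberof.enable()
# Restrict fixups to the subtree the test operates under.
memberof.replace('memberOfEntryScope', 'cn=sub1,dc=example,dc=com')
inst.restart()  # plugin enable/config changes take effect on restart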
Passed suites/memberof_plugin/regression_test.py::test_entrycache_on_modrdn_failure 9.31
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user0,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user1,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user2,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user3,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user4,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user5,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user6,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user7,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user8,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:575 Adding user (cn=user9,ou=people,dc=example,dc=com): INFO  lib389:regression_test.py:596 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:596 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:618 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com') INFO  lib389:regression_test.py:618 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com') INFO  lib389:regression_test.py:633 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:633 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:633 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:633 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:672 retrieve: cn=group_admin,ou=permissions,dc=example,dc=com with desc=None INFO  lib389:regression_test.py:672 retrieve: cn=group_modify,ou=permissions,dc=example,dc=com with desc=None INFO  lib389:regression_test.py:672 retrieve: cn=group_in0,ou=people,dc=example,dc=com with desc=b'mygroup' INFO  lib389:regression_test.py:672 retrieve: cn=group_in1,ou=people,dc=example,dc=com with desc=b'mygroup' INFO  lib389:regression_test.py:672 retrieve: cn=group_out2,dc=example,dc=com with desc=b'this is to check that the entry having this description has the appropriate DN'
Passed suites/memberof_plugin/regression_test.py::test_silent_memberof_failure 10.26
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user0,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user1,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user2,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user3,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user4,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user5,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user6,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user7,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user8,ou=people,dc=example,dc=com): CRITICAL tests.suites.memberof_plugin.regression_test:regression_test.py:738 Adding user (cn=user9,ou=people,dc=example,dc=com): INFO  lib389:regression_test.py:759 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:759 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in0,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:781 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com') INFO  lib389:regression_test.py:781 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_out1,dc=example,dc=com') INFO  lib389:regression_test.py:796 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:796 !!!!!!! cn=user0,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:796 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in0,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:796 !!!!!!! cn=user1,ou=people,dc=example,dc=com: memberof->b'cn=group_in1,ou=people,dc=example,dc=com' (vs b'cn=group_in1,ou=people,dc=example,dc=com') INFO  lib389:regression_test.py:833 Should assert cn=user2,ou=people,dc=example,dc=com has memberof is False INFO  lib389:regression_test.py:833 Should assert cn=user3,ou=people,dc=example,dc=com has memberof is False INFO  lib389:regression_test.py:856 Should assert cn=user4,ou=people,dc=example,dc=com has memberof is False INFO  lib389:regression_test.py:856 Should assert cn=user5,ou=people,dc=example,dc=com has memberof is False
Passed suites/monitor/monitor_test.py::test_monitor 0.27
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.monitor.monitor_test:monitor_test.py:44 connection: ['1:20201031015721Z:3:2:-:cn=directory manager:0:0:0:1:ip=127.0.0.1'], currentconnections: ['1'], totalconnections: ['1'] INFO  tests.suites.monitor.monitor_test:monitor_test.py:48 version :: ['1:20201031015721Z:6:5:-:cn=directory manager:0:0:0:1:ip=127.0.0.1'] INFO  tests.suites.monitor.monitor_test:monitor_test.py:52 threads: ['16'],currentconnectionsatmaxthreads: ['0'],maxthreadsperconnhits: ['0'] INFO  tests.suites.monitor.monitor_test:monitor_test.py:56 nbackends: ['1'], backendmonitordn: ['cn=monitor,cn=userRoot,cn=ldbm database,cn=plugins,cn=config'] INFO  tests.suites.monitor.monitor_test:monitor_test.py:60 opsinitiated: ['12'], opscompleted: ['13'] INFO  tests.suites.monitor.monitor_test:monitor_test.py:64 dtablesize: ['1024'],readwaiters: ['0'],entriessent: ['14'],bytessent: ['1099'],currenttime: ['20201031015722Z'],starttime: ['20201031015721Z']
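The counters logged above all come from the cn=monitor entry. A sketch of reading a few of them directly with lib389 (Monitor wraps cn=monitor; the attribute names match the log, and `inst` is an assumed DirSrv):

from lib389.monitor import Monitor

monitor = Monitor(inst)
# Each read returns the string form of one monitor attribute.
version = monitor.get_attr_val_utf8('version')
threads = monitor.get_attr_val_utf8('threads')
currentconnections = monitor.get_attr_val_utf8('currentconnections')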
Passed suites/monitor/monitor_test.py::test_monitor_ldbm 0.03
No log output captured.
Passed suites/monitor/monitor_test.py::test_monitor_backend 0.01
No log output captured.
Passed suites/openldap_2_389/migrate_test.py::test_parse_openldap_slapdd 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389.migrate.openldap.config:config.py:264 Examining OpenLDAP Configuration ... INFO  lib389.migrate.openldap.config:config.py:285 Completed OpenLDAP Configuration Parsing.
Passed suites/openldap_2_389/migrate_test.py::test_migrate_openldap_slapdd 23.83
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
==== migration plan ====
SchemaAttributeCreate -> ('pseudonym',)
SchemaAttributeCreate -> ('email', 'emailAddress', 'pkcs9email')
SchemaAttributeCreate -> ('textEncodedORAddress',)
SchemaAttributeUnsupported -> ('otherMailbox',)
SchemaAttributeCreate -> ('aRecord',)
SchemaAttributeCreate -> ('mDRecord',)
SchemaAttributeCreate -> ('mXRecord',)
SchemaAttributeCreate -> ('nSRecord',)
SchemaAttributeCreate -> ('sOARecord',)
SchemaAttributeCreate -> ('cNAMERecord',)
SchemaAttributeCreate -> ('janetMailbox',)
SchemaAttributeCreate -> ('mailPreferenceOption',)
SchemaAttributeUnsupported -> ('dSAQuality',)
SchemaAttributeUnsupported -> ('singleLevelQuality',)
SchemaAttributeUnsupported -> ('subtreeMinimumQuality',)
SchemaAttributeUnsupported -> ('subtreeMaximumQuality',)
SchemaAttributeCreate -> ('personalSignature',)
SchemaAttributeCreate -> ('suseDefaultBase',)
SchemaAttributeCreate -> ('suseNextUniqueId',)
SchemaAttributeCreate -> ('suseMinUniqueId',)
SchemaAttributeCreate -> ('suseMaxUniqueId',)
SchemaAttributeCreate -> ('suseDefaultTemplate',)
SchemaAttributeCreate -> ('suseSearchFilter',)
SchemaAttributeCreate -> ('suseDefaultValue',)
SchemaAttributeCreate -> ('suseNamingAttribute',)
SchemaAttributeCreate -> ('suseSecondaryGroup',)
SchemaAttributeCreate -> ('suseMinPasswordLength',)
SchemaAttributeCreate -> ('suseMaxPasswordLength',)
SchemaAttributeCreate -> ('susePasswordHash',)
SchemaAttributeCreate -> ('suseSkelDir',)
SchemaAttributeCreate -> ('susePlugin',)
SchemaAttributeCreate -> ('suseMapAttribute',)
SchemaAttributeCreate -> ('suseImapServer',)
SchemaAttributeCreate -> ('suseImapAdmin',)
SchemaAttributeCreate -> ('suseImapDefaultQuota',)
SchemaAttributeCreate -> ('suseImapUseSsl',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.4 ('pilotPerson', 'newPilotPerson') may -> ('userid', 'textEncodedORAddress', 'rfc822Mailbox', 'favouriteDrink', 'roomNumber', 'userClass', 'homeTelephoneNumber', 'homePostalAddress', 'secretary', 'personalTitle', 'preferredDeliveryMethod', 'businessCategory', 'janetMailbox', 'otherMailbox', 'mobileTelephoneNumber', 'pagerTelephoneNumber', 'organizationalStatus', 'mailPreferenceOption', 'personalSignature') must -> () sup -> ('person',)
SchemaClassCreate -> 0.9.2342.19200300.100.4.15 ('dNSDomain',) may -> ('ARecord', 'MDRecord', 'MXRecord', 'NSRecord', 'SOARecord', 'CNAMERecord') must -> () sup -> ('domain',)
SchemaClassCreate -> 0.9.2342.19200300.100.4.20 ('pilotOrganization',) may -> ('buildingName',) must -> () sup -> ('organization', 'organizationalUnit')
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.21 ('pilotDSA',) may -> ('dSAQuality',) must -> () sup -> ('dsa',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.22 ('qualityLabelledData',) may -> ('subtreeMinimumQuality', 'subtreeMaximumQuality') must -> ('dsaQuality',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:2 ('suseModuleConfiguration',) may -> ('suseDefaultBase',) must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:3 ('suseUserConfiguration',) may -> ('suseMinPasswordLength', 'suseMaxPasswordLength', 'susePasswordHash', 'suseSkelDir', 'suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:4 ('suseObjectTemplate',) may -> ('susePlugin', 'suseDefaultValue', 'suseNamingAttribute') must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:5 ('suseUserTemplate',) may -> ('suseSecondaryGroup',) must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:6 ('suseGroupTemplate',) may -> () must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:7 ('suseGroupConfiguration',) may -> ('suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:8 ('suseCaConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:9 ('suseDnsConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:10 ('suseDhcpConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:11 ('suseMailConfiguration',) may -> () must -> ('suseImapServer', 'suseImapAdmin', 'suseImapDefaultQuota', 'suseImapUseSsl') sup -> ('suseModuleConfiguration',)
DatabaseReindex -> dc=example,dc=com
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=com
PluginMemberOfFixup -> dc=example,dc=com
PluginRefintEnable
PluginRefintAttributes -> member
PluginRefintAttributes -> memberOf
PluginRefintScope -> dc=example,dc=com
PluginUniqueConfigure -> dc=example,dc=com, mail 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=com, uid 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
DatabaseCreate -> dc=example,dc=net, 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
DatabaseIndexCreate -> objectClass eq, dc=example,dc=net
DatabaseReindex -> dc=example,dc=net
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=net
PluginMemberOfFixup -> dc=example,dc=net
PluginUniqueConfigure -> dc=example,dc=net, mail 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=net, uid 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
DatabaseLdifImport -> dc=example,dc=com /export/tests/suites/openldap_2_389/../../data/openldap_2_389/1/example_com.slapcat.ldif
DatabaseLdifImport -> dc=example,dc=net /export/tests/suites/openldap_2_389/../../data/openldap_2_389/1/example_net.slapcat.ldif
==== end migration plan ====
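The plan above is what lib389's migration tooling prints before executing. As a rough sketch of driving it from Python: only the module paths lib389.migrate.openldap.config and lib389.migrate.plan are confirmed by this log, so the class names and constructor arguments below (olConfig, Migration, the ldifs mapping) are assumptions to be checked against the lib389 source, not a verbatim API.

# Sketch only: OpenLDAP -> 389-ds migration via lib389's migrate modules.
# Names/signatures below are assumptions inferred from the logged module
# paths; verify against your lib389 version before use.
from lib389 import DirSrv
from lib389.migrate.openldap.config import olConfig   # assumed class name
from lib389.migrate.plan import Migration             # assumed class name

inst = DirSrv(verbose=True)
inst.local_simple_allocate(serverid='standalone1')    # hypothetical target
inst.open()

config = olConfig('/path/to/slapd.d')                 # parsed OpenLDAP config (hypothetical path)
migration = Migration(inst, config,                   # assumed arguments
                      ldifs={'dc=example,dc=com': '/path/to/example_com.slapcat.ldif'})
migration.execute_plan()   # would emit the "migration: N / M" and "post: N / M" lines seen here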
-------------------------------Captured log call--------------------------------
INFO  lib389.migrate.openldap.config:config.py:264 Examining OpenLDAP Configuration ...
INFO  lib389.migrate.openldap.config:config.py:285 Completed OpenLDAP Configuration Parsing.
INFO  lib389.migrate.plan:plan.py:656 migration: 1 / 71 complete ... [one line per step, through migration: 71 / 71 complete ...]
INFO  lib389.migrate.plan:plan.py:663 post: 1 / 71 complete ... [one line per step, through post: 51 / 71 complete ...]
ERROR  lib389:tasks.py:795 Error: index task index_all_10302020_215754 exited with -1
INFO  lib389.migrate.plan:plan.py:663 post: 52 / 71 complete ... [through post: 63 / 71 complete ...]
ERROR  lib389:tasks.py:795 Error: index task index_all_10302020_215756 exited with -1
INFO  lib389.migrate.plan:plan.py:663 post: 64 / 71 complete ... [through post: 71 / 71 complete ...]
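Both ERROR lines come from post-migration reindex tasks exiting with -1 while the test itself still passes. To re-run such a reindex by hand, the cn=tasks interface suffices; a minimal python-ldap sketch follows (connection details are placeholders; the task entry format follows the documented cn=index,cn=tasks,cn=config convention, so verify the attribute names against your server version).

import time
import ldap
import ldap.modlist

# Sketch: submit a reindex task and poll it until the server reports a result.
conn = ldap.initialize('ldap://localhost:389')            # placeholder URI
conn.simple_bind_s('cn=Directory Manager', 'password')    # placeholder credentials

task_dn = 'cn=manual_reindex,cn=index,cn=tasks,cn=config'
entry = {
    'objectClass': [b'top', b'extensibleObject'],
    'cn': [b'manual_reindex'],
    'nsInstance': [b'userRoot'],           # backend to reindex
    'nsIndexAttribute': [b'objectClass'],  # attribute(s) to rebuild
}
conn.add_s(task_dn, ldap.modlist.addModlist(entry))

# The server removes the task entry when finished; nsTaskExitCode holds the result.
while True:
    try:
        r = conn.search_s(task_dn, ldap.SCOPE_BASE, attrlist=['nsTaskExitCode'])
    except ldap.NO_SUCH_OBJECT:
        break
    code = r[0][1].get('nsTaskExitCode')
    if code is not None:
        print('reindex finished, exit code', code[0].decode())
        break
    time.sleep(1)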
Passed suites/openldap_2_389/migrate_test.py::test_migrate_openldap_slapdd_skip_elements 14.75
------------------------------Captured stdout call------------------------------
==== migration plan ====
SchemaAttributeUnsupported -> ('otherMailbox',)
SchemaAttributeUnsupported -> ('dSAQuality',)
SchemaAttributeUnsupported -> ('singleLevelQuality',)
SchemaAttributeUnsupported -> ('subtreeMinimumQuality',)
SchemaAttributeUnsupported -> ('subtreeMaximumQuality',)
SchemaAttributeCreate -> ('suseDefaultBase',)
SchemaAttributeCreate -> ('suseNextUniqueId',)
SchemaAttributeCreate -> ('suseMinUniqueId',)
SchemaAttributeCreate -> ('suseMaxUniqueId',)
SchemaAttributeCreate -> ('suseDefaultTemplate',)
SchemaAttributeCreate -> ('suseSearchFilter',)
SchemaAttributeCreate -> ('suseDefaultValue',)
SchemaAttributeCreate -> ('suseNamingAttribute',)
SchemaAttributeCreate -> ('suseSecondaryGroup',)
SchemaAttributeCreate -> ('suseMinPasswordLength',)
SchemaAttributeCreate -> ('suseMaxPasswordLength',)
SchemaAttributeCreate -> ('susePasswordHash',)
SchemaAttributeCreate -> ('suseSkelDir',)
SchemaAttributeCreate -> ('susePlugin',)
SchemaAttributeCreate -> ('suseMapAttribute',)
SchemaAttributeCreate -> ('suseImapServer',)
SchemaAttributeCreate -> ('suseImapAdmin',)
SchemaAttributeCreate -> ('suseImapDefaultQuota',)
SchemaAttributeCreate -> ('suseImapUseSsl',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.4 ('pilotPerson', 'newPilotPerson') may -> ('userid', 'textEncodedORAddress', 'rfc822Mailbox', 'favouriteDrink', 'roomNumber', 'userClass', 'homeTelephoneNumber', 'homePostalAddress', 'secretary', 'personalTitle', 'preferredDeliveryMethod', 'businessCategory', 'janetMailbox', 'otherMailbox', 'mobileTelephoneNumber', 'pagerTelephoneNumber', 'organizationalStatus', 'mailPreferenceOption', 'personalSignature') must -> () sup -> ('person',)
SchemaClassInconsistent -> ( 0.9.2342.19200300.100.4.20 NAME 'pilotOrganization' SUP organization STRUCTURAL MAY buildingName X-ORIGIN 'user defined' ) to 0.9.2342.19200300.100.4.20 ('pilotOrganization',) may -> ('buildingName',) must -> () sup -> ('organization', 'organizationalUnit')
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.21 ('pilotDSA',) may -> ('dSAQuality',) must -> () sup -> ('dsa',)
SchemaClassUnsupported -> 0.9.2342.19200300.100.4.22 ('qualityLabelledData',) may -> ('subtreeMinimumQuality', 'subtreeMaximumQuality') must -> ('dsaQuality',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:2 ('suseModuleConfiguration',) may -> ('suseDefaultBase',) must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:3 ('suseUserConfiguration',) may -> ('suseMinPasswordLength', 'suseMaxPasswordLength', 'susePasswordHash', 'suseSkelDir', 'suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:4 ('suseObjectTemplate',) may -> ('susePlugin', 'suseDefaultValue', 'suseNamingAttribute') must -> ('cn',) sup -> ('top',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:5 ('suseUserTemplate',) may -> ('suseSecondaryGroup',) must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:6 ('suseGroupTemplate',) may -> () must -> ('cn',) sup -> ('suseObjectTemplate',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:7 ('suseGroupConfiguration',) may -> ('suseNextUniqueId', 'suseMinUniqueId', 'suseMaxUniqueId', 'suseDefaultTemplate', 'suseSearchFilter', 'suseMapAttribute') must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:8 ('suseCaConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:9 ('suseDnsConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:10 ('suseDhcpConfiguration',) may -> () must -> () sup -> ('suseModuleConfiguration',)
SchemaClassCreate -> SUSE.YaST.ModuleConfig.OC:11 ('suseMailConfiguration',) may -> () must -> ('suseImapServer', 'suseImapAdmin', 'suseImapDefaultQuota', 'suseImapUseSsl') sup -> ('suseModuleConfiguration',)
DatabaseReindex -> dc=example,dc=com
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=com
PluginMemberOfFixup -> dc=example,dc=com
PluginRefintEnable
PluginRefintAttributes -> member
PluginRefintAttributes -> memberOf
PluginRefintScope -> dc=example,dc=com
PluginUniqueConfigure -> dc=example,dc=com, mail 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=com, uid 401a528e-eaf5-1039-8667-dbfbf2f5e6dd
DatabaseReindex -> dc=example,dc=net
PluginMemberOfEnable
PluginMemberOfScope -> dc=example,dc=net
PluginMemberOfFixup -> dc=example,dc=net
PluginUniqueConfigure -> dc=example,dc=net, mail 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
PluginUniqueConfigure -> dc=example,dc=net, uid 401a7084-eaf5-1039-866c-dbfbf2f5e6dd
DatabaseLdifImport -> dc=example,dc=com /export/tests/suites/openldap_2_389/../../data/openldap_2_389/1/example_com.slapcat.ldif
==== end migration plan ====
-------------------------------Captured log call--------------------------------
INFO  lib389.migrate.openldap.config:config.py:264 Examining OpenLDAP Configuration ...
INFO  lib389.migrate.openldap.config:config.py:285 Completed OpenLDAP Configuration Parsing.
INFO  lib389.migrate.plan:plan.py:656 migration: 1 / 55 complete ... [one line per step, through migration: 55 / 55 complete ...]
INFO  lib389.migrate.plan:plan.py:663 post: 1 / 55 complete ... [one line per step, through post: 38 / 55 complete ...]
ERROR  lib389:tasks.py:795 Error: index task index_all_10302020_215809 exited with -1
INFO  lib389.migrate.plan:plan.py:663 post: 39 / 55 complete ... [through post: 48 / 55 complete ...]
ERROR  lib389:tasks.py:795 Error: index task index_all_10302020_215811 exited with -1
INFO  lib389.migrate.plan:plan.py:663 post: 49 / 55 complete ... [through post: 55 / 55 complete ...]
Passed suites/paged_results/paged_results_test.py::test_search_success[6-5] 0.49
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:56 Adding user simplepaged_test
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 5 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:247 Set user bind simplepaged_test INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 6; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d0505f10>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:253 5 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 5 users
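Each of these paged searches is the standard RFC 2696 loop that the logged SimplePagedResultsControl implements: send the control, read one page, feed the returned cookie back until it comes back empty. A minimal, self-contained sketch with python-ldap (URI, bind DN and password are placeholders):

import ldap
from ldap.controls.libldap import SimplePagedResultsControl

conn = ldap.initialize('ldap://localhost:38901')   # placeholder URI
conn.simple_bind_s('uid=simplepaged_test,ou=People,dc=example,dc=com', 'password')

page_size = 5
ctrl = SimplePagedResultsControl(True, size=page_size, cookie='')
entries = []
while True:
    msgid = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                            '(uid=test*)', ['sn'], serverctrls=[ctrl])
    _, rdata, _, serverctrls = conn.result3(msgid)
    entries.extend(rdata)
    # The server returns its paging state in the response control's cookie;
    # an empty cookie means the result set is exhausted.
    pctrls = [c for c in serverctrls
              if c.controlType == SimplePagedResultsControl.controlType]
    if not pctrls or not pctrls[0].cookie:
        break
    ctrl.cookie = pctrls[0].cookie

print(len(entries), 'results')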
Passed suites/paged_results/paged_results_test.py::test_search_success[5-5] 0.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 5 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:247 Set user bind simplepaged_test INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d01c6ca0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:253 5 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 5 users
Passed suites/paged_results/paged_results_test.py::test_search_success[5-25] 1.32
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 25 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:247 Set user bind simplepaged_test INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d07ac550>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:253 25 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 25 users
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[50-200-cn=config,cn=ldbm database,cn=plugins,cn=config-nsslapd-idlistscanlimit-100-UNWILLING_TO_PERFORM] 9.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 200 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 100. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 200 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'100'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
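The test_search_limits_fail cases all follow the same pattern visible in this log: record the current limit, lower it, run the paged search expecting the mapped error (here UNWILLING_TO_PERFORM), then restore the old value. A sketch of that pattern with python-ldap, using the exact DN and attribute from the log; the connection details are placeholders:

import ldap

LDBM_DN = 'cn=config,cn=ldbm database,cn=plugins,cn=config'
ATTR = 'nsslapd-idlistscanlimit'

conn = ldap.initialize('ldap://localhost:38901')         # placeholder URI
conn.simple_bind_s('cn=Directory Manager', 'password')   # placeholder credentials

# Save the current value, then lower the limit for the test.
old = conn.search_s(LDBM_DN, ldap.SCOPE_BASE, attrlist=[ATTR])[0][1][ATTR][0]
conn.modify_s(LDBM_DN, [(ldap.MOD_REPLACE, ATTR, b'100')])
try:
    pass  # run the paged search that should now raise ldap.UNWILLING_TO_PERFORM
finally:
    # Always restore, exactly as the test's fixture does.
    conn.modify_s(LDBM_DN, [(ldap.MOD_REPLACE, ATTR, old)])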
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[5-15-cn=config-nsslapd-timelimit-20-UNAVAILABLE_CRITICAL_EXTENSION] 30.85
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 15 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-timelimit to 20. Previous value - b'3600'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 15 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-timelimit to b'3600'. Previous value - b'20'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[21-50-cn=config-nsslapd-sizelimit-20-SIZELIMIT_EXCEEDED] 2.68
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to 20. Previous value - b'2000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to b'2000'. Previous value - b'20'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[21-50-cn=config-nsslapd-pagedsizelimit-5-SIZELIMIT_EXCEEDED] 2.97
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_limits_fail[5-50-cn=config,cn=ldbm database,cn=plugins,cn=config-nsslapd-lookthroughlimit-20-ADMINLIMIT_EXCEEDED] 2.68
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 20. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:299 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:302 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:308 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:320 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'20'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_sort_success 2.49
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 50 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:378 Initiate ldapsearch with created control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:379 Collect data with sorting INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3c649d0>, <lib389._controls.SSSRequestControl object at 0x7f61c3c64d60>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 8 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 9 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:384 Substring numbers from user DNs INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:388 Assert that list is sorted INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 50 users
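Here the paged control is stacked with a server-side sort control, so entries arrive ordered across pages. A sketch with python-ldap; note the test itself uses lib389's SSSRequestControl wrapper, and the ldap.controls.sss import below is assumed to provide the equivalent control in your python-ldap build:

import ldap
from ldap.controls.libldap import SimplePagedResultsControl
from ldap.controls.sss import SSSRequestControl   # assumed module location

conn = ldap.initialize('ldap://localhost:38901')   # placeholder URI
conn.simple_bind_s('cn=Directory Manager', 'password')

paged = SimplePagedResultsControl(True, size=5, cookie='')
sss = SSSRequestControl(criticality=True, ordering_rules=['sn'])
msgid = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE, '(uid=test*)',
                        ['sn'], serverctrls=[paged, sss])
_, rdata, _, _ = conn.result3(msgid)
# Entries within (and across) pages now arrive sorted by sn, which is
# what the test asserts after collecting all pages.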
Passed suites/paged_results/paged_results_test.py::test_search_abandon 5.59
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:419 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:422 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:426 Initiate a search with a paged results control INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:429 Abandon the search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:432 Expect an ldap.TIMEOUT exception, while trying to get the search results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users
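The abandon test relies on the fact that an abandoned operation never produces a result, so a bounded read ends in ldap.TIMEOUT instead of data. A sketch with python-ldap (placeholders as before):

import ldap
from ldap.controls.libldap import SimplePagedResultsControl

conn = ldap.initialize('ldap://localhost:38901')   # placeholder URI
conn.simple_bind_s('uid=simplepaged_test,ou=People,dc=example,dc=com', 'password')

ctrl = SimplePagedResultsControl(True, size=2, cookie='')
msgid = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                        '(uid=test*)', ['sn'], serverctrls=[ctrl])
conn.abandon_ext(msgid)   # ask the server to drop the in-flight search

try:
    # No result will arrive for an abandoned msgid, so this times out.
    conn.result3(msgid, timeout=5)
except ldap.TIMEOUT:
    print('search abandoned as expected')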
Passed suites/paged_results/paged_results_test.py::test_search_with_timelimit 34.94
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:468 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:471 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:476 Iteration 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:483 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:483 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:499 Done with this search - sleeping 10 seconds INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:476 Iteration 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:483 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:483 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:499 Done with this search - sleeping 10 seconds INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:476 Iteration 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:483 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:483 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:499 Done with this search - sleeping 10 seconds INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_dns_ip_aci[fqdn] 5.43
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:547 Back up current suffix ACI
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:550 Add test ACI
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:556 Set user bind
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:559 Create simple paged results control instance
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:563 Initiate three searches with a paged results control
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:565 1 search
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d0a790a0>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 [one line per page, through Getting page 19]
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:568 100 results
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:565 2 search
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d0a790a0>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 [one line per page, through Getting page 19]
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:568 100 results
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:565 3 search
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d0a790a0>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 [one line per page, through Getting page 19]
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:568 100 results
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:570 If we are here, then no error has happened. We are good.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:573 Restore ACI
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_dns_ip_aci[ip] 5.65
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:547 Back up current suffix ACI
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:550 Add test ACI
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:556 Set user bind
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:559 Create simple paged results control instance
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:563 Initiate three searches with a paged results control
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:565 1 search
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d17360a0>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 [one line per page, through Getting page 19]
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:568 100 results
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:565 2 search
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d17360a0>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 [one line per page, through Getting page 19]
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:568 100 results
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:565 3 search
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 5; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d17360a0>].
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 [one line per page, through Getting page 19]
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:568 100 results
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:570 If we are here, then no error has happened. We are good.
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:573 Restore ACI
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_multiple_paging 5.95
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:606 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:609 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:614 Iteration 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:614 Iteration 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:614 Iteration 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_invalid_cookie[1000] 5.46
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:661 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:664 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:672 Put an invalid cookie (1000) into the control. TypeError is expected INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
Passed suites/paged_results/paged_results_test.py::test_search_invalid_cookie[-1] 6.39
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 100 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:661 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:664 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:672 Put an invalid cookie (-1) into the control. TypeError is expected INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 100 users
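Editor's note: the invalid-cookie cases tamper with the request control directly; a non-string cookie fails in the client-side BER encoder before anything reaches the server, which is why a TypeError rather than an LDAP error is expected. A sketch, reusing the hypothetical conn from the previous note:

req_ctrl = SimplePagedResultsControl(True, size=10, cookie='')
req_ctrl.cookie = 1000   # invalid: the cookie must be a (byte) string
try:
    conn.search_ext_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                      '(uid=test*)', serverctrls=[req_ctrl])
except TypeError:
    print('TypeError raised while encoding the control, as expected')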
Passed suites/paged_results/paged_results_test.py::test_search_abandon_with_zero_size 0.73
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:705 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:708 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users
Passed suites/paged_results/paged_results_test.py::test_search_pagedsizelimit_success 0.66
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 20. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:754 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c38ad2b0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:762 10 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'20'. Modified suffix - cn=config.
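Editor's note: the "Set ... Previous value ..." bookkeeping above is the test toggling a cn=config attribute around the search and restoring it afterwards. Roughly, with a connection bound as Directory Manager (hypothetical, as before):

import ldap

conn.modify_s('cn=config',
              [(ldap.MOD_REPLACE, 'nsslapd-pagedsizelimit', b'20')])
# ... run the paged search as the unprivileged test user ...
conn.modify_s('cn=config',
              [(ldap.MOD_REPLACE, 'nsslapd-pagedsizelimit', b'0')])  # restore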
Passed suites/paged_results/paged_results_test.py::test_search_nspagedsizelimit[5-15-PASS] 0.56
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to 15. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:817 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:828 Expect to pass INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3724760>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:830 10 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to None. Previous value - b'15'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
Passed suites/paged_results/paged_results_test.py::test_search_nspagedsizelimit[15-5-SIZELIMIT_EXCEEDED] 0.61
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 15. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to 5. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:817 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:824 Expect to fail with SIZELIMIT_EXCEEDED INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3da3160>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'15'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedSizeLimit to None. Previous value - b'5'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
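Editor's note: the two parametrizations contrast the global nsslapd-pagedsizelimit with the per-entry nsPagedSizeLimit; the limit attached to the bound user wins, so 5/15 passes a 10-entry page while 15/5 fails with SIZELIMIT_EXCEEDED. A sketch of the failing combination (same hypothetical admin connection):

user_dn = 'uid=simplepaged_test,ou=People,dc=example,dc=com'
conn.modify_s('cn=config',
              [(ldap.MOD_REPLACE, 'nsslapd-pagedsizelimit', b'15')])
# The per-entry limit overrides the global one for this bind DN,
# so a page of 10 entries now exceeds it.
conn.modify_s(user_dn,
              [(ldap.MOD_REPLACE, 'nsPagedSizeLimit', b'5')])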
Passed suites/paged_results/paged_results_test.py::test_search_paged_limits[conf_attr_values0-ADMINLIMIT_EXCEEDED] 5.51
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to 5000. Previous value - b'2000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to 5000. Previous value - b'0'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 100. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 100. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:889 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:896 Expect to fail with ADMINLIMIT_EXCEEDED INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3785e20>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-sizelimit to b'2000'. Previous value - b'5000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-pagedsizelimit to b'0'. Previous value - b'5000'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'100'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'100'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config.
Passed suites/paged_results/paged_results_test.py::test_search_paged_user_limits[conf_attr_values0-ADMINLIMIT_EXCEEDED] 5.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to 1000. Previous value - b'5000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to 1000. Previous value - b'4000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to 100. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to 100. Previous value - None. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:963 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:970 Expect to fail with ADMINLIMIT_EXCEEDED INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 10; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d0a79a30>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 6 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 7 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 101 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-lookthroughlimit to b'5000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-idlistscanlimit to b'4000'. Previous value - b'1000'. Modified suffix - cn=config,cn=ldbm database,cn=plugins,cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedIDListScanLimit to None. Previous value - b'100'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsPagedLookthroughLimit to None. Previous value - b'100'. Modified suffix - uid=simplepaged_test,ou=People,dc=example,dc=com.
Passed suites/paged_results/paged_results_test.py::test_ger_basic 1.55
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c3b825b0>, <ldap.controls.simple.GetEffectiveRightsControl object at 0x7f61c3725160>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1014 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1016 Check for attributeLevelRights INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1019 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users
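Editor's note: test_ger_basic stacks a second control on the same search, Get Effective Rights, which makes every returned entry carry an attributeLevelRights attribute. With python-ldap the two request controls are simply passed together (the authzId value below is a hypothetical stand-in):

import ldap
from ldap.controls.libldap import SimplePagedResultsControl
from ldap.controls.simple import GetEffectiveRightsControl

spr = SimplePagedResultsControl(True, size=4, cookie='')
ger = GetEffectiveRightsControl(
    True, authzId=b'dn: uid=simplepaged_test,ou=People,dc=example,dc=com')
msgid = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                        '(uid=test*)', ['sn'], serverctrls=[spr, ger])
# Each entry in the result now includes attributeLevelRights.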
Passed suites/paged_results/paged_results_test.py::test_multi_suffix_search 7.70
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:87 Adding suffix: o=test_parent and backend: parent_base INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:96 Adding ACI to allow our test user to search INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:108 Adding suffix: ou=child,o=test_parent and backend: child_base
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1050 Clear the access log INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: o=test_parent; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d64d1700>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 5 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1061 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1064 Restart the server to flush the logs INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1070 Assert that last pr_cookie == -1 and others pr_cookie == 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1075 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 10 users
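Editor's note: the multi-suffix assertion ("last pr_cookie == -1 and others pr_cookie == 0") is checked against the access log after the restart flushes it. A rough sketch of that check; the log path and the exact pr_cookie= token format are assumptions here, not verified against the suite:

import re

cookies = []
with open('/var/log/dirsrv/slapd-standalone1/access') as f:  # hypothetical path
    for line in f:
        m = re.search(r'pr_cookie=(-?\d+)', line)
        if m:
            cookies.append(int(m.group(1)))
# Intermediate pages should log pr_cookie=0; only the final page logs -1.
assert cookies[-1] == -1 and all(c == 0 for c in cookies[:-1])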
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_success[None] 1.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1108 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c37246a0>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1115 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1118 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_success[-1] 1.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to -1. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1108 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61d1756580>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1115 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1118 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'-1'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_success[1000] 1.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to 1000. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1108 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:191 Running simple paged result search with - search suffix: dc=example,dc=com; filter: (uid=test*); attr list ['dn', 'sn']; page_size = 4; controls: [<ldap.controls.libldap.SimplePagedResultsControl object at 0x7f61c38aaf10>]. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 0 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 1 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 2 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 3 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:199 Getting page 4 INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1115 20 results INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1118 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'1000'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_failure[0] 1.28
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to 0. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1153 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1156 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1171 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'0'. Modified suffix - cn=config.
Passed suites/paged_results/paged_results_test.py::test_maxsimplepaged_per_conn_failure[1] 1.36
-------------------------------Captured log call--------------------------------
INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:133 Adding 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to 1. Previous value - b'-1'. Modified suffix - cn=config. INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1153 Set user bind INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1156 Create simple paged results control instance INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:1171 Remove added users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:155 Deleting 20 users INFO  tests.suites.paged_results.paged_results_test:paged_results_test.py:169 Set nsslapd-maxsimplepaged-per-conn to b'-1'. Previous value - b'1'. Modified suffix - cn=config.
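Editor's note: for the failure cases, an nsslapd-maxsimplepaged-per-conn of 0 or 1 means a second simultaneous paged search on one connection must be refused. A sketch under stated assumptions (user_conn is a hypothetical connection bound as the test user, and the broad ldap.LDAPError catch stands in for whichever specific error the server returns):

spr1 = SimplePagedResultsControl(True, size=4, cookie='')
spr2 = SimplePagedResultsControl(True, size=4, cookie='')
# The first paged search is left open (its cookie is not exhausted)...
msgid = user_conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                             '(uid=test*)', serverctrls=[spr1])
try:
    # ...so a second one on the same connection exceeds the limit.
    user_conn.search_ext_s('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                           '(uid=test*)', serverctrls=[spr2])
except ldap.LDAPError as e:
    print('second paged search refused, as expected:', e)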
Passed suites/password/password_policy_test.py::test_password_change_section 1.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/password_policy_test.py::test_password_syntax_section 3.12
No log output captured.
Passed suites/password/password_policy_test.py::test_password_history_section 8.88
No log output captured.
Passed suites/password/password_policy_test.py::test_password_minimum_age_section 11.53
No log output captured.
Passed suites/password/password_policy_test.py::test_account_lockout_and_lockout_duration_section 6.47
No log output captured.
Passed suites/password/password_policy_test.py::test_grace_limit_section 17.89
No log output captured.
Passed suites/password/password_policy_test.py::test_additional_corner_cases 1.20
No log output captured.
Passed suites/password/password_test.py::test_password_delete_specific_password 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.password_test:password_test.py:44 Running test_password_delete_specific_password... INFO  tests.suites.password.password_test:password_test.py:65 test_password_delete_specific_password: PASSED
Passed suites/password/pbkdf2_upgrade_plugin_test.py::test_pbkdf2_upgrade 9.40
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_bypass 0.19
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:46 test_pwdAdmin_init: Creating Password Administrator entries... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:85 test_pwdAdmin_init: Configuring password policy... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:100 Add aci to allow password admin to add/update entries... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:115 test_pwdAdmin_init: Bind as the Password Administrator (before activating)... INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:128 test_pwdAdmin_init: Attempt to add entries with invalid passwords, these adds should fail...
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdAdmin_test:pwdAdmin_test.py:167 test_pwdAdmin: Activate the Password Administrator...
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_no_admin 0.09
No log output captured.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_modify 0.17
No log output captured.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_group 0.15
No log output captured.
Passed suites/password/pwdAdmin_test.py::test_pwdAdmin_config_validation 0.02
No log output captured.
Passed suites/password/pwdModify_test.py::test_pwd_modify_with_different_operation 9.37
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:101 Attempt for Password change for an entry that does not exist INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:103 Attempt for Password change for an entry that exists INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:105 Attempt for Password change to old for an entry that exists INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:107 Attempt for Password Change with Binddn as testuser but with wrong old password INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:111 Attempt for Password Change with Binddn as testuser INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:113 Attempt for Password Change without giving newpassword INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:116 Change password to NEW_PASSWD i.e. newpassword INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:119 Check binding with old/new password INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:124 Change password back to OLD_PASSWD i.e. password INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:127 Checking password change Operation using a Non-Secure connection INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:131 Testuser attempts to change password for testuser2 (userPassword attribute is Set) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:147 Directory Manager attempts to change password for testuser2 (userPassword attribute is Set) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:150 Changing userPassword attribute to Undefined for testuser2 INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:152 Testuser attempts to change password for testuser2 (userPassword attribute is Undefined) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:156 Directory Manager attempts to change password for testuser2 (userPassword attribute is Undefined) INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:159 Create a password syntax policy. Attempt to change to a password that violates that policy INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:163 Reset password syntax policy INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:165 userPassword mod with control results in ber decode error INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:169 Resetting the testuser's password
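Editor's note: most of the steps above go through the RFC 3062 Password Modify extended operation, which python-ldap exposes as passwd_s(dn, oldpw, newpw). A minimal sketch with hypothetical DN and credentials:

import ldap

TEST_USER_DN = 'uid=testuser,ou=People,dc=example,dc=com'  # hypothetical

user_conn = ldap.initialize('ldap://localhost:389')        # hypothetical URI
user_conn.simple_bind_s(TEST_USER_DN, 'password')

# Change the password, supplying the old one for verification...
user_conn.passwd_s(TEST_USER_DN, 'password', 'newpassword')

# ...then confirm that only the new password binds.
user_conn.simple_bind_s(TEST_USER_DN, 'newpassword')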
Passed suites/password/pwdModify_test.py::test_pwd_modify_with_password_policy 0.13
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:44 Change the pwd storage type to clear and change the password once to refresh it (for the rest of the tests)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:192 Change the password and check that a new entry has been added to the history INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:196 Try changing password to one stored in history. Should fail INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:199 Change the password several times in a row, and try binding after each change
Passed suites/password/pwdModify_test.py::test_pwd_modify_with_subsuffix 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:228 Add a new SubSuffix INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:235 Add the container & create password policies INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:243 Add two new users under the SubEntry INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:265 Changing password of uid=test_user0,ou=TestPeople_bug834047,dc=example,dc=com to newpassword INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:270 Try to delete password - case when password is specified INFO  tests.suites.password.pwdModify_test:pwdModify_test.py:274 Try to delete password - case when password is not specified
Passed suites/password/pwdPolicy_attribute_test.py::test_pwd_reset 1.76
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:39 Adding test user {}
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[on-off-UNWILLING_TO_PERFORM] 2.33
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:66 Create password policy for subtree ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:69 Create password policy for user uid=simplepaged_test,ou=people,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "on" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "off" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
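Editor's note: the setup above creates a subtree policy on ou=people and a user policy on the test entry; the parametrized matrix then shows the user-level passwordChange always winning over the subtree value. A sketch using lib389's PwPolicyManager; the exact property set is an assumption of the relevant knob, not the suite's own fixture:

from lib389.pwpolicy import PwPolicyManager

pwp = PwPolicyManager(standalone)  # 'standalone' is a lib389 DirSrv instance
pwp.create_subtree_policy('ou=people,dc=example,dc=com',
                          {'passwordChange': 'on'})
# The user-level policy takes precedence for this entry, so
# passwordChange=off here blocks the self-modify (the matrix expects
# UNWILLING_TO_PERFORM) regardless of the subtree setting.
pwp.create_user_policy('uid=simplepaged_test,ou=people,dc=example,dc=com',
                       {'passwordChange': 'off'})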
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[off-off-UNWILLING_TO_PERFORM] 2.09
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "off" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "off" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[off-on-False] 2.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "off" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "on" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
Passed suites/password/pwdPolicy_attribute_test.py::test_change_pwd[on-on-False] 2.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:154 Set passwordChange to "on" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:161 Set passwordChange to "on" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:170 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:182 Bind as DM
Passed suites/password/pwdPolicy_attribute_test.py::test_pwd_min_age 14.47
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:219 Set passwordminage to "10" - ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:224 Set passwordminage to "10" - uid=simplepaged_test,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:228 Set passwordminage to "10" - cn=config INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:233 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:239 Bind as user and modify userPassword straight away after previous change INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:244 Wait 12 seconds INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:248 Bind as user and modify userPassword INFO  tests.suites.password.pwdPolicy_attribute_test:pwdPolicy_attribute_test.py:256 Bind as DM
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_must_change 2.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:109 Configure password policy with passwordMustChange set to "on" INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:120 Reset userpassword as Directory Manager INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:125 Bind should return ctrl with error code 2 (changeAfterReset)
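Editor's note: "error code 2 (changeAfterReset)" refers to the password policy response control returned on the bind that follows an administrative reset. With python-ldap the control can be requested and inspected like this (DN and password hypothetical):

import ldap
from ldap.controls.ppolicy import PasswordPolicyControl

conn = ldap.initialize('ldap://localhost:389')   # hypothetical URI
msgid = conn.simple_bind('uid=testuser,ou=People,dc=example,dc=com',
                         'resetpassword',
                         serverctrls=[PasswordPolicyControl()])
rtype, rdata, rmsgid, ctrls = conn.result3(msgid)
for c in ctrls:
    if c.controlType == PasswordPolicyControl.controlType:
        # error == 2 maps to changeAfterReset: the user must change
        # the password before doing anything else.
        print(c.error, c.timeBeforeExpiration, c.graceAuthNsRemaining)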
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_expired_grace_limit 6.29
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:159 Configure password policy with grace limit set to 2 INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:164 Change password and wait for it to expire INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:168 Bind and use up one grace login (only one left) INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:178 Use up last grace login, should get control INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:184 No grace login available, bind should fail, and no control should be returned
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_expiring_with_warning 5.31
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:206 Configure password policy INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:211 Change password and get controls INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:222 Warning has been sent, try the bind again, and recheck the expiring time
Passed suites/password/pwdPolicy_controls_test.py::test_pwd_expiring_with_no_warning 6.30
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:250 Configure password policy INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:255 When the warning is less than the max age, we never send expiring control response INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:262 Turn on sending expiring control regardless of warning INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:274 Check expiring time again INFO  tests.suites.password.pwdPolicy_controls_test:pwdPolicy_controls_test.py:285 Turn off sending expiring control (restore the default setting)
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_no_restrictions[off-off] 1.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:39 Adding user uid=buser,ou=People,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:46 Adding an aci for the bind user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:71 Enable fine-grained policy INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is off
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:119 Set nsslapd-pwpolicy-inherit-global to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:120 Set passwordCheckSyntax to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:129 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:132 Make sure an entry added to ou=people has no password syntax restrictions.
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_no_restrictions[on-off] 1.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:119 Set nsslapd-pwpolicy-inherit-global to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:120 Set passwordCheckSyntax to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:129 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:132 Make sure an entry added to ou=people has no password syntax restrictions.
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_no_restrictions[off-on] 1.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:119 Set nsslapd-pwpolicy-inherit-global to off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:120 Set passwordCheckSyntax to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is off INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:129 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:132 Make sure an entry added to ou=people has no password syntax restrictions.
Passed suites/password/pwdPolicy_inherit_global_test.py::test_entry_has_restrictions 1.31
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:173 Set nsslapd-pwpolicy-inherit-global to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:174 Set passwordCheckSyntax to on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of nsslapd-pwpolicy-inherit-global is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:92 Default value of passwordCheckSyntax is on INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:187 Bind as test user INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:192 Try to add user with a short password (<9) INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:197 Try to add user with a long password (>9) INFO  tests.suites.password.pwdPolicy_inherit_global_test:pwdPolicy_inherit_global_test.py:201 Bind as DM user
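Editor's note: the matrix above only enforces syntax when both switches are on: nsslapd-pwpolicy-inherit-global pulls the global checks into the fine-grained policy, and passwordCheckSyntax enables the checks themselves. Roughly, with the hypothetical admin connection from earlier (the exact rejection error is an assumption):

import ldap

conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'nsslapd-pwpolicy-inherit-global', b'on'),
    (ldap.MOD_REPLACE, 'passwordCheckSyntax', b'on'),
])
# With both on, adding a user under ou=people with a short (<9 char)
# password is rejected (CONSTRAINT_VIOLATION), while a longer one passes.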
Passed suites/password/pwdPolicy_syntax_test.py::test_basic 6.23
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:34 Enable global password policy. Check for syntax.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMinLength: length too short INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMinDigits: does not contain minimum number of digits INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMinAlphas: does not contain minimum number of alphas INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxRepeats: too many repeating characters INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMinSpecials: does not contain minimum number of special characters INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMinLowers: does not contain minimum number of lowercase characters INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMinUppers: does not contain minimum number of uppercase characters INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordDictCheck: Password found in dictionary INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordPalindrome: Password is palindrome INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxSequence: Max monotonic sequence is not allowed INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxSeqSets: Max monotonic sequence is not allowed INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxClassChars: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxClassChars: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxClassChars: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordMaxClassChars: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry
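Editor's note: each rejection above exercises one cn=config syntax attribute. Setting a few of them and provoking a violation looks roughly like this, reusing the hypothetical conn, user_conn, and TEST_USER_DN from earlier notes:

import ldap

conn.modify_s('cn=config', [
    (ldap.MOD_REPLACE, 'passwordCheckSyntax', b'on'),
    (ldap.MOD_REPLACE, 'passwordMinLength', b'8'),
    (ldap.MOD_REPLACE, 'passwordMinDigits', b'1'),
    (ldap.MOD_REPLACE, 'passwordMaxRepeats', b'2'),
])
try:
    user_conn.passwd_s(TEST_USER_DN, 'oldpassword', 'short')
except ldap.CONSTRAINT_VIOLATION as e:
    # The diagnostic message names the violated attribute,
    # e.g. passwordMinLength: length too short.
    print('rejected:', e)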
Passed suites/password/pwdPolicy_syntax_test.py::test_config_set_few_user_attributes 7.15
-------------------------------Captured log call--------------------------------
INFO  lib389:pwdPolicy_syntax_test.py:315 Set passwordUserAttributes to "description loginShell" INFO  lib389:pwdPolicy_syntax_test.py:319 Verify passwordUserAttributes has the values INFO  lib389:pwdPolicy_syntax_test.py:323 Reset passwordUserAttributes INFO  lib389:pwdPolicy_syntax_test.py:326 Verify passwordUserAttributes enforced the policy INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordUserAttributes: Password found in user entry
Passed suites/password/pwdPolicy_syntax_test.py::test_config_set_few_bad_words 6.32
-------------------------------Captured log call--------------------------------
INFO  lib389:pwdPolicy_syntax_test.py:355 Set passwordBadWords to "fedora redhat" INFO  lib389:pwdPolicy_syntax_test.py:360 Verify passwordBadWords has the values INFO  lib389:pwdPolicy_syntax_test.py:364 Reset passwordBadWords INFO  lib389:pwdPolicy_syntax_test.py:367 Verify passwordBadWords enforced the policy INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class INFO  tests.suites.password.pwdPolicy_syntax_test:pwdPolicy_syntax_test.py:106 Invalid password correctly rejected by passwordBadWords: Too many consecutive characters from the same class
Passed suites/password/pwdPolicy_token_test.py::test_token_lengths 3.52
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:67 Testing password len 4 token (test) INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:72 Password correctly rejected: {'msgtype': 103, 'msgid': 12, 'result': 19, 'desc': 'Constraint violation', 'ctrls': [], 'info': 'invalid password syntax - password based off of user entry'} INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:67 Testing password len 6 token (test_u) INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:72 Password correctly rejected: {'msgtype': 103, 'msgid': 16, 'result': 19, 'desc': 'Constraint violation', 'ctrls': [], 'info': 'invalid password syntax - password based off of user entry'} INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:67 Testing password len 10 token (test_user1) INFO  tests.suites.password.pwdPolicy_token_test:pwdPolicy_token_test.py:72 Password correctly rejected: {'msgtype': 103, 'msgid': 20, 'result': 19, 'desc': 'Constraint violation', 'ctrls': [], 'info': 'invalid password syntax - password based off of user entry'}
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[ ] 0.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:205 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:209 An invalid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:213 Now check the value is unchanged INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:216 Invalid value was rejected correctly
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[junk123] 0.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:205 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:209 An invalid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to junk123 INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:213 Now check the value is unchanged INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:216 Invalid value junk123 was rejected correctly
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[on] 1.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:205 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:218 A valid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to on INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:221 Now check that the value has been changed INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:224 passwordSendExpiringTime is now set to on INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:226 Set passwordSendExpiringTime back to the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to off
Passed suites/password/pwdPolicy_warning_test.py::test_different_values[off] 1.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:205 Get the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:218 A valid value is being tested INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:221 Now check that the value has been changed INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:224 passwordSendExpiringTime is now set to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:226 Set passwordSendExpiringTime back to the default value INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to off
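The four parametrized runs above reduce to one loop: invalid values for passwordSendExpiringTime must be refused and leave the attribute untouched, while valid on/off values must stick. A minimal sketch; the exact exception class is left generic since only the rejection matters here:

import ldap
import pytest
from lib389.topologies import topology_st


def test_send_expiring_values_sketch(topology_st):
    inst = topology_st.standalone
    default = inst.config.get_attr_val_utf8('passwordSendExpiringTime')

    # Invalid values are rejected and the attribute stays unchanged.
    for junk in (' ', 'junk123'):
        with pytest.raises(ldap.LDAPError):
            inst.config.set('passwordSendExpiringTime', junk)
        assert inst.config.get_attr_val_utf8('passwordSendExpiringTime') == default

    # Valid values are accepted; restore the default afterwards.
    for value in ('on', 'off'):
        inst.config.set('passwordSendExpiringTime', value)
        assert inst.config.get_attr_val_utf8('passwordSendExpiringTime') == value
    inst.config.set('passwordSendExpiringTime', default)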
Passed suites/password/pwdPolicy_warning_test.py::test_expiry_time 0.04
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:46 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:53 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:257 Get the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:258 Binding with (uid=tuser,ou=people,dc=example,dc=com) and requesting the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:262 Check whether the time is returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:265 user's password will expire in 172800 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:268 Rebinding as DM
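The "password expiry warning time" the log keeps referring to travels in the password-policy response control on the bind. A minimal sketch of fetching it with python-ldap's ppolicy control; the user DN and password are placeholders:

import ldap
from ldap.controls.ppolicy import PasswordPolicyControl

USER_DN = 'uid=tuser,ou=people,dc=example,dc=com'   # placeholder, as in the log


def get_expiry_warning(ldapuri, userdn, password):
    # Bind with the password-policy request control attached, then pull
    # timeBeforeExpiration out of the decoded response controls (if any).
    conn = ldap.initialize(ldapuri)
    res = conn.simple_bind_s(userdn, password,
                             serverctrls=[PasswordPolicyControl()])
    ctrls = [c for c in res[3]
             if c.controlType == PasswordPolicyControl.controlType]
    return ctrls[0].timeBeforeExpiration if ctrls else None

With the policy configured as in this setup, the returned value is the 172800-second figure reported in the log.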
Passed suites/password/pwdPolicy_warning_test.py::test_password_warning[passwordSendExpiringTime-off] 0.54
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:46 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:53 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:300 Set configuration parameter INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordSendExpiringTime to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:303 Binding with (uid=tuser,ou=people,dc=example,dc=com) and requesting password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:307 Check the state of the control INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:309 Password Expiry warning time is not returned as passwordSendExpiringTime is set to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:315 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_password_warning[passwordWarning-3600] 0.54
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:46 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:53 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:300 Set configuration parameter INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:171 Setting passwordWarning to 3600 INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:303 Binding with (uid=tuser,ou=people,dc=example,dc=com) and requesting password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:307 Check the state of the control INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:312 (uid=tuser,ou=people,dc=example,dc=com) password will expire in 172799 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:315 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_with_different_password_states 0.10
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:46 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:53 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:348 Expire user's password by changing passwordExpirationTime timestamp INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:352 Old passwordExpirationTime: 20201102020559Z INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:355 New passwordExpirationTime: 20201002020559Z INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:358 Attempting to bind with user uid=tuser,ou=people,dc=example,dc=com and retrieve the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:362 Bind Failed, error: <ExceptionInfo INVALID_CREDENTIALS({'msgtype': 97, 'msgid': 73, 'result': 49, 'desc': 'Invalid credentials', 'ctrls': [('1.3.6.1.4.1.....8.5.1', 0, b'0\x84\x00\x00\x00\x03\x81\x01\x00'), ('2.16.840.1.113730.3.4.4', 0, b'0')], 'info': 'password expired!'}) tblen=10> INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:364 Rebinding as DM INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:367 Reverting user's passwordExpirationTime INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:371 Rebinding with uid=tuser,ou=people,dc=example,dc=com and retrieving the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:374 Check that the control is returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:377 user's password will expire in 172800 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:380 Rebinding as DM
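Expiring the account, as done above, is just a matter of rewriting passwordExpirationTime in GeneralizedTime syntax and later restoring the saved value. A minimal sketch, where user is an already-fetched lib389 UserAccount on a Directory Manager connection:

import time


def expire_password(user, seconds_ago=30 * 24 * 3600):
    """Move passwordExpirationTime into the past and return the old value."""
    # GeneralizedTime, e.g. 20201002020559Z, as shown in the log above.
    past = time.strftime('%Y%m%d%H%M%SZ',
                         time.gmtime(time.time() - seconds_ago))
    old = user.get_attr_val_utf8('passwordExpirationTime')
    user.replace('passwordExpirationTime', past)
    return old   # callers revert with user.replace(...) once done

Once the bind fails with err=49 / 'password expired!', reverting the attribute reactivates the account, which is what the second half of the log shows.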
Passed suites/password/pwdPolicy_warning_test.py::test_default_behavior 0.03
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:84 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:90 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:406 Binding with uid=tuser,ou=people,dc=example,dc=com and requesting the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:410 Check that no control is returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:413 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_when_maxage_and_warning_are_the_same 2.09
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:84 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:90 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:442 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:446 First change user's password to reset its password expiration time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:453 Binding with uid=tuser,ou=people,dc=example,dc=com and requesting the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:457 Check that control is returned even if passwordSendExpiringTime is set to off INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:461 user's password will expire in 86400 seconds INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:463 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_with_local_policy 0.03
-----------------------------Captured stdout setup------------------------------
Successfully created user password policy
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:46 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:53 Set the new values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:113 Add the user INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:141 Setting fine grained policy for user (uid=tuser,ou=people,dc=example,dc=com)
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:490 Attempting to get password expiry warning time for user uid=tuser,ou=people,dc=example,dc=com INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:158 Bind with the user and request the password expiry warning time INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:493 Check that the control is not returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:496 Password expiry warning time is not returned INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:498 Rebinding as DM
Passed suites/password/pwdPolicy_warning_test.py::test_search_shadowWarning_when_passwordWarning_is_lower 0.37
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:46 Get the default values INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:53 Set the new values
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:530 Bind as cn=Directory Manager INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:533 Creating test user INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:538 Setting passwordWarning to smaller value than 86400 INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:541 Bind as test user INFO  tests.suites.password.pwdPolicy_warning_test:pwdPolicy_warning_test.py:544 Check if attribute shadowWarning is present
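shadowWarning is a virtual attribute served for shadowAccount entries, derived from the password policy. A rough sketch of the check above, under the assumption that adding the shadowAccount objectclass to a test user is enough for the attribute to appear once passwordWarning is below the 86400-second default:

import ldap
from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts


def check_shadow_warning(inst):
    inst.config.set('passwordExp', 'on')
    inst.config.set('passwordWarning', '3600')    # below the 86400 default

    users = UserAccounts(inst, DEFAULT_SUFFIX)
    user = users.create_test_user()
    user.add('objectClass', 'shadowAccount')      # shadowWarning applies to these
    user.set('userPassword', 'Initial_pw1')

    # Bind as the user and read the computed attribute from our own entry.
    conn = user.bind('Initial_pw1')
    entry = conn.search_s(user.dn, ldap.SCOPE_BASE,
                          '(objectclass=*)', ['shadowWarning'])
    assert 'shadowWarning' in entry[0][1]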
Passed suites/password/pwdPolicy_warning_test.py::test_password_expire_works 1.67
No log output captured.
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CLEAR] 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CLEAR PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT-MD5] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT-MD5 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT-SHA256] 0.09
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT-SHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[CRYPT-SHA512] 0.09
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test CRYPT-SHA512 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[MD5] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test MD5 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA] 0.08
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA256] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA384] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA384 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SHA512] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SHA512 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SMD5] 0.09
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SMD5 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA256] 0.18
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA384] 0.11
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA384 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[SSHA512] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test SSHA512 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[PBKDF2_SHA256] 0.24
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test PBKDF2_SHA256 PASSED
Passed suites/password/pwd_algo_test.py::test_pwd_algo_test[DEFAULT] 0.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:138 Test DEFAULT PASSED
Passed suites/password/pwd_algo_test.py::test_pbkdf2_algo 2.77
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_algo_test:pwd_algo_test.py:170 Test PASSED
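All the parametrized runs above share one skeleton: select a scheme, store a password, check the stored value is actually transformed, and prove the clear credential still binds. A minimal sketch (user being any lib389 UserAccount; the scheme names are exactly the bracketed parameters above):

def check_scheme(inst, user, scheme, password='Secret123'):
    # Select the hashing scheme for subsequent userPassword writes.
    inst.config.set('passwordStorageScheme', scheme)
    user.set('userPassword', password)

    stored = user.get_attr_val_utf8('userPassword')
    if scheme == 'CLEAR':
        # CLEAR stores the value verbatim.
        assert stored == password
    else:
        # Hashed: the value must not be the clear text; the real suite also
        # inspects the {SCHEME} tag prefixed to the stored value.
        assert stored != password and stored.startswith('{')

    # Whatever the storage format, the clear credential must still bind.
    user.bind(password).unbind_s()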
Passed suites/password/pwd_lockout_bypass_test.py::test_lockout_bypass 0.44
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
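The lockout bypass test builds on the basic lockout mechanics: after passwordMaxFailure failed binds, the account answers err=19 even to the correct password. A minimal sketch of that foundation, with illustrative limits, assuming the same standalone fixture:

import ldap
import pytest
from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts


def lock_account_sketch(inst):
    # Lock after two failed binds.
    inst.config.set('passwordLockout', 'on')
    inst.config.set('passwordMaxFailure', '2')

    users = UserAccounts(inst, DEFAULT_SUFFIX)
    user = users.create_test_user()
    user.set('userPassword', 'Secret123')

    for _ in range(2):
        with pytest.raises(ldap.INVALID_CREDENTIALS):
            user.bind('wrong-password')

    # Once locked, even the correct password is refused (err=19).
    with pytest.raises(ldap.CONSTRAINT_VIOLATION):
        user.bind('Secret123')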
Passed suites/password/pwd_log_test.py::test_hide_unhashed_pwd 4.20
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwd_log_test:pwd_log_test.py:79 Test complete
Passed suites/password/pwp_gracel_test.py::test_password_gracelimit_section 13.79
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/pwp_history_test.py::test_history_is_not_overwritten 4.54
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:77 Configured password policy.
Passed suites/password/pwp_history_test.py::test_basic 7.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:152 Configured password policy. INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:170 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:201 Correct number of passwords found in history. INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:212 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:222 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:232 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:254 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:267 Configured passwordInHistory to 0. INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:283 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:299 Configured passwordInHistory to 2. INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:312 Password change correctly rejected INFO  tests.suites.password.pwp_history_test:pwp_history_test.py:326 Test suite PASSED.
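The repeated "Password change correctly rejected" entries above all follow one pattern: with passwordHistory on, a self-service change back to any of the last passwordInHistory passwords must fail. A minimal sketch with illustrative values:

import ldap
import pytest
from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts


def history_sketch(inst):
    # Remember the last two passwords and refuse their reuse.
    inst.config.set('passwordHistory', 'on')
    inst.config.set('passwordInHistory', '2')

    users = UserAccounts(inst, DEFAULT_SUFFIX)
    user = users.create_test_user()
    user.set('userPassword', 'pass1')

    conn = user.bind('pass1')
    conn.modify_s(user.dn, [(ldap.MOD_REPLACE, 'userPassword', b'pass2')])

    # Reusing a remembered password is a constraint violation (err=19).
    conn2 = user.bind('pass2')
    with pytest.raises(ldap.CONSTRAINT_VIOLATION):
        conn2.modify_s(user.dn, [(ldap.MOD_REPLACE, 'userPassword', b'pass1')])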
Passed suites/password/pwp_test.py::test_passwordchange_to_no 0.12
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/pwp_test.py::test_password_check_syntax 0.38
No log output captured.
Passed suites/password/pwp_test.py::test_too_big_password 0.41
No log output captured.
Passed suites/password/pwp_test.py::test_pwminage 3.13
No log output captured.
Passed suites/password/pwp_test.py::test_invalid_credentials 7.39
No log output captured.
Passed suites/password/pwp_test.py::test_expiration_date 1.28
No log output captured.
Passed suites/password/pwp_test.py::test_passwordlockout 2.50
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_local_password_policy 0.12
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/password/regression_of_bugs_test.py::test_passwordexpirationtime_attribute 3.05
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_admin_group_to_modify_password 1.31
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_password_max_failure_should_lockout_password 0.15
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_pwd_update_time_attribute 3.25
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_password_track_update_time 7.43
No log output captured.
Passed suites/password/regression_of_bugs_test.py::test_signal_11 0.05
No log output captured.
Passed suites/password/regression_test.py::test_pwp_local_unlock 4.13
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.password.regression_test:regression_test.py:68 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to on INFO  tests.suites.password.regression_test:regression_test.py:75 Configure subtree password policy for ou=people,dc=example,dc=com INFO  tests.suites.password.regression_test:regression_test.py:98 Adding user-uid=UIDpwtest1,ou=people,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:137 Verify user can bind... INFO  tests.suites.password.regression_test:regression_test.py:140 Test passwordUnlock default - user should be able to reset password after lockout INFO  tests.suites.password.regression_test:regression_test.py:151 Verify account is locked INFO  tests.suites.password.regression_test:regression_test.py:155 Wait for lockout duration... INFO  tests.suites.password.regression_test:regression_test.py:158 Check if user can now bind with correct password
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1] 0.08
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[SNpwtest1] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with SNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[UIDpwtest1] 0.33
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with UIDpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[MAILpwtest1@redhat.com] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with MAILpwtest1@redhat.com
Passed suites/password/regression_test.py::test_trivial_passw_check[GNpwtest1] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with GNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1ZZZZ] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1ZZZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZZZCNpwtest1] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZZZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[ZCNpwtest1] 0.31
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1Z] 0.08
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1Z
Passed suites/password/regression_test.py::test_trivial_passw_check[ZCNpwtest1Z] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZCNpwtest1Z
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZCNpwtest1] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1ZZ] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1ZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZCNpwtest1ZZ] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZCNpwtest1ZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZCNpwtest1] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZCNpwtest1
Passed suites/password/regression_test.py::test_trivial_passw_check[CNpwtest1ZZZ] 0.08
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with CNpwtest1ZZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZCNpwtest1ZZZ] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZCNpwtest1ZZZ
Passed suites/password/regression_test.py::test_trivial_passw_check[ZZZZZZCNpwtest1ZZZZZZZZ] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:183 Replace userPassword attribute with ZZZZZZCNpwtest1ZZZZZZZZ
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[SNpwtest1] 0.12
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with SNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[UIDpwtest1] 0.35
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with UIDpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[MAILpwtest1@redhat.com] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with MAILpwtest1@redhat.com
Passed suites/password/regression_test.py::test_global_vs_local[GNpwtest1] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with GNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1ZZZZ] 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1ZZZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZZZCNpwtest1] 0.12
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZZZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[ZCNpwtest1] 0.16
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1Z] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1Z
Passed suites/password/regression_test.py::test_global_vs_local[ZCNpwtest1Z] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZCNpwtest1Z
Passed suites/password/regression_test.py::test_global_vs_local[ZZCNpwtest1] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1ZZ] 0.12
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1ZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZCNpwtest1ZZ] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZCNpwtest1ZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZCNpwtest1] 0.18
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZCNpwtest1
Passed suites/password/regression_test.py::test_global_vs_local[CNpwtest1ZZZ] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with CNpwtest1ZZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZCNpwtest1ZZZ] 0.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZCNpwtest1ZZZ
Passed suites/password/regression_test.py::test_global_vs_local[ZZZZZZCNpwtest1ZZZZZZZZ] 0.58
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:211 Configure Pwpolicy with PasswordCheckSyntax and nsslapd-pwpolicy-local set to off INFO  tests.suites.password.regression_test:regression_test.py:216 Replace userPassword attribute with ZZZZZZCNpwtest1ZZZZZZZZ
Passed suites/password/regression_test.py::test_unhashed_pw_switch 32.64
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.password.regression_test:regression_test.py:257 Enable plugins... INFO  tests.suites.password.regression_test:regression_test.py:272 create users and group... INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db'] INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db'] INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db']
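The three dbscan invocations in the log are how the suite proves the clear-text credential never reaches the replication changelog once nsslapd-unhashed-pw-switch is tightened. Roughly, as a sketch (the .db path is copied from the log and varies per instance and backend):

import subprocess

CHANGELOG_DB = '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db'


def changelog_has_unhashed_pw(db=CHANGELOG_DB):
    # Dump the changelog with dbscan and look for the clear-text
    # 'unhashed#user#password' pseudo-attribute.
    out = subprocess.run(['/usr/bin/dbscan', '-f', db],
                         capture_output=True, text=True, check=True)
    return 'unhashed#user#password' in out.stdout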
Passed suites/plugins/acceptance_test.py::test_acctpolicy 15.36
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/acceptance_test.py::test_attruniq 17.70
No log output captured.
Passed suites/plugins/acceptance_test.py::test_automember 25.55
No log output captured.
Passed suites/plugins/acceptance_test.py::test_dna 17.62
No log output captured.
Passed suites/plugins/acceptance_test.py::test_linkedattrs 25.53
No log output captured.
Passed suites/plugins/acceptance_test.py::test_memberof 34.68
No log output captured.
Passed suites/plugins/acceptance_test.py::test_mep 18.40
No log output captured.
Passed suites/plugins/acceptance_test.py::test_passthru 22.37
No log output captured.
Passed suites/plugins/acceptance_test.py::test_referint 13.39
No log output captured.
Passed suites/plugins/acceptance_test.py::test_retrocl 22.75
No log output captured.
Passed suites/plugins/acceptance_test.py::test_rootdn 35.16
No log output captured.
Passed suites/plugins/accpol_test.py::test_glact_inact 27.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.utils:accpol_test.py:35 Configuring Global account policy plugin, pwpolicy attributes and restarting the server
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:348 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:351 Sleep for 10 secs to check if account is not inactivated, expected value 0 INFO  lib389.utils:accpol_test.py:353 Account should not be inactivated since AccountInactivityLimit not exceeded INFO  lib389.utils:accpol_test.py:356 Sleep for 3 more secs to check if account is inactivated INFO  lib389.utils:accpol_test.py:360 Sleep +10 secs to check if account glinactusr3 is inactivated INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
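The setup summarized above as "Configuring Global account policy plugin, pwpolicy attributes and restarting the server" amounts to roughly the following, sketched with lib389's plugin wrappers. The attribute names are the plugin's documented configuration attributes; the 12-second limit is the test's deliberately short value, and the sketch assumes the plugin's default cn=config child entry exists:

from lib389.plugins import AccountPolicyPlugin, AccountPolicyConfig

CONFIG_DN = 'cn=config,cn=Account Policy Plugin,cn=plugins,cn=config'


def configure_global_accpol(inst, limit='12'):
    plugin = AccountPolicyPlugin(inst)
    plugin.enable()
    plugin.set('nsslapd-pluginarg0', CONFIG_DN)

    accp = AccountPolicyConfig(inst, dn=CONFIG_DN)
    accp.set('alwaysrecordlogin', 'yes')
    accp.set('stateattrname', 'lastLoginTime')
    accp.set('altstateattrname', 'createTimestamp')
    accp.set('specattrname', 'acctPolicySubentry')
    accp.set('limitattrname', 'accountInactivityLimit')
    accp.set('accountInactivityLimit', limit)

    # The plugin reads its configuration at startup.
    inst.restart()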
Passed suites/plugins/accpol_test.py::test_glremv_lastlogin 19.37
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:394 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:396 Sleep for 6 secs to check if account is not inactivated, expected value 0 INFO  lib389.utils:accpol_test.py:398 Account should not be inactivated since AccountInactivityLimit not exceeded INFO  lib389.utils:accpol_test.py:254 Delete lastLoginTime/createTimeStamp/ModifyTimeStamp attribute from user account INFO  lib389.utils:accpol_test.py:401 Sleep for 7 more secs to check if account is inactivated INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:405 Check if account is activated, expected 0 INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glact_login 23.33
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:434 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:436 Sleep for 13 secs to check if account is inactivated, expected error 19 INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:440 Check if account is activated, expected 0 INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glinact_limit 122.54
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:492 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:494 Sleep for 9 secs to check if account is not inactivated, expected 0 INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:516 Check if account is activated, expected 0 INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnologin_attr 84.38
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:575 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:576 Set attribute StateAttrName to createTimestamp, loginTime attr won't be considered INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:580 Sleep for 9 secs to check if account is not inactivated, expected 0 INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:605 Set attribute StateAttrName to lastLoginTime, the default INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:609 Check if account is activated, expected 0 INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnoalt_stattr 56.61
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:640 Set attribute altStateAttrName to 1.1 INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:644 Sleep for 13 secs to check if account is not inactivated, expected 0 INFO  lib389.utils:accpol_test.py:647 lastLoginTime attribute is added from the above ldap bind by userdn INFO  lib389.utils:accpol_test.py:254 Delete lastLoginTime/createTimeStamp/ModifyTimeStamp attribute from user account INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glattr_modtime 45.12
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:687 Set attribute altStateAttrName to modifyTimestamp INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:691 Sleep for 13 secs to check if account is inactivated, expected 0 INFO  lib389.utils:accpol_test.py:202 Check ModifyTimeStamp attribute present for user INFO  lib389.utils:accpol_test.py:237 Enable account by replacing cn attribute value, value of modifyTimeStamp changed INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:254 Delete lastLoginTime/createTimeStamp/ModifyTimeStamp attribute from user account INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnoalt_nologin 51.56
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:742 Set attribute altStateAttrName to 1.1 INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:744 Set attribute alwaysrecordlogin to No INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:748 Sleep for 13 secs to check if account is not inactivated, expected 0 INFO  lib389.utils:accpol_test.py:753 Set attribute altStateAttrName to createTimestamp INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:758 Reset the default attribute values INFO  lib389.utils:accpol_test.py:189 Modify attribute value for a given DN INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glinact_nsact 23.16
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:799 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:801 Sleep for 3 secs to check if account is not inactivated, expected value 0 INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using dsidm INFO  lib389.utils:accpol_test.py:166 Running unlock for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:174 Running ['/usr/sbin/dsidm', 'slapd-standalone1', '-b', 'dc=example,dc=com', 'account', 'unlock', 'uid=nsactusr1,ou=groups,dc=example,dc=com'] for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:180 output: b'Error: Account is already active\n' INFO  lib389.utils:accpol_test.py:804 Sleep for 10 secs to check if account is inactivated, expected value 19 INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using dsidm INFO  lib389.utils:accpol_test.py:166 Running unlock for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:174 Running ['/usr/sbin/dsidm', 'slapd-standalone1', '-b', 'dc=example,dc=com', 'account', 'unlock', 'uid=nsactusr1,ou=groups,dc=example,dc=com'] for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:180 output: b'Error: 103 - 22 - 16 - No such attribute - []\n' INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using dsidm INFO  lib389.utils:accpol_test.py:166 Running entry-status for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:174 Running ['/usr/sbin/dsidm', 'slapd-standalone1', '-b', 'dc=example,dc=com', 'account', 'entry-status', 'uid=nsactusr1,ou=groups,dc=example,dc=com'] for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:180 output: b'Entry DN: uid=nsactusr1,ou=groups,dc=example,dc=com\nEntry Creation Date: 20201031022159Z (2020-10-31 02:21:59)\nEntry Modification Date: 20201031022159Z (2020-10-31 02:21:59)\nEntry Last Login Date: 20201031022159Z (2020-10-31 02:21:59)\nEntry Time Since Inactive: 5 seconds (2020-10-31 03:22:11)\nEntry State: inactivity limit exceeded\n\n' INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:162 Account activate/in-activate/status using dsidm INFO  lib389.utils:accpol_test.py:166 Running entry-status for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:174 Running ['/usr/sbin/dsidm', 'slapd-standalone1', '-b', 'dc=example,dc=com', 'account', 'entry-status', 'uid=nsactusr1,ou=groups,dc=example,dc=com'] for user uid=nsactusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:180 output: b'Entry DN: uid=nsactusr1,ou=groups,dc=example,dc=com\nEntry Creation Date: 20201031022159Z (2020-10-31 02:21:59)\nEntry Modification Date: 20201031022217Z (2020-10-31 02:22:17)\nEntry Last Login Date: 20201031022219Z (2020-10-31 02:22:19)\nEntry Time Until Inactive: 10 seconds (2020-10-31 03:22:31)\nEntry State: activated\n\n' INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
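The dsidm invocations captured above (unlock, entry-status) come down to a small subprocess wrapper; the argv is copied from the log, with 'slapd-standalone1' and 'dc=example,dc=com' being this run's instance name and base DN:

import subprocess


def dsidm_account(action, userdn, instance='slapd-standalone1',
                  basedn='dc=example,dc=com'):
    # e.g. action='entry-status' prints 'Entry State: activated' or
    # 'Entry State: inactivity limit exceeded', as in the log above.
    cmd = ['/usr/sbin/dsidm', instance, '-b', basedn,
           'account', action, userdn]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout + result.stderr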
Passed suites/plugins/accpol_test.py::test_glinact_acclock 40.38
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:844 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:846 Sleep for 3 secs and try invalid binds to lockout the user INFO  lib389.utils:accpol_test.py:118 Lockout user account by attempting invalid password binds INFO  lib389.utils:accpol_test.py:850 Sleep for 10 secs to check if account is inactivated, expected value 19 INFO  lib389.utils:accpol_test.py:854 Add lastLoginTime to activate the user account INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:858 Checking if account is unlocked after passwordlockoutduration, but inactivated after accountInactivityLimit INFO  lib389.utils:accpol_test.py:118 Lockout user account by attempting invalid password binds INFO  lib389.utils:accpol_test.py:862 Account is expected to be unlocked after 5 secs of passwordlockoutduration INFO  lib389.utils:accpol_test.py:866 Sleep 13s and check if account inactivated based on accountInactivityLimit, expected 19 INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_glnact_pwexp 49.37
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:909 AccountInactivityLimit set to 12. Account will be inactivated if not accessed in 12 secs INFO  lib389.utils:accpol_test.py:910 Passwordmaxage is set to 9. Password will expire in 9 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:913 Sleep for 9 secs and check if password expired INFO  lib389.utils:accpol_test.py:919 Add lastLoginTime to activate the user account INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:930 Sleep for 4 secs and check if account is now inactivated, expected error 19 INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:141 Reset user password for user-uid=pwexpusr1,ou=groups,dc=example,dc=com INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_locact_inact 31.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.utils:accpol_test.py:80 Adding Local account policy plugin configuration entries
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:981 AccountInactivityLimit set to 10. Account will be inactivated if not accessed in 10 secs INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:983 Sleep for 9 secs to check if account is not inactivated, expected value 0 INFO  lib389.utils:accpol_test.py:985 Account should not be inactivated since AccountInactivityLimit not exceeded INFO  lib389.utils:accpol_test.py:987 Sleep for 2 more secs to check if account is inactivated INFO  lib389.utils:accpol_test.py:990 Sleep +9 secs to check if account inactusr3 is inactivated INFO  lib389.utils:accpol_test.py:993 Add lastLoginTime attribute to all users and check if its activated INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
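The "Local account policy plugin configuration entries" added in the setup at accpol_test.py:80 amount to enabling the Account Policy plugin and filling in its shared config entry; a hedged lib389 sketch using the conventional plugin DN and the documented attribute names (it assumes the config entry already exists; otherwise create it first):

from lib389.plugins import AccountPolicyPlugin, AccountPolicyConfig

ACCP_CONF = 'cn=config,cn=Account Policy Plugin,cn=plugins,cn=config'

def configure_account_policy(inst):
    plugin = AccountPolicyPlugin(inst)
    plugin.enable()
    plugin.set('nsslapd-pluginarg0', ACCP_CONF)
    conf = AccountPolicyConfig(inst, ACCP_CONF)
    conf.set('alwaysrecordlogin', 'yes')             # track lastLoginTime on bind
    conf.set('stateattrname', 'lastLoginTime')
    conf.set('altstateattrname', 'createTimestamp')  # fallback for never-bound entries
    conf.set('specattrname', 'acctPolicySubentry')   # hook for per-subtree (local) policy
    conf.set('limitattrname', 'accountInactivityLimit')
    inst.restart()  # plugin changes take effect after a restart

A local policy then hangs a subentry carrying accountInactivityLimit (10 in this test) off the chosen subtree and points user entries at it via acctPolicySubentry, typically through CoS.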
Passed suites/plugins/accpol_test.py::test_locinact_modrdn 27.24
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:1027 Account should not be inactivated since the subtree is not configured INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:1029 Sleep for 11 secs to check if account is not inactivated, expected value 0 INFO  lib389.utils:accpol_test.py:1032 Moving users from ou=groups to ou=people subtree INFO  lib389.utils:accpol_test.py:1040 Then wait for 11 secs and check if entries are inactivated INFO  lib389.utils:accpol_test.py:219 Enable account by replacing lastLoginTime/createTimeStamp/ModifyTimeStamp attribute INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/accpol_test.py::test_locact_modrdn 15.11
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:accpol_test.py:1072 Account should be inactivated since the subtree is configured INFO  lib389.utils:accpol_test.py:271 add_users: Pass all of these as parameters suffix, subtree, userid and nousrs INFO  lib389.utils:accpol_test.py:1074 Sleep for 11 secs to check if account is inactivated, expected value 19 INFO  lib389.utils:accpol_test.py:1077 Moving users from ou=people to ou=groups subtree INFO  lib389.utils:accpol_test.py:1084 Sleep for +2 secs and check users from both ou=people and ou=groups subtree INFO  lib389.utils:accpol_test.py:290 del_users: Pass all of these as parameters suffix, subtree, userid and nousrs
Passed suites/plugins/attr_nsslapd-pluginarg_test.py::test_duplicate_values 3.60
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:31 Ticket 47431 - 0: Enable 7bit plugin...
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:56 Ticket 47431 - 1: Check 26 duplicate values are treated as one... DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:59 modify_s cn=7-bit check,cn=plugins,cn=config DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:83 line: [30/Oct/2020:22:25:33.957278147 -0400] - WARN - str2entry_dupcheck - 26 duplicate values for attribute type nsslapd-pluginarg2 detected in entry cn=7-bit check,cn=plugins,cn=config. Extra values ignored. INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:84 Expected error "str2entry_dupcheck.* duplicate values for attribute type nsslapd-pluginarg2 detected in entry cn=7-bit check,cn=plugins,cn=config." logged in /var/log/dirsrv/slapd-standalone1/errors INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:86 Ticket 47431 - 1: done
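The assertion logged at attr_nsslapd-pluginarg_test.py:84 is just a regex search over the instance error log, which lib389 exposes directly; a minimal sketch (the pattern is the one quoted in the log above):

def saw_dupcheck_warning(inst):
    # DirSrv.ds_error_log.match() greps /var/log/dirsrv/slapd-<id>/errors
    pattern = ('.*str2entry_dupcheck.* duplicate values for attribute type '
               'nsslapd-pluginarg2 detected in entry '
               'cn=7-bit check,cn=plugins,cn=config.*')
    return bool(inst.ds_error_log.match(pattern))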
Passed suites/plugins/attr_nsslapd-pluginarg_test.py::test_multiple_value 6.76
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:111 Ticket 47431 - 2: Check two values belonging to one arg is fixed... DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [30/Oct/2020:22:25:38.712800551 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 0: uid DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[0] uid DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 uid was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [30/Oct/2020:22:25:38.718518301 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 1: mail DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[1] mail DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 mail was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [30/Oct/2020:22:25:38.721904860 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 2: userpassword DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[2] userpassword DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 userpassword was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [30/Oct/2020:22:25:38.725017758 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 3: , DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[3] , DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 , was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:131 line - [30/Oct/2020:22:25:38.732593615 -0400] - DEBUG - NS7bitAttr - NS7bitAttr_Init - 4: dc=example,dc=com DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:132 ATTRS[4] dc=example,dc=com DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:136 dc=example,dc=com was logged INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:142 Ticket 47431 - 2: done
Passed suites/plugins/attr_nsslapd-pluginarg_test.py::test_missing_args 4.76
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:170 Ticket 47431 - 3: Check missing args are fixed... DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 uid was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 mail was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 userpassword was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 , was logged DEBUG  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:197 dc=example,dc=com was logged INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:203 Ticket 47431 - 3: done INFO  tests.suites.plugins.attr_nsslapd-pluginarg_test:attr_nsslapd-pluginarg_test.py:204 Test complete
Passed suites/plugins/cos_test.py::test_cos_operational_default 4.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL lib389.utils:cos_test.py:27 Adding user (uid=user_0,ou=people,dc=example,dc=com): INFO  lib389.utils:cos_test.py:153 Returned telephonenumber (exp. real): b'1234 is real' INFO  lib389.utils:cos_test.py:154 Returned telephonenumber: 8 INFO  lib389.utils:cos_test.py:160 Returned l (exp. real): b'here is real' INFO  lib389.utils:cos_test.py:161 Returned l: 8 INFO  lib389.utils:cos_test.py:170 Returned seealso (exp. virtual): b'dc=virtual,dc=example,dc=com' INFO  lib389.utils:cos_test.py:171 Returned seealso: 3 INFO  lib389.utils:cos_test.py:180 Returned description (exp. virtual): b'desc is virtual' INFO  lib389.utils:cos_test.py:181 Returned description: 8 INFO  lib389.utils:cos_test.py:191 Returned title (exp. real): b'title is real' INFO  lib389.utils:cos_test.py:212 Returned title(exp. virt): b'title is virtual 1' INFO  lib389.utils:cos_test.py:212 Returned title(exp. virt): b'title is virtual 0'
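The real-vs-virtual pattern in these results is controlled by the qualifier on each cosAttribute value: with default, a real value stored on the entry wins over the template; with override, the template value wins. A hedged python-ldap sketch of a pointer CoS of that shape (DNs and values are illustrative, not the test's exact entries):

import ldap
import ldap.modlist

def add_pointer_cos(conn, suffix):
    # Template entry carrying the virtual value.
    template_dn = 'cn=demoCosTemplate,' + suffix
    conn.add_s(template_dn, ldap.modlist.addModlist({
        'objectClass': [b'top', b'extensibleObject', b'cosTemplate'],
        'description': [b'desc is virtual'],
    }))
    # Definition pointing entries in the subtree at the template.
    conn.add_s('cn=demoCosPointer,' + suffix, ldap.modlist.addModlist({
        'objectClass': [b'top', b'cosSuperDefinition', b'cosPointerDefinition'],
        'cosTemplateDn': [template_dn.encode()],
        'cosAttribute': [b'description default'],  # 'override' flips the precedence
    }))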
Passed suites/plugins/deref_aci_test.py::test_deref_and_access_control 0.65
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.deref_aci_test:deref_aci_test.py:133 Check, that the dereference search result does not have userpassword
Passed suites/plugins/dna_test.py::test_dnatype_only_valid 4.70
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/entryusn_test.py::test_entryusn_no_duplicates 6.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
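The entryusn values these tests compare are maintained by the USN plugin; a minimal lib389 sketch of enabling it and reading an entry's USN (assumes a DirSrv handle inst and an existing lib389.idm user object):

from lib389.plugins import USNPlugin

def read_usn_after_write(inst, user):
    USNPlugin(inst).enable()
    inst.restart()                       # plugin activation needs a restart
    user.replace('description', 'bump')  # any write advances the entry's USN
    return user.get_attr_val_int('entryusn')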
Passed suites/plugins/entryusn_test.py::test_entryusn_is_same_after_failure 6.14
No log output captured.
Passed suites/plugins/entryusn_test.py::test_entryusn_after_repl_delete 4.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e1f0ee32-d68b-4335-9987-f81af8914ea8 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect fb572479-a84a-4947-a571-e3f66fd3817f / got description=e1f0ee32-d68b-4335-9987-f81af8914ea8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
Passed suites/plugins/managed_entry_test.py::test_binddn_tracking 2.13
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/managed_entry_test.py::test_mentry01 10.04
No log output captured.
Passed suites/plugins/managed_entry_test.py::test_managed_entry_removal 4.35
No log output captured.
Passed suites/plugins/memberof_test.py::test_betxnpostoperation_replace 4.80
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/plugins/memberof_test.py::test_memberofgroupattr_add 0.01
No log output captured.
Passed suites/plugins/memberof_test.py::test_enable 4.59
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:181 Enable MemberOf plugin
Passed suites/plugins/memberof_test.py::test_member_add 0.37
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofenh1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofenh2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:211 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:212 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:215 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:216 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
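The create/update/lookup cycle in this log maps onto a few lib389 calls; a minimal sketch under the same dc=example,dc=com layout (names like demo_grp are illustrative, not the test's fixtures):

from lib389.idm.group import Groups
from lib389.idm.user import UserAccounts
from lib389.plugins import MemberOfPlugin

def demo_memberof(inst, suffix='dc=example,dc=com'):
    MemberOfPlugin(inst).enable()
    inst.restart()
    user = UserAccounts(inst, suffix).create_test_user(uid=1000)
    group = Groups(inst, suffix).create(properties={'cn': 'demo_grp'})
    group.add_member(user.dn)
    # The plugin writes the memberOf back-link onto the user entry:
    return user.get_attr_vals_utf8('memberOf')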
Passed suites/plugins/memberof_test.py::test_member_delete_gr1 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:246 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_member_delete_gr2 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:278 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
Passed suites/plugins/memberof_test.py::test_member_delete_all 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:310 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:314 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
Passed suites/plugins/memberof_test.py::test_member_after_restart 8.45
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:349 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:353 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:364 Remove uniqueMember as a memberofgrpattr INFO  tests.suites.plugins.memberof_test:memberof_test.py:371 Assert that this change of configuration did change the already set values INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_memberofgroupattr_uid 0.00
-------------------------------Captured log call--------------------------------
ERROR  tests.suites.plugins.memberof_test:memberof_test.py:400 Setting 'memberUid' as memberofgroupattr is rejected (expected)
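The rejection recorded here is server-side: memberofgroupattr must name a DN-syntax attribute, and memberUid has IA5-string syntax. A hedged sketch of reproducing it (the precise LDAP error the suite asserts is not visible in this log, so the generic exception class is caught):

import ldap
from lib389.plugins import MemberOfPlugin

def try_memberuid(inst):
    try:
        MemberOfPlugin(inst).replace('memberofgroupattr', 'memberUid')
    except ldap.LDAPError as e:
        return e  # expected: non-DN syntax attribute refused
    raise AssertionError('memberUid was unexpectedly accepted')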
Passed suites/plugins/memberof_test.py::test_member_add_duplicate_usr1 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:422 Try b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' (member) ERROR  tests.suites.plugins.memberof_test:memberof_test.py:429 b'uid=user_memofenh1,ou=people,dc=example,dc=com' already member of b'cn=group_memofegrp1,ou=groups,dc=example,dc=com' --> fail (expected)
Passed suites/plugins/memberof_test.py::test_member_add_duplicate_usr2 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:450 Check initial status INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:460 Try b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' (member) ERROR  tests.suites.plugins.memberof_test:memberof_test.py:467 b'uid=user_memofenh2,ou=people,dc=example,dc=com' already member of b'cn=group_memofegrp2,ou=groups,dc=example,dc=com' --> fail (expected) INFO  tests.suites.plugins.memberof_test:memberof_test.py:470 Check final status INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_member_uniquemember_same_user 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:557 Check initial status INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:569 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:570 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:582 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is not memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:586 Update b'uid=user_memofenh2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp3,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:613 Checking final status INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
Passed suites/plugins/memberof_test.py::test_member_not_exists 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:671 Checking Initial status INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp015,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:691 Update b'uid=user_dummy1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp015,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:692 Update b'uid=user_dummy2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp015,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
Passed suites/plugins/memberof_test.py::test_member_not_exists_complex 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:806 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:807 Update b'uid=user_memofenh1,ou=people,dc=example,dc=com' is memberof 
b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:835 Update b'uid=user_dummy1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:845 Update b'uid=user_dummy1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp016,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
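The "Lookup memberof from ..." / "memberof: ..." / "--> membership verified" triplets repeated throughout these logs come from the suite's verification helper (memberof_test.py lines 109, 116 and 119): it reads the memberOf values of an entry and compares them against the expected group DNs. A minimal sketch of that pattern, assuming a plain python-ldap connection; the helper name and bind details here are illustrative, not the suite's actual code:

    import ldap

    def check_memberof(conn, entry_dn, expected_groups):
        """Read the entry's memberOf values and verify they match the
        expected set of group DNs (case- and order-insensitive)."""
        _dn, attrs = conn.search_s(entry_dn, ldap.SCOPE_BASE,
                                   '(objectClass=*)', ['memberOf'])[0]
        # Attribute names may come back in any case; values are bytes.
        found = {v.decode().lower()
                 for k, vals in attrs.items() if k.lower() == 'memberof'
                 for v in vals}
        expected = {g.lower() for g in expected_groups}
        assert found == expected, 'expected %s, got %s' % (expected, found)

    # Mirroring the first lookup above:
    # check_memberof(conn, 'uid=user_memofenh1,ou=people,dc=example,dc=com',
    #                ['cn=group_memofegrp1,ou=groups,dc=example,dc=com'])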
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_1 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:1011 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:1012 Update b'uid=user_memofuser2,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:1013 Update b'uid=user_memofuser3,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (memberuid) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 
memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
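The "Update ... is memberof ... (member)/(uniqueMember)/(memberuid)" records above (memberof_test.py:1011-1013) add three users to cn=group_memofegrp017 through three different membership attributes. A sketch of the implied modify operations with python-ldap, under the same connection assumption as the previous snippet; this illustrates the operations the log suggests, not the suite's literal code:

    import ldap

    def add_grp017_members(conn):
        """One user per membership attribute, as in memberof_test.py:1011-1013."""
        group = 'cn=group_memofegrp017,ou=groups,dc=example,dc=com'
        # DN-valued attributes are followed by the memberOf plugin:
        conn.modify_s(group, [(ldap.MOD_ADD, 'member',
                               [b'uid=user_memofuser1,ou=people,dc=example,dc=com'])])
        conn.modify_s(group, [(ldap.MOD_ADD, 'uniqueMember',
                               [b'uid=user_memofuser2,ou=people,dc=example,dc=com'])])
        # memberUid carries a bare uid rather than a DN, so the plugin
        # cannot derive a memberOf value from it; this matches the lookups
        # above, where uid=user_memofuser3 reports no memberof values.
        conn.modify_s(group, [(ldap.MOD_ADD, 'memberUid', [b'user_memofuser3'])])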
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_2 0.35
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:1261 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:1262 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:1263 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is memberof b'cn=group_memofegrp017,ou=groups,dc=example,dc=com' (memberuid) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp018,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:1283 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp018,ou=groups,dc=example,dc=com' (member) INFO  tests.suites.plugins.memberof_test:memberof_test.py:1284 Update b'uid=user_memofuser1,ou=people,dc=example,dc=com' is no longer memberof b'cn=group_memofegrp018,ou=groups,dc=example,dc=com' (uniqueMember) INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp017,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
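The "is no longer memberof b'cn=group_memofegrp018,...' (member)/(uniqueMember)" records above (memberof_test.py:1283-1284) undo the earlier adds: once the forward links are deleted, the plugin strips memberOf: ...grp018 from the user entry, and the subsequent lookups list only grp017 again. The corresponding deletes, sketched with python-ldap under the same assumptions as the snippets above:

    import ldap

    def remove_grp018_members(conn):
        """Drop both forward links so the memberOf plugin removes the
        derived memberOf: ...grp018 value from the user entry."""
        group = 'cn=group_memofegrp018,ou=groups,dc=example,dc=com'
        user = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
        conn.modify_s(group, [(ldap.MOD_DELETE, 'member', [user]),
                              (ldap.MOD_DELETE, 'uniqueMember', [user])])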
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_3 0.24
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 
--> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp019_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp019_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
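The ":109 Lookup memberof", ":116 memberof:" and ":119 --> membership verified" records above come from the suite's helper that reads a user entry's memberOf values (maintained by the MemberOf plugin) and compares them against the expected group set. A minimal sketch of that check using plain python-ldap; the URI and credentials are placeholders, not taken from this report (the suite itself drives this through lib389):

import ldap

# Connect to the instance under test (hypothetical connection details).
conn = ldap.initialize('ldap://localhost:389')
conn.simple_bind_s('cn=Directory Manager', 'password')

user_dn = 'uid=user_memofuser2,ou=people,dc=example,dc=com'
expected = {'cn=group_memofegrp019_1,ou=groups,dc=example,dc=com',
            'cn=group_memofegrp019_2,ou=groups,dc=example,dc=com'}

# Base-scope read of the memberOf attribute, as in the ":109/:116" records.
entry = conn.search_s(user_dn, ldap.SCOPE_BASE, '(objectClass=*)', ['memberOf'])
memberof = {v.decode() for v in entry[0][1].get('memberOf', [])}

# Corresponds to the ":119 --> membership verified" record.
assert expected.issubset(memberof)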
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_4 0.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
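The ":67 Create user" and ":83 Create group" records above are plain LDAP adds. A sketch of that step with python-ldap (the suite uses lib389 helpers; connection details and the password value here are placeholders), creating one user and one group that carries both member and uniqueMember, as the group_memofegrp020_* entries in this log do:

import ldap
import ldap.modlist

conn = ldap.initialize('ldap://localhost:389')  # hypothetical instance
conn.simple_bind_s('cn=Directory Manager', 'password')

user_dn = 'uid=user_memofuser1,ou=people,dc=example,dc=com'
conn.add_s(user_dn, ldap.modlist.addModlist({
    'objectClass': [b'top', b'person', b'organizationalPerson', b'inetOrgPerson'],
    'uid': [b'user_memofuser1'],
    'cn': [b'user_memofuser1'],
    'sn': [b'user_memofuser1'],
}))

# Adding the user as member triggers the MemberOf plugin to write memberOf
# on the user entry, which the subsequent ":109/:116" lookups verify.
conn.add_s('cn=group_memofegrp020_1,ou=groups,dc=example,dc=com',
           ldap.modlist.addModlist({
    'objectClass': [b'top', b'groupOfNames', b'groupOfUniqueNames'],
    'cn': [b'group_memofegrp020_1'],
    'member': [user_dn.encode()],
}))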
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_5 0.76
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:67 Create user uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:83 Create group cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
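The ":94 Check ... .member / .uniqueMember" records above are the group-side counterpart of the memberOf lookups: the group entry is read back and the expected DN is asserted among the attribute's values. A minimal python-ldap sketch of that check, with the same placeholder connection details as in the earlier sketches:

import ldap

conn = ldap.initialize('ldap://localhost:389')  # hypothetical instance
conn.simple_bind_s('cn=Directory Manager', 'password')

group_dn = 'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
expected = b'uid=user_memofuser1,ou=people,dc=example,dc=com'

# Base-scope read of the group's membership attributes (":94 Check ...").
entry = conn.search_s(group_dn, ldap.SCOPE_BASE, '(objectClass=*)',
                      ['member', 'uniqueMember'])
attrs = entry[0][1]

# The ":101 member: ..." lines enumerate the values found before the assert.
assert expected in attrs.get('member', [])
assert expected in attrs.get('uniqueMember', [])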
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_6 5.25
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> 
membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com
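The two record shapes above come from the suite's membership checks: lines 94/101 read a group's member/uniqueMember values directly, while lines 109/116/119 read back the memberOf values that the memberOf plugin maintains on the member entry (the b'...' in the messages is simply Python's bytes representation of the DNs the test passes in). A minimal sketch of what such helpers could look like, assuming plain python-ldap access as lib389's DirSrv provides; the names verify_membership_attr and verify_memberof are hypothetical, not the suite's actual functions:

import ldap

def verify_membership_attr(conn, group_dn, attr, expected_dn, log):
    # Log and check one membership attribute of a group, mirroring the
    # "Check <group>.<attr> = <dn>" / "<attr>: <value>" records (lines 94/101).
    log.info('Check %s.%s = %s' % (group_dn, attr, expected_dn))
    entry = conn.search_s(group_dn, ldap.SCOPE_BASE, '(objectClass=*)', [attr])[0]
    # python-ldap returns attribute values as lists of bytes; the attribute
    # key is assumed to come back in schema case (member/uniqueMember).
    values = entry[1].get(attr, [])
    for value in values:
        log.info('%s: %s' % (attr, value.decode('utf-8')))
    return expected_dn.encode('utf-8') in values

def verify_memberof(conn, member_dn, expected_groups, log):
    # Read back the memberOf values computed by the plugin and confirm the
    # expected groups are all present, mirroring the "Lookup memberof from
    # <dn>" / "memberof: <group>" / "--> membership verified" records.
    log.info('Lookup memberof from %s' % member_dn)
    entry = conn.search_s(member_dn, ldap.SCOPE_BASE, '(objectClass=*)', ['memberOf'])[0]
    found = [v.decode('utf-8') for v in entry[1].get('memberOf', [])]
    for group in found:
        log.info('memberof: %s' % group)
    if all(group in found for group in expected_groups):
        log.info('--> membership verified')
        return True
    return False

Under this reading, the repeated lookups of the same DN with growing result sets would simply be successive verify_memberof-style calls made while the scenario adds users and nested groups to group_memofegrp020_5.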
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_7 0.49
-------------------------------Captured log call--------------------------------
[captured log reflowed as above; G1..G5 = group_memofegrp020_1..5 (ou=groups,dc=example,dc=com), U1..U4 = user_memofuser1..4 (ou=people,dc=example,dc=com); all records are INFO tests.suites.plugins.memberof_test:memberof_test.py:109/116/119]
Lookup memberof from U1: G1, G5; --> membership verified
Lookup memberof from U1: G1, G5, G2, G3, G4; --> membership verified
Lookup memberof from U1: G1, G5, G2, G3; --> membership verified
Lookup memberof from U1: G1, G5, G2; --> membership verified
Lookup memberof from U1: G1; --> membership verified
Lookup memberof from U2: G2, G5; --> membership verified
Lookup memberof from U2: G2, G5, G1, G3, G4; --> membership verified
Lookup memberof from U2: G2, G5, G1, G3; --> membership verified
Lookup memberof from U2: G2; --> membership verified
Lookup memberof from U2: G2, G5, G1; --> membership verified
Lookup memberof from U3: G3, G5; --> membership verified
Lookup memberof from U3: G3, G5, G1, G2, G4; --> membership verified
Lookup memberof from U3: G3; --> membership verified
Lookup memberof from U3: G3, G5, G1, G2; --> membership verified
Lookup memberof from U3: G3, G5, G1; --> membership verified
Lookup memberof from U4: G4, G5; --> membership verified
Lookup memberof from U4: G4; --> membership verified
Lookup memberof from U4: G4, G5, G1, G2, G3; --> membership verified
Lookup memberof from U4: G4, G5, G1, G2; --> membership verified
Lookup memberof from U4: G4, G5, G1; --> membership verified
Lookup memberof from G1: G5, G2, G3, G4
Lookup memberof from G1: G5, G2; --> membership verified
Lookup memberof from G1: G5, G2, G3; --> membership verified
Lookup memberof from G1: G5, G2, G3, G4; --> membership verified
Lookup memberof from G2: G5, G1; --> membership verified
Lookup memberof from G2: G5, G1, G3, G4
Lookup memberof from G2: G5, G1, G3; --> membership verified
Lookup memberof from G2: G5, G1, G3, G4; --> membership verified
Lookup memberof from G3: G5, G1; --> membership verified
Lookup memberof from G3: G5, G1, G2; --> membership verified
Lookup memberof from G3: G5, G1, G2, G4
Lookup memberof from G3: G5, G1, G2, G4; --> membership verified
Lookup memberof from G4: G5, G1; --> membership verified
Lookup memberof from G4: G5, G1, G2; --> membership verified
Lookup memberof from G4: G5, G1, G2, G3; --> membership verified
Lookup memberof from G4: G5, G1, G2, G3
Lookup memberof from G1: G5; --> membership verified
Lookup memberof from G2: G5; --> membership verified
Lookup memberof from G3: G5; --> membership verified
Lookup memberof from G4: G5; --> membership verified
Lookup memberof from U1: G1; --> membership verified
Lookup memberof from U2: G2; --> membership verified
Lookup memberof from U3: G3; --> membership verified
Lookup memberof from U4: G4; --> membership verified
Lookup memberof from U4: G4, G5 INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp016,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofenh2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  
tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
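The captured log above repeats one verification pattern throughout: read an entry back, dump its memberOf values (memberof_test.py:109 and :116), and report "--> membership verified" once the set matches what the test expects (:119); the same DN is looked up several times, apparently because the test polls while the memberOf plugin is still propagating nested group memberships. A minimal sketch of that lookup-and-verify step, assuming a plain python-ldap connection rather than the suite's lib389 wrappers (verify_memberof and the connection setup are hypothetical names for illustration):

import ldap  # python-ldap

SUFFIX = 'dc=example,dc=com'

def verify_memberof(conn, dn, expected_groups):
    # "Lookup memberof from <dn>" (memberof_test.py:109)
    results = conn.search_s(dn, ldap.SCOPE_BASE, '(objectClass=*)', ['memberOf'])
    _, attrs = results[0]
    # python-ldap returns values as bytes; LDAP DNs compare case-insensitively,
    # so normalize both sides to lowercase before comparing.
    found = {v.decode('utf-8').lower()
             for key, values in attrs.items()
             if key.lower() == 'memberof'
             for v in values}
    for group_dn in sorted(found):
        print('memberof: %s' % group_dn)    # memberof_test.py:116
    assert found == {g.lower() for g in expected_groups}
    print('--> membership verified')        # memberof_test.py:119

# Usage mirroring one lookup from the log (connection details are hypothetical):
# conn = ldap.initialize('ldap://localhost:389')
# conn.simple_bind_s('cn=Directory Manager', 'password')
# verify_memberof(conn,
#                 'uid=user_memofuser1,ou=people,' + SUFFIX,
#                 {'cn=group_memofegrp020_1,ou=groups,' + SUFFIX,
#                  'cn=group_memofegrp020_5,ou=groups,' + SUFFIX})

Comparing lowercased DN sets mirrors LDAP's case-insensitive DN matching, so the check does not depend on the attribute-value casing the server happens to return.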
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_8 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com'
INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com
INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 -->
membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: 
uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 
memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified
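The log above traces one verification pattern over and over: for each group, read the forward links (member / uniqueMember) and confirm the memberOf plugin wrote the matching back-link onto the member entry. A minimal sketch of that check, assuming a lib389 DirSrv instance and the user/group names seen in the log (illustrative only; the helper name, the topo.standalone handle, and the suffix constant are assumptions, not part of the test suite):

from lib389.idm.group import Groups
from lib389.idm.user import UserAccounts

SUFFIX = 'dc=example,dc=com'  # suffix used throughout the log above

def check_memberof_backlink(inst, group_cn, user_uid):
    # Hypothetical helper: verify one forward/back link pair.
    group = Groups(inst, SUFFIX).get(group_cn)
    user = UserAccounts(inst, SUFFIX).get(user_uid)
    # Forward link: the group lists the user's DN in member (or uniqueMember).
    members = group.get_attr_vals_utf8('member') or []
    uniquemembers = group.get_attr_vals_utf8('uniqueMember') or []
    assert user.dn in members or user.dn in uniquemembers
    # Back-link: the memberOf plugin mirrors the membership on the user entry.
    assert group.dn in (user.get_attr_vals_utf8('memberOf') or [])

# e.g. check_memberof_backlink(topo.standalone, 'group_memofegrp020_1', 'user_memofuser1')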
Passed suites/plugins/memberof_test.py::test_complex_group_scenario_9 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  
tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser2,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check 
b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser3,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser4,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 uniqueMember: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: 
cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: uid=user_memofuser1,ou=people,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_1,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_2,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_3,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:101 member: cn=group_memofegrp020_4,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com'.uniqueMember = 
b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'cn=group_memofegrp020_5,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.uniqueMember = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:94 Check b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com'.member = b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_2,ou=groups,dc=example,dc=com' INFO  
tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_3,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'cn=group_memofegrp020_4,ou=groups,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser1,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group_memofegrp020_5,ou=groups,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from 
b'uid=user_memofuser2,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser3,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com' INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from b'uid=user_memofuser4,ou=people,dc=example,dc=com'
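The records above show the verification loop used throughout this suite: for each user and group entry, the test reads back the entry's memberOf values and compares them against the expected group DNs. Below is a minimal sketch of that lookup with lib389; it assumes an already-connected DirSrv handle named inst, and the entry names are taken from the log purely for illustration.

from lib389.idm.user import UserAccounts

# Fetch one of the test users under the suffix (names as logged above).
users = UserAccounts(inst, 'dc=example,dc=com')
user = users.get('user_memofuser1')

# Read every memberOf value, as in the "Lookup memberof from ..." records,
# then assert the expected group is present ("--> membership verified").
memberof_vals = user.get_attr_vals_utf8('memberOf')
for dn in memberof_vals:
    print('memberof: %s' % dn)
assert 'cn=group_memofegrp020_1,ou=groups,dc=example,dc=com' in memberof_vals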
Passed suites/plugins/memberof_test.py::test_memberof_auto_add_oc 0.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.memberof_test:memberof_test.py:109 Lookup memberof from uid=user1,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:116 memberof: cn=group,dc=example,dc=com INFO  tests.suites.plugins.memberof_test:memberof_test.py:119 --> membership verified INFO  tests.suites.plugins.memberof_test:memberof_test.py:2753 Correctly rejected invalid objectclass INFO  tests.suites.plugins.memberof_test:memberof_test.py:2823 Test complete.
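test_memberof_auto_add_oc exercises the plugin's memberOfAutoAddOC setting: an invalid objectclass is rejected, and a valid one is accepted and added automatically to entries that gain a memberOf value. A hedged sketch with lib389's MemberOfPlugin follows, assuming a DirSrv handle inst; the invalid objectclass name is made up, and the expected rejection error is an assumption inferred from the "Correctly rejected invalid objectclass" record.

import ldap
from lib389.plugins import MemberOfPlugin

memberof = MemberOfPlugin(inst)
memberof.enable()

try:
    memberof.set_autoaddoc('invalidOC123')  # hypothetical, undefined objectclass
except ldap.UNWILLING_TO_PERFORM:
    print('Correctly rejected invalid objectclass')

memberof.set_autoaddoc('nsMemberOf')        # valid objectclass that allows memberOf
inst.restart()                              # plugin config changes need a restart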
Passed suites/plugins/pluginpath_validation_test.py::test_pluginpath_validation 0.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.pluginpath_validation_test:pluginpath_validation_test.py:103 Test complete
Passed suites/plugins/referint_test.py::test_referential_false_failure 7.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo0,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo1,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo2,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo3,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo4,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo5,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo6,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo7,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo8,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo9,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo10,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo11,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo12,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo13,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo14,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo15,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo16,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo17,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo18,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo19,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo20,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo21,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo22,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo23,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo24,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo25,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo26,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo27,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo28,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo29,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo30,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo31,dc=example,dc=com): CRITICAL 
tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo32,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo33,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo34,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo35,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo36,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo37,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo38,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo39,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo40,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo41,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo42,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo43,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo44,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo45,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo46,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo47,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo48,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo49,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo50,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo51,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo52,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo53,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo54,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo55,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo56,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo57,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo58,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo59,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo60,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo61,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo62,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo63,dc=example,dc=com): CRITICAL 
tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo64,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo65,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo66,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo67,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo68,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo69,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo70,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo71,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo72,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo73,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo74,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo75,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo76,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo77,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo78,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo79,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo80,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo81,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo82,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo83,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo84,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo85,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo86,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo87,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo88,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo89,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo90,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo91,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo92,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo93,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo94,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo95,dc=example,dc=com): CRITICAL 
tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo96,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo97,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo98,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo99,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo100,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo101,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo102,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo103,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo104,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo105,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo106,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo107,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo108,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo109,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo110,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo111,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo112,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo113,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo114,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo115,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo116,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo117,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo118,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo119,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo120,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo121,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo122,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo123,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo124,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo125,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo126,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user 
(uid=foo\,oo127,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo128,dc=example,dc=com): CRITICAL tests.suites.plugins.referint_test:referint_test.py:37 Adding user (uid=foo\,oo129,dc=example,dc=com):
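test_referential_false_failure creates 130 users whose RDN value itself contains a comma, so the DN must carry it escaped (uid=foo\,ooN), and then checks that the Referential Integrity plugin does not report false failures for such entries. A sketch of adding one such entry with python-ldap, assuming an already-bound connection conn; the attribute values are illustrative.

import ldap.dn
import ldap.modlist

uid = 'foo,oo0'
# escape_dn_chars yields foo\,oo0 for safe use inside the DN
dn = 'uid=%s,dc=example,dc=com' % ldap.dn.escape_dn_chars(uid)
attrs = {
    'objectClass': [b'top', b'person', b'inetOrgPerson'],
    'uid': [uid.encode()],
    'cn': [b'foo oo0'],
    'sn': [b'oo0'],
}
conn.add_s(dn, ldap.modlist.addModlist(attrs))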
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_specific_time 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:60 Initializing root DN test suite... INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:70 test_rootdn_init: Initialized root DN test suite. INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:102 Running test_rootdn_access_specific_time...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_day_of_week 2.58
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:185 Running test_rootdn_access_day_of_week... INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:202 Today: Sat INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:203 Allowed days: Sat,Sun INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:204 Deny days: Mon, Tue
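The day-of-week case drives the Root DN Access Control plugin through its rootdn-days-allowed attribute: binds as the root DN succeed only on the listed days (here Sat,Sun). A sketch of that configuration with lib389, assuming a DirSrv handle inst; replace() and remove_all() are generic lib389 object methods, and the restart step is an assumption carried over from how plugin changes are normally applied.

from lib389.plugins import RootDNAccessControlPlugin

plugin = RootDNAccessControlPlugin(inst)
plugin.enable()
plugin.replace('rootdn-days-allowed', 'Sat,Sun')  # allow root DN binds on these days only
inst.restart()

# Between cases the suite does the equivalent of "Cleaning up the config area":
plugin.remove_all('rootdn-days-allowed')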
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_denied_ip 2.58
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:268 Running test_rootdn_access_denied_ip...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_denied_host 2.62
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:333 Running test_rootdn_access_denied_host...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_allowed_ip 2.58
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:400 Running test_rootdn_access_allowed_ip...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_allowed_host 2.63
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:466 Running test_rootdn_access_allowed_host...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_config_validate 0.03
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:575 Add just "rootdn-open-time"
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_denied_ip_wildcard 2.59
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:666 Running test_rootdn_access_denied_ip_wildcard...
Passed suites/plugins/rootdn_plugin_test.py::test_rootdn_access_allowed_ip_wildcard 3.12
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:36 Cleaning up the config area
-------------------------------Captured log call--------------------------------
INFO  tests.suites.plugins.rootdn_plugin_test:rootdn_plugin_test.py:712 Running test_rootdn_access_allowed_ip...
Passed suites/psearch/psearch_test.py::test_psearch 2.02
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:psearch_test.py:30 dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=people,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 ou=services,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 uid=demo_user,ou=people,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=demo_group,ou=groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=group_admin,ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=group_modify,ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=user_admin,ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=user_modify,ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=user_passwd_reset,ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:30 cn=user_private_read,ou=permissions,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:34 No more results INFO  lib389:psearch_test.py:30 cn=group1,ou=groups,dc=example,dc=com has changed! INFO  lib389:psearch_test.py:34 No more results
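test_psearch registers a persistent search against the suffix and logs a "has changed!" record for every entry notification, then "No more results" once the queue drains. A minimal sketch of the same pattern with python-ldap's PersistentSearchControl, assuming a bound connection conn; the 5-second timeout and the polling loop are illustrative simplifications.

import ldap
from ldap.controls.psearch import PersistentSearchControl

psc = PersistentSearchControl(criticality=True,
                              changeTypes=['add', 'delete', 'modify', 'modDN'],
                              changesOnly=False,
                              returnECs=True)
msg_id = conn.search_ext('dc=example,dc=com', ldap.SCOPE_SUBTREE,
                         '(objectClass=*)', serverctrls=[psc])
while True:
    try:
        # all=0 yields one message at a time; ldap.TIMEOUT means the queue is idle
        _, results, _, _ = conn.result3(msg_id, all=0, timeout=5)
        for dn, _attrs in results:
            print('%s has changed!' % dn)
    except ldap.TIMEOUT:
        print('No more results')
        break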
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[CRYPT] 0.55
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA] 0.28
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA256] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA256] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA384] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA384] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SHA512] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[SSHA512] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[MD5] 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_password_scheme[PBKDF2_SHA256] 0.07
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_clear_scheme 0.04
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_two_scheme 4.50
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_pbkdf2_sha256 4.58
No log output captured.
Passed suites/pwp_storage/storage_test.py::test_check_ssha512 6.03
No log output captured.
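Each test_check_password_scheme[...] case above follows the same shape: point passwordStorageScheme at one scheme, set a user's password, and confirm the stored value carries that scheme's prefix. A sketch of a single iteration with lib389, assuming a DirSrv handle inst bound as Directory Manager and an existing user object user; being able to read userPassword back at all is itself an assumption about the bind identity.

# Select the scheme, set a password, and check the stored prefix.
inst.config.replace('passwordStorageScheme', 'SSHA512')
user.replace('userPassword', 'Secret123')

stored = user.get_attr_val_utf8('userPassword')
assert stored.startswith('{SSHA512}')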
Passed suites/referint_plugin/rename_test.py::test_rename_large_subtree 177.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 714b6763-9df2-4013-9ed0-ab31504ba20b / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect a026fdc4-a5ef-426c-afac-deda499098ef / got description=714b6763-9df2-4013-9ed0-ab31504ba20b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b7c57fbe-b04a-4c28-9948-d8a0645f266a / got description=a026fdc4-a5ef-426c-afac-deda499098ef) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c70ec3e8-a1a1-4d22-8195-5760d1df4970 / got description=b7c57fbe-b04a-4c28-9948-d8a0645f266a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
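The rename_test setup above is the standard lib389 topology bootstrap: master1 is initialized as the first supplier, master2 is joined to it (which creates the agreements in both directions), and each agreement is then re-ensured idempotently, producing the "already exists" records. A sketch of those calls, assuming two installed DirSrv handles master1 and master2 with the replication suffix configured.

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master1)        # make master1 the initial supplier
repl.join_master(master1, master2)       # bootstrap master2, create both agreements
repl.ensure_agreement(master1, master2)  # idempotent; logs "already exists" if present
repl.ensure_agreement(master2, master1)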
Passed suites/replication/acceptance_test.py::test_add_entry 10.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master4 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f2d9fac0-fbc9-4dbc-838d-7a25a63c7018 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3032f58f-531c-4c61-8731-88d6b63114ce / got description=f2d9fac0-fbc9-4dbc-838d-7a25a63c7018) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 63db55a9-a390-481d-b16f-19d646b50ad4 / got description=3032f58f-531c-4c61-8731-88d6b63114ce) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b7ab53d2-a020-4905-81e4-8b780a6e793b / got description=63db55a9-a390-481d-b16f-19d646b50ad4) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b7ab53d2-a020-4905-81e4-8b780a6e793b / got description=63db55a9-a390-481d-b16f-19d646b50ad4) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:156 Joining master master4 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect d5f55fc1-6d89-4200-b0da-fca7d7927af5 / got description=b7ab53d2-a020-4905-81e4-8b780a6e793b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 74b7d594-98f9-4509-a907-5b89f4a4fa39 / got description=d5f55fc1-6d89-4200-b0da-fca7d7927af5) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master4 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com
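
The setup log above shows the four masters being installed, joined to master1 one at a time, and probed with the Retry/SUCCESS loop until each pair replicates. As a rough illustration, wiring one such pair with lib389 might look like the sketch below (m1 and m2 stand for two already-installed DirSrv instances; the function name is hypothetical):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

def wire_master_pair(m1, m2):
    # m1/m2: already-installed lib389 DirSrv instances
    repl = ReplicationManager(DEFAULT_SUFFIX)
    repl.create_first_master(m1)   # enable the replica and changelog on m1
    repl.join_master(m1, m2)       # bootstrap m2 and create agreements both ways
    repl.ensure_agreement(m1, m2)  # idempotent: the "already exists" path in the log
    repl.ensure_agreement(m2, m1)
    repl.test_replication(m1, m2)  # the Retry/SUCCESS probe seen above
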
Passed suites/replication/acceptance_test.py::test_modify_entry 3.07
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:45 Deleting entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:128 Modifying entry uid=mmrepl_test,dc=example,dc=com - add operation INFO  tests.suites.replication.acceptance_test:acceptance_test.py:138 Modifying entry uid=mmrepl_test,dc=example,dc=com - replace operation INFO  tests.suites.replication.acceptance_test:acceptance_test.py:146 Modifying entry uid=mmrepl_test,dc=example,dc=com - delete operation
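
The call log above exercises the three LDAP modify types in turn. A minimal sketch of the same sequence through lib389's idm layer, assuming the uid=mmrepl_test entry created by the fixture (the helper name is illustrative):

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts

def modify_entry(master1):
    # Fetch the uid=mmrepl_test entry added by the setup fixture
    user = UserAccounts(master1, DEFAULT_SUFFIX).get('mmrepl_test')
    user.add('description', 'add operation')          # MOD_ADD
    user.replace('description', 'replace operation')  # MOD_REPLACE
    user.remove('description', 'replace operation')   # MOD_DELETE of that value
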
Passed suites/replication/acceptance_test.py::test_delete_entry 10.02
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:45 Deleting entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:168 Deleting entry uid=mmrepl_test,dc=example,dc=com during the test
Passed suites/replication/acceptance_test.py::test_modrdn_entry[0] 20.06
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:192 Modify entry RDN uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:211 Remove entry with new RDN uid=newrdn,dc=example,dc=com
Passed suites/replication/acceptance_test.py::test_modrdn_entry[1] 20.07
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:192 Modify entry RDN uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:211 Remove entry with new RDN uid=newrdn,dc=example,dc=com
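
Both parametrized runs rename the entry and then remove it under its new RDN on another master, which also verifies that the MODRDN replicated. A hedged sketch (repl stands for a ReplicationManager on the suffix; names are illustrative):

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts

def modrdn_entry(master1, master2, repl):
    user = UserAccounts(master1, DEFAULT_SUFFIX).get('mmrepl_test')
    user.rename('uid=newrdn')                    # MODRDN on master1
    repl.wait_for_replication(master1, master2)  # let the rename propagate
    # The entry must now exist under the new RDN on the other master
    UserAccounts(master2, DEFAULT_SUFFIX).get('newrdn').delete()
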
Passed suites/replication/acceptance_test.py::test_modrdn_after_pause 14.57
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:237 Adding entry uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:252 Pause all replicas INFO  tests.suites.replication.acceptance_test:acceptance_test.py:255 Modify entry RDN uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:263 Resume all replicas INFO  tests.suites.replication.acceptance_test:acceptance_test.py:266 Wait for replication to happen INFO  tests.suites.replication.acceptance_test:acceptance_test.py:273 Remove entry with new RDN uid=newrdn,dc=example,dc=com
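
test_modrdn_after_pause performs the same rename while every agreement is suspended, then resumes them and waits for the change to flow. A sketch against the four-master fixture (topo_m4 and repl are the fixture objects; this is an outline, not the test's exact body):

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts

def modrdn_after_pause(topo_m4, repl):
    m1 = topo_m4.ms['master1']
    user = UserAccounts(m1, DEFAULT_SUFFIX).get('mmrepl_test')
    topo_m4.pause_all_replicas()   # suspend every replication agreement
    user.rename('uid=newrdn')      # the change only lands in m1's changelog
    topo_m4.resume_all_replicas()  # agreements restart and catch up
    repl.test_replication_topology(topo_m4.ms.values())
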
Passed suites/replication/acceptance_test.py::test_modify_stripattrs 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:295 Modify nsds5replicastripattrs with b'modifiersname modifytimestamp' INFO  tests.suites.replication.acceptance_test:acceptance_test.py:298 Check nsds5replicastripattrs for b'modifiersname modifytimestamp'
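
nsds5replicastripattrs is an attribute of the replication agreement that lists operational attributes to strip from replicated updates. Setting and reading it back could look like this (a sketch; the helper name is illustrative):

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import Replicas

def set_stripattrs(master1):
    replica = Replicas(master1).get(DEFAULT_SUFFIX)
    agmt = replica.get_agreements().list()[0]  # first agreement on master1
    attrs = 'modifiersname modifytimestamp'
    agmt.replace('nsds5replicastripattrs', attrs)
    assert agmt.get_attr_val_utf8('nsds5replicastripattrs') == attrs
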
Passed suites/replication/acceptance_test.py::test_new_suffix 11.04
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:62 Adding suffix:o=test_repl and backend: repl_base to master1 INFO  lib389:backend.py:80 List backend with suffix=o=test_repl INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=repl_base,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=repl_base,cn=ldbm database,cn=plugins,cn=config cn: repl_base nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-master1/db/repl_base nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=test_repl objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="o=test_repl",cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=o\3Dtest_repl,cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  tests.suites.replication.acceptance_test:acceptance_test.py:62 Adding suffix:o=test_repl and backend: repl_base to master2 INFO  lib389:backend.py:80 List backend with suffix=o=test_repl INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=repl_base,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=repl_base,cn=ldbm database,cn=plugins,cn=config cn: repl_base nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-master2/db/repl_base nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=test_repl objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="o=test_repl",cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=o\3Dtest_repl,cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  tests.suites.replication.acceptance_test:acceptance_test.py:62 Adding suffix:o=test_repl and backend: repl_base to master3 INFO  lib389:backend.py:80 List backend with suffix=o=test_repl INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=repl_base,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=repl_base,cn=ldbm database,cn=plugins,cn=config cn: repl_base nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-master3/db/repl_base nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=test_repl objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="o=test_repl",cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: 
cn=o\3Dtest_repl,cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  tests.suites.replication.acceptance_test:acceptance_test.py:62 Adding suffix:o=test_repl and backend: repl_base to master4 INFO  lib389:backend.py:80 List backend with suffix=o=test_repl INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=repl_base,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=repl_base,cn=ldbm database,cn=plugins,cn=config cn: repl_base nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-master4/db/repl_base nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=test_repl objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="o=test_repl",cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=o\3Dtest_repl,cn=mapping tree,cn=config cn: o=test_repl nsslapd-backend: repl_base nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4fcdb833-a197-4f8f-aec7-950579b72004 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7e332e6b-6108-4604-9682-c9933b4018e1 / got description=4fcdb833-a197-4f8f-aec7-950579b72004) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 40e2506a-660f-4e67-b5c3-a057e8c8a28a / got description=7e332e6b-6108-4604-9682-c9933b4018e1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect f4749723-3648-4945-a567-ef0c627697df / got description=40e2506a-660f-4e67-b5c3-a057e8c8a28a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
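
test_new_suffix creates a second backend (suffix o=test_repl, backend repl_base) on every master and then replicates that suffix independently of dc=example,dc=com, as the bootstrap/join lines above show. A condensed two-master sketch (names illustrative):

from lib389.backend import Backends
from lib389.replica import ReplicationManager

NEW_SUFFIX = 'o=test_repl'  # suffix and backend names taken from the log above

def replicate_new_suffix(m1, m2):
    for inst in (m1, m2):
        Backends(inst).create(properties={'cn': 'repl_base',
                                          'nsslapd-suffix': NEW_SUFFIX})
    repl = ReplicationManager(NEW_SUFFIX)  # one manager per replicated suffix
    repl.create_first_master(m1)
    repl.join_master(m1, m2)
    repl.test_replication(m1, m2)
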
Passed suites/replication/acceptance_test.py::test_many_attrs 20.23
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:353 Modifying entry uid=mmrepl_test,dc=example,dc=com - 10 add operations INFO  tests.suites.replication.acceptance_test:acceptance_test.py:357 Check that everything was properly replicated after an add operation INFO  tests.suites.replication.acceptance_test:acceptance_test.py:362 Modifying entry uid=mmrepl_test,dc=example,dc=com - 4 delete operations for [b'test0', b'test4', b'test7', b'test9'] INFO  tests.suites.replication.acceptance_test:acceptance_test.py:366 Check that everything was properly replicated after a delete operation
Passed suites/replication/acceptance_test.py::test_double_delete 10.03
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com INFO  tests.suites.replication.acceptance_test:acceptance_test.py:45 Deleting entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:386 Deleting entry uid=mmrepl_test,dc=example,dc=com from master1 INFO  tests.suites.replication.acceptance_test:acceptance_test.py:389 Deleting entry uid=mmrepl_test,dc=example,dc=com from master2 INFO  tests.suites.replication.acceptance_test:acceptance_test.py:393 Entry uid=mmrepl_test,dc=example,dc=com wasn't found on master2, as expected. INFO  tests.suites.replication.acceptance_test:acceptance_test.py:395 Make searches to check if the server is alive
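
The double-delete test removes the same entry on two masters in quick succession; the second delete can lose the race against the replicated delete, so NO_SUCH_OBJECT is tolerated, exactly as the log records. A sketch:

import ldap

from lib389._constants import DEFAULT_SUFFIX
from lib389.idm.user import UserAccounts

def double_delete(m1, m2):
    user1 = UserAccounts(m1, DEFAULT_SUFFIX).get('mmrepl_test')
    user2 = UserAccounts(m2, DEFAULT_SUFFIX).get('mmrepl_test')
    user1.delete()       # delete on master1 first
    try:
        user2.delete()   # may race the delete replicated from master1
    except ldap.NO_SUCH_OBJECT:
        pass             # already gone on master2: expected, per the log
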
Passed suites/replication/acceptance_test.py::test_password_repl_error 20.28
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:41 Adding entry uid=mmrepl_test,dc=example,dc=com
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:416 Clean the error log INFO  tests.suites.replication.acceptance_test:acceptance_test.py:419 Set replication loglevel INFO  tests.suites.replication.acceptance_test:acceptance_test.py:422 Modifying entry uid=mmrepl_test,dc=example,dc=com - change userpassword on master 2 INFO  tests.suites.replication.acceptance_test:acceptance_test.py:430 Restart the servers to flush the logs INFO  tests.suites.replication.acceptance_test:acceptance_test.py:439 Check the error log for the error with uid=mmrepl_test,dc=example,dc=com
Passed suites/replication/acceptance_test.py::test_invalid_agmt 0.06
-------------------------------Captured log call--------------------------------
CRITICAL lib389:agreement.py:1026 Failed to add replication agreement: {'msgtype': 105, 'msgid': 5, 'result': 53, 'desc': 'Server is unwilling to perform', 'ctrls': []} INFO  lib389:acceptance_test.py:464 Invalid repl agreement correctly rejected
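
result 53 in the CRITICAL line is LDAP unwillingToPerform: the test passes because the bogus agreement is refused rather than accepted. Generically, the assertion pattern is just this (ldap_write stands for any zero-argument callable performing the invalid operation; the name is hypothetical):

import ldap
import pytest

def assert_unwilling(ldap_write):
    # The server must refuse the operation with err=53;
    # ldap.UNWILLING_TO_PERFORM maps to LDAP result code 53.
    with pytest.raises(ldap.UNWILLING_TO_PERFORM):
        ldap_write()
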
Passed suites/replication/acceptance_test.py::test_warining_for_invalid_replica 0.27
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:494 Set nsds5ReplicaBackoffMin to 20 INFO  tests.suites.replication.acceptance_test:acceptance_test.py:497 Set nsds5ReplicaBackoffMax to 10 INFO  tests.suites.replication.acceptance_test:acceptance_test.py:499 Resetting configuration: nsds5ReplicaBackoffMin INFO  tests.suites.replication.acceptance_test:acceptance_test.py:501 Check the error log for the error
Passed suites/replication/acceptance_test.py::test_csngen_task 11.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9d15a510-bafb-4dbf-b9fa-793ec2c8e63b / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 5196b824-862d-4f1f-bded-ed8e48f6b263 / got description=9d15a510-bafb-4dbf-b9fa-793ec2c8e63b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:523 Check that the error log contains strings showing the csn generator was tested
Passed suites/replication/acceptance_test.py::test_csnpurge_large_valueset 6.44
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:574 Set nsds5ReplicaPurgeDelay to 5
Passed suites/replication/acceptance_test.py::test_urp_trigger_substring_search 608.64
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.acceptance_test:acceptance_test.py:604 Set nsslapd-plugin-logging to on INFO  tests.suites.replication.acceptance_test:acceptance_test.py:634 Entry not yet replicated on M2, wait a bit INFO  tests.suites.replication.acceptance_test:acceptance_test.py:638 Check that on M2, URP has not triggered such an internal search INFO  tests.suites.replication.acceptance_test:acceptance_test.py:641 found line: []
Passed suites/replication/cascading_test.py::test_basic_with_hub 15.46
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for hub1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:524 Creating replication topology. INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect ddfc1c20-a500-4dea-9593-94354c3e3a17 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is working INFO  lib389.replica:replica.py:2211 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect bd232761-5173-4703-aed4-84978945723e / got description=ddfc1c20-a500-4dea-9593-94354c3e3a17) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.cascading_test:cascading_test.py:45 update cn=101,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal INFO  tests.suites.replication.cascading_test:cascading_test.py:45 update cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config to add nsDS5ReplicatedAttributeListTotal INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 0d11eb0b-57e9-4784-8b2b-28590fefa96d / got description=bd232761-5173-4703-aed4-84978945723e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect a10b427d-9ff2-4c6c-b788-20ff2df47231 / got description=0d11eb0b-57e9-4784-8b2b-28590fefa96d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 7529e5b1-38bd-402c-8724-43c0dff389f7 / got description=a10b427d-9ff2-4c6c-b788-20ff2df47231) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working
Passed suites/replication/changelog_encryption_test.py::test_cl_encryption_setup_process 30.91
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 4382fc26-a2d1-400e-ae79-bfd2651be1c1 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_encryption_test:changelog_encryption_test.py:43 Enable TLS ... INFO  tests.suites.replication.changelog_encryption_test:changelog_encryption_test.py:48 Export changelog ... INFO  tests.suites.replication.changelog_encryption_test:changelog_encryption_test.py:55 Enable changelog encryption ... INFO  tests.suites.replication.changelog_encryption_test:changelog_encryption_test.py:68 Import changelog ... INFO  tests.suites.replication.changelog_encryption_test:changelog_encryption_test.py:73 Test replication is still working ...
Passed suites/replication/changelog_test.py::test_dsconf_dump_changelog_files_removed 6.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b3cd7d62-6658-4e98-b119-421c5caf6da4 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 25370f7a-9d70-489c-960d-d1a36336caad / got description=b3cd7d62-6658-4e98-b119-421c5caf6da4) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:222 Remove .ldif files, if present in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:222 Remove .done files, if present in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:52 Adding user replusr INFO  tests.suites.replication.changelog_test:changelog_test.py:65 Modify RDN of user uid=replusr,ou=People,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:72 Deleting user: uid=cl5usr,ou=people,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:386 Use dsconf export-changelog with invalid parameters INFO  tests.suites.replication.changelog_test:changelog_test.py:388 Command used : ['/usr/sbin/dsconf', 'ldap://LOCALHOST:39001', '-D', 'cn=Directory Manager', '-w', 'badpasswd', 'replication', 'export-changelog'] INFO  tests.suites.replication.changelog_test:changelog_test.py:391 output message : b'No action provided, here is some --help.\nusage: dsconf [-h] [-v] [-D BINDDN] [-w BINDPW] [-W] [-y PWDFILE] [-b BASEDN]\n [-Z] [-j]\n instance\n {backend,backup,chaining,config,directory_manager,monitor,plugin,pwpolicy,localpwp,replication,repl-agmt,repl-winsync-agmt,repl-tasks,sasl,security,schema,repl-conflict}\n ...\n\npositional arguments:\n instance The instance name OR the LDAP url to connect to, IE\n localhost, ldap://mai.example.com:389\n {backend,backup,chaining,config,directory_manager,monitor,plugin,pwpolicy,localpwp,replication,repl-agmt,repl-winsync-agmt,repl-tasks,sasl,security,schema,repl-conflict}\n resources to act upon\n backend Manage database suffixes and backends\n backup Manage online backups\n chaining Manage database chaining/database links\n config Manage server configuration\n directory_manager Manage the directory manager account\n monitor Monitor the state of the instance\n plugin Manage plugins available on the server\n pwpolicy Get and set the global password policy settings\n localpwp Manage local (user/subtree) password policies\n replication Configure replication for a suffix\n repl-agmt Manage replication agreements\n repl-winsync-agmt Manage Winsync Agreements\n repl-tasks Manage replication tasks\n sasl Query and manipulate SASL mappings\n security Query and manipulate security options\n schema Query and manipulate schema\n repl-conflict Manage replication conflicts\n\noptional arguments:\n -h, --help show this help message and exit\n -v, --verbose Display verbose operation tracing during command\n execution\n -D BINDDN, --binddn BINDDN\n The account to bind as for executing operations\n -w BINDPW, --bindpw BINDPW\n Password for binddn\n -W, --prompt Prompt for password for the bind DN\n -y PWDFILE, --pwdfile PWDFILE\n Specifies a file containing the password for the\n binddn\n -b BASEDN, --basedn BASEDN\n Basedn (root naming context) of the instance to manage\n -Z, --starttls Connect with StartTLS\n -j, --json Return result in JSON object\n' INFO  tests.suites.replication.changelog_test:changelog_test.py:396 Use dsconf replication changelog without -l option: no generated ldif files should be present in /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:399 Command used : ['/usr/sbin/dsconf', 'ldap://LOCALHOST:39001', '-D', 'cn=Directory Manager', '-w', 'password', 'replication', 'export-changelog', 'default', '-r', 'dc=example,dc=com'] INFO  
tests.suites.replication.changelog_test:changelog_test.py:404 Wait for all dsconf export-changelog files to be generated INFO  tests.suites.replication.changelog_test:changelog_test.py:407 Check if dsconf export-changelog generated .ldif.done files are present - should not INFO  tests.suites.replication.changelog_test:changelog_test.py:413 All dsconf export-changelog generated .ldif files have been successfully removed from /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:416 Use dsconf replication changelog with -l option: generated ldif files should be kept in /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:419 Command used : ['/usr/sbin/dsconf', 'ldap://LOCALHOST:39001', '-D', 'cn=Directory Manager', '-w', 'password', 'replication', 'export-changelog', 'to-ldif', '-o', '/var/lib/dirsrv/slapd-master1/ldif/test.ldif', '-r', 'dc=example,dc=com', '-l'] INFO  tests.suites.replication.changelog_test:changelog_test.py:424 Wait for all dsconf export-changelog files to be generated INFO  tests.suites.replication.changelog_test:changelog_test.py:427 Check if dsconf export-changelog generated .ldif.done files are present - should be INFO  tests.suites.replication.changelog_test:changelog_test.py:431 Success : ldif file /var/lib/dirsrv/slapd-master1/ldif/ed103604-1b2311eb-8910d416-4be7ff2c_cl.ldif.done is present
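
The two dsconf invocations recorded above differ only in the -l flag, which keeps the generated changelog LDIF under the instance's ldif directory instead of cleaning it up. Re-running the second form from Python, with the arguments copied from the log:

import subprocess

cmd = ['/usr/sbin/dsconf', 'ldap://LOCALHOST:39001',
       '-D', 'cn=Directory Manager', '-w', 'password',
       'replication', 'export-changelog', 'to-ldif',
       '-o', '/var/lib/dirsrv/slapd-master1/ldif/test.ldif',
       '-r', 'dc=example,dc=com', '-l']
subprocess.run(cmd, check=True)  # raises CalledProcessError on a non-zero exit
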
Passed suites/replication/changelog_test.py::test_verify_changelog 0.14
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:454 LDAP operations add, modify, modrdn and delete INFO  tests.suites.replication.changelog_test:changelog_test.py:52 Adding user replusr INFO  tests.suites.replication.changelog_test:changelog_test.py:65 Modify RDN of user uid=replusr,ou=People,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:72 Deleting user: uid=cl5usr,ou=people,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:79 Dump changelog using nsds5task and check if ldap operations are logged INFO  tests.suites.replication.changelog_test:changelog_test.py:88 Remove ldif files, if present in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:99 No existing changelog ldif files present INFO  tests.suites.replication.changelog_test:changelog_test.py:101 Running nsds5task to dump changelog database to a file INFO  tests.suites.replication.changelog_test:changelog_test.py:104 Check if changelog ldif file exists in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:108 Changelog ldif file exists: /var/lib/dirsrv/slapd-master1/ldif/ed103604-1b2311eb-8910d416-4be7ff2c_cl.ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:118 Checking changelog ldif file for ldap operations INFO  tests.suites.replication.changelog_test:changelog_test.py:123 Checking if all required changetype operations are present INFO  tests.suites.replication.changelog_test:changelog_test.py:129 Valid ldap operations: {'delete', 'add', 'modify', 'modrdn'} INFO  tests.suites.replication.changelog_test:changelog_test.py:130 Ldap operations found: {'modify', 'delete', 'add', 'modrdn'}
Passed suites/replication/changelog_test.py::test_verify_changelog_online_backup 6.21
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:481 Run db2bak script to take database backup INFO  lib389:tasks.py:619 Backup task backup_10302020_225134 completed successfully INFO  tests.suites.replication.changelog_test:changelog_test.py:493 Database backup was created successfully INFO  tests.suites.replication.changelog_test:changelog_test.py:498 Run bak2db to restore directory server INFO  lib389:tasks.py:673 Restore task restore_10302020_225136 completed successfully INFO  tests.suites.replication.changelog_test:changelog_test.py:505 LDAP operations add, modify, modrdn and delete INFO  tests.suites.replication.changelog_test:changelog_test.py:52 Adding user replusr INFO  tests.suites.replication.changelog_test:changelog_test.py:65 Modify RDN of user uid=replusr,ou=People,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:72 Deleting user: uid=cl5usr,ou=people,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:79 Dump changelog using nsds5task and check if ldap operations are logged INFO  tests.suites.replication.changelog_test:changelog_test.py:88 Remove ldif files, if present in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:97 Existing changelog ldif file: /var/lib/dirsrv/slapd-master1/ldif/ed103604-1b2311eb-8910d416-4be7ff2c_cl.ldif removed INFO  tests.suites.replication.changelog_test:changelog_test.py:99 No existing changelog ldif files present INFO  tests.suites.replication.changelog_test:changelog_test.py:101 Running nsds5task to dump changelog database to a file INFO  tests.suites.replication.changelog_test:changelog_test.py:104 Check if changelog ldif file exists in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:108 Changelog ldif file exists: /var/lib/dirsrv/slapd-master1/ldif/ed103604-1b2311eb-8910d416-4be7ff2c_cl.ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:118 Checking changelog ldif file for ldap operations INFO  tests.suites.replication.changelog_test:changelog_test.py:123 Checking if all required changetype operations are present INFO  tests.suites.replication.changelog_test:changelog_test.py:129 Valid ldap operations: {'delete', 'add', 'modify', 'modrdn'} INFO  tests.suites.replication.changelog_test:changelog_test.py:130 Ldap operations found: {'modify', 'delete', 'add', 'modrdn'}
Passed suites/replication/changelog_test.py::test_verify_changelog_offline_backup 6.33
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:534 Run db2bak to take database backup INFO  tests.suites.replication.changelog_test:changelog_test.py:541 Run bak2db to restore directory server INFO  tests.suites.replication.changelog_test:changelog_test.py:554 Database backup was created successfully INFO  tests.suites.replication.changelog_test:changelog_test.py:559 LDAP operations add, modify, modrdn and delete INFO  tests.suites.replication.changelog_test:changelog_test.py:52 Adding user replusr INFO  tests.suites.replication.changelog_test:changelog_test.py:65 Modify RDN of user uid=replusr,ou=People,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:72 Deleting user: uid=cl5usr,ou=people,dc=example,dc=com INFO  tests.suites.replication.changelog_test:changelog_test.py:79 Dump changelog using nsds5task and check if ldap operations are logged INFO  tests.suites.replication.changelog_test:changelog_test.py:88 Remove ldif files, if present in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:97 Existing changelog ldif file: /var/lib/dirsrv/slapd-master1/ldif/ed103604-1b2311eb-8910d416-4be7ff2c_cl.ldif removed INFO  tests.suites.replication.changelog_test:changelog_test.py:99 No existing changelog ldif files present INFO  tests.suites.replication.changelog_test:changelog_test.py:101 Running nsds5task to dump changelog database to a file INFO  tests.suites.replication.changelog_test:changelog_test.py:104 Check if changelog ldif file exists in: /var/lib/dirsrv/slapd-master1/ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:108 Changelog ldif file exists: /var/lib/dirsrv/slapd-master1/ldif/ed103604-1b2311eb-8910d416-4be7ff2c_cl.ldif INFO  tests.suites.replication.changelog_test:changelog_test.py:118 Checking changelog ldif file for ldap operations INFO  tests.suites.replication.changelog_test:changelog_test.py:123 Checking if all required changetype operations are present INFO  tests.suites.replication.changelog_test:changelog_test.py:129 Valid ldap operations: {'delete', 'add', 'modify', 'modrdn'} INFO  tests.suites.replication.changelog_test:changelog_test.py:130 Ldap operations found: {'modify', 'delete', 'add', 'modrdn'}
Passed suites/replication/changelog_test.py::test_changelog_maxage 1.10
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:144 Testing Ticket 47669 - Test duration syntax in the changelogs INFO  lib389:changelog_test.py:147 Bind as cn=Directory Manager
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:581 1. Test nsslapd-changelogmaxage in cn=changelog5,cn=config INFO  lib389:changelog_test.py:584 Bind as cn=Directory Manager INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 12345 -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 10s -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 30M -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 12h -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 2D -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 4w -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:181 Test nsslapd-changelogmaxage: -123 -- invalid ERROR  tests.suites.replication.changelog_test:changelog_test.py:186 Expectedly failed to add nsslapd-changelogmaxage: -123 to cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config: error Server is unwilling to perform INFO  tests.suites.replication.changelog_test:changelog_test.py:181 Test nsslapd-changelogmaxage: xyz -- invalid ERROR  tests.suites.replication.changelog_test:changelog_test.py:186 Expectedly failed to add nsslapd-changelogmaxage: xyz to cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config: error Server is unwilling to perform
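
The accepted duration syntax is a bare number of seconds or a number with a unit suffix (the values exercised above use s, M, h, D and w); anything else is refused with unwillingToPerform. A sketch of the same valid/invalid sweep, with the changelog DN copied from the log:

import ldap
import pytest
from lib389._mapped_object import DSLdapObject

CL_DN = 'cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config'

def sweep_maxage(master1):
    cl = DSLdapObject(master1, CL_DN)
    for value in ('12345', '10s', '30M', '12h', '2D', '4w'):  # valid
        cl.replace('nsslapd-changelogmaxage', value)
    for value in ('-123', 'xyz'):                             # invalid
        with pytest.raises(ldap.UNWILLING_TO_PERFORM):
            cl.replace('nsslapd-changelogmaxage', value)
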
Passed suites/replication/changelog_test.py::test_ticket47669_changelog_triminterval 0.83
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:614 2. Test nsslapd-changelogtrim-interval in cn=changelog5,cn=config INFO  lib389:changelog_test.py:617 Bind as cn=Directory Manager INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogtrim-interval: 12345 -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogtrim-interval: 10s -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogtrim-interval: 30M -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogtrim-interval: 12h -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogtrim-interval: 2D -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogtrim-interval: 4w -- valid INFO  tests.suites.replication.changelog_test:changelog_test.py:181 Test nsslapd-changelogtrim-interval: -123 -- invalid ERROR  tests.suites.replication.changelog_test:changelog_test.py:186 Expectedly failed to add nsslapd-changelogtrim-interval: -123 to cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config: error Server is unwilling to perform INFO  tests.suites.replication.changelog_test:changelog_test.py:181 Test nsslapd-changelogtrim-interval: xyz -- invalid ERROR  tests.suites.replication.changelog_test:changelog_test.py:186 Expectedly failed to add nsslapd-changelogtrim-interval: xyz to cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config: error Server is unwilling to perform
Passed suites/replication/changelog_test.py::test_retrochangelog_maxage 0.86
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:683 4. Test nsslapd-changelogmaxage in cn=Retro Changelog Plugin,cn=plugins,cn=config
INFO  lib389:changelog_test.py:686 Bind as cn=Directory Manager
INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 12345 -- valid
INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 10s -- valid
INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 30M -- valid
INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 12h -- valid
INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 2D -- valid
INFO  tests.suites.replication.changelog_test:changelog_test.py:174 Test nsslapd-changelogmaxage: 4w -- valid
INFO  tests.suites.replication.changelog_test:changelog_test.py:181 Test nsslapd-changelogmaxage: -123 -- invalid
INFO  tests.suites.replication.changelog_test:changelog_test.py:181 Test nsslapd-changelogmaxage: xyz -- invalid
INFO  lib389:changelog_test.py:698 ticket47669 was successfully verified.
Passed suites/replication/changelog_test.py::test_retrochangelog_trimming_crash 23.21
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_test:changelog_test.py:722 1. Test retroCL trimming crash in cn=Retro Changelog Plugin,cn=plugins,cn=config
INFO  lib389:changelog_test.py:726 ticket50736 start verification
INFO  lib389:changelog_test.py:742 ticket 50736 was successfully verified.
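For context, this crash test drives the Retro Changelog trimming thread. A minimal sketch of the configuration that arms that code path, assuming lib389's RetroChangelogPlugin wrapper and a DirSrv handle `inst` (the maxage value is illustrative):

from lib389.plugins import RetroChangelogPlugin

def arm_retrocl_trimming(inst):
    rcl = RetroChangelogPlugin(inst)
    rcl.enable()                                   # the plugin is disabled by default
    rcl.replace('nsslapd-changelogmaxage', '30d')  # give the trimming thread work to do
    inst.restart()                                 # plugin config is read at startup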
Passed suites/replication/changelog_trimming_test.py::test_max_age 14.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_trimming_test:changelog_trimming_test.py:90 Testing changelog trimming interval with max age...
Passed suites/replication/changelog_trimming_test.py::test_max_entries 11.73
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.changelog_trimming_test:changelog_trimming_test.py:133 Testing changelog trimming interval with max entries...
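Both trimming tests drive the same knobs on the changelog entry. A hedged sketch of setting them with plain python-ldap, reusing the backend changelog DN from the earlier log records (`inst` is an assumed bound DirSrv handle; the values are illustrative, not the suite's exact ones):

import ldap

CL_DN = 'cn=changelog,cn=userRoot,cn=ldbm database,cn=plugins,cn=config'

def set_trimming(inst):
    inst.modify_s(CL_DN, [
        (ldap.MOD_REPLACE, 'nsslapd-changelogmaxage', [b'30s']),       # test_max_age
        (ldap.MOD_REPLACE, 'nsslapd-changelogmaxentries', [b'10']),    # test_max_entries
        (ldap.MOD_REPLACE, 'nsslapd-changelogtrim-interval', [b'5']),  # how often trimming wakes up
    ])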
Passed suites/replication/cleanallruv_max_tasks_test.py::test_max_tasks 51.22
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master4 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c28d6632-86c0-4d54-b587-34855e958a00 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8fd1a02c-03eb-49fd-aeed-0bf9eadf82fb / got description=c28d6632-86c0-4d54-b587-34855e958a00) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 22ac6808-4910-4edf-8746-c1ae5796f76b / got description=8fd1a02c-03eb-49fd-aeed-0bf9eadf82fb) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect e29bc60e-fb90-42a6-a9a8-ea6e24eac7d4 / got description=22ac6808-4910-4edf-8746-c1ae5796f76b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:156 Joining master master4 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2b559737-5507-4470-8ce0-d8cb768f20a2 / got description=d89a2a43-02d0-44d4-a526-3514966c2e5d) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2b559737-5507-4470-8ce0-d8cb768f20a2 / got description=d89a2a43-02d0-44d4-a526-3514966c2e5d) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2b559737-5507-4470-8ce0-d8cb768f20a2 / got description=d89a2a43-02d0-44d4-a526-3514966c2e5d) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2b559737-5507-4470-8ce0-d8cb768f20a2 / got description=d89a2a43-02d0-44d4-a526-3514966c2e5d) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2b559737-5507-4470-8ce0-d8cb768f20a2 / got description=d89a2a43-02d0-44d4-a526-3514966c2e5d) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2b559737-5507-4470-8ce0-d8cb768f20a2 / got description=d89a2a43-02d0-44d4-a526-3514966c2e5d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master4 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created
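test_max_tasks queues cleanAllRUV tasks until the server's concurrent-task limit is reached. A sketch of launching one such task via lib389's task API (the topology handle and rid are illustrative; the property names are the ones the task accepts):

from lib389.tasks import CleanAllRUVTask

def launch_clean_task(master, rid):
    task = CleanAllRUVTask(master)
    task.create(properties={
        'replica-id': rid,                      # rid to purge from every RUV
        'replica-base-dn': 'dc=example,dc=com',
        'replica-force-cleaning': 'no',         # wait for all replicas to be reachable
    })
    return task  # caller may task.wait(), or keep queuing to hit the limit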
Passed suites/replication/cleanallruv_test.py::test_clean 2.93
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master4 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c5484f8c-4853-45cb-9d5d-84eb871ffbd8 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect c0439ba5-990b-42d2-b1e7-ae18b011378b / got description=c5484f8c-4853-45cb-9d5d-84eb871ffbd8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect d80e9ac0-5951-421e-bf49-acad5453e2a8 / got description=c0439ba5-990b-42d2-b1e7-ae18b011378b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8f6f1f20-5a25-41ed-bc26-65f7ca7a106b / got description=d80e9ac0-5951-421e-bf49-acad5453e2a8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:156 Joining master master4 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 018ffdbf-62f3-4c7a-9007-6feebba30080 / got description=8f6f1f20-5a25-41ed-bc26-65f7ca7a106b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3e6bef51-fc56-472f-99b0-7f6c8e0bbf36 / got description=018ffdbf-62f3-4c7a-9007-6feebba30080) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master4 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master4 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is was created DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 ----------------- INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 93a8500c-0cb4-4eae-af87-2c4c36663a58 / got description=3e6bef51-fc56-472f-99b0-7f6c8e0bbf36) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 49ba0cf5-ceef-456e-a7f4-339e42f40334 / got description=93a8500c-0cb4-4eae-af87-2c4c36663a58) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 6a68d8cc-4338-40a5-a8f7-b8d1860d6535 / got description=49ba0cf5-ceef-456e-a7f4-339e42f40334) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect f00058aa-656f-4cb3-abfe-658c6e14503d / got description=6a68d8cc-4338-40a5-a8f7-b8d1860d6535) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect e493d64d-5774-415b-80d8-a5f6f59e1350 / got description=f00058aa-656f-4cb3-abfe-658c6e14503d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 23ac3bbc-0d43-4a87-8375-2eed3cb77d1d / got description=e493d64d-5774-415b-80d8-a5f6f59e1350) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect bc29d08d-a181-41fb-bdd0-f74b94ec86f3 / got description=23ac3bbc-0d43-4a87-8375-2eed3cb77d1d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 07c7a6d7-7827-43ba-96b0-a15edaa65a27 / got description=bc29d08d-a181-41fb-bdd0-f74b94ec86f3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect dab714ea-be92-45d3-a091-86bd69765f4e / got description=07c7a6d7-7827-43ba-96b0-a15edaa65a27) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect dea963c6-32a3-45b2-bafd-cf8315c271ca / got description=dab714ea-be92-45d3-a091-86bd69765f4e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 598fd048-f206-4eb5-957d-e6133aa35897 / got description=dea963c6-32a3-45b2-bafd-cf8315c271ca) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 0b9d8930-e2c1-4a69-94a4-72dbb29d71bc / got description=598fd048-f206-4eb5-957d-e6133aa35897) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:181 -------------- FINISH RESET of m4 -----------------
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:204 Running test_clean...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:207 test_clean: disable master 4...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:71 test_clean: remove all the agreements to master 4...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:211 test_clean: run the cleanAllRUV task...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:221 test_clean: check all the masters have been cleaned...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:1 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:2 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:3 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:225 test_clean PASSED, restoring master 4...
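The "check all the masters have been cleaned" step amounts to reading each server's database RUV and confirming the purged rid is gone. An illustrative version using the well-known RUV tombstone search (`inst`, `suffix`, and `rid` are assumptions; this is not the suite's check_ruvs helper):

import ldap

RUV_FILTER = ('(&(nsuniqueid=ffffffff-ffffffff-ffffffff-ffffffff)'
              '(objectclass=nstombstone))')

def rid_present(inst, suffix, rid):
    # The database RUV lives on a special tombstone entry; each element
    # looks like "{replica <rid> ldap://host:port} <mincsn> <maxcsn>".
    entry = inst.search_s(suffix, ldap.SCOPE_SUBTREE, RUV_FILTER, ['nsds50ruv'])[0]
    elements = [v.decode() for v in entry[1].get('nsds50ruv', [])]
    return any('{replica %s ' % rid in e for e in elements)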
Passed suites/replication/cleanallruv_test.py::test_clean_restart 23.74
-------------------------------Captured log setup-------------------------------
DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:153 Wait a bit before the reset - it is required for the slow machines DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:155 -------------- BEGIN RESET of m4 ----------------- INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 91517cc5-1389-40b8-88af-eac4acee701e / got description=6d98fd23-b029-4cf7-8af3-09045d403f8e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect d7dfc440-5cba-400a-baa6-d62d20360992 / got description=91517cc5-1389-40b8-88af-eac4acee701e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect fc7f5abf-263d-477b-a236-df919dbbbd59 / got description=d7dfc440-5cba-400a-baa6-d62d20360992) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 372422fe-c134-4c3c-8705-175a01634b94 / got description=fc7f5abf-263d-477b-a236-df919dbbbd59) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect cb394ea4-8f02-4f24-ae77-f1b194014754 / got description=372422fe-c134-4c3c-8705-175a01634b94) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 02865025-4885-4426-b358-1e7e0e27a0da / got description=cb394ea4-8f02-4f24-ae77-f1b194014754) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 
bcf89d0c-87de-43fa-aa3c-c0ce8cbbcaf6 / got description=02865025-4885-4426-b358-1e7e0e27a0da) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 88c8ceb7-8d27-4975-9e03-d162125bb04a / got description=7122f6e1-0858-4ef6-8914-3a5ec860abf9) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6855ea6f-d6f4-44af-a217-b472a9d43cff / got description=88c8ceb7-8d27-4975-9e03-d162125bb04a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c76fa1b1-ad36-4f92-af89-07f2e5fcc370 / got description=6855ea6f-d6f4-44af-a217-b472a9d43cff) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 8590b280-2a37-4696-9200-2904fe082e4d / got description=c76fa1b1-ad36-4f92-af89-07f2e5fcc370) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working DEBUG  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:181 -------------- FINISH RESET of m4 -----------------
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:256 Running test_clean_restart...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:259 test_clean: disable master 4...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:71 test_clean: remove all the agreements to master 4...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:267 test_clean: run the cleanAllRUV task...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:292 test_clean_restart: check all the masters have been cleaned...
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:1 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:2 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:85 check_ruvs for replica dc=example,dc=com:3 (suffix:rid)
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:296 test_clean_restart PASSED, restoring master 4...
Passed suites/replication/cleanallruv_test.py::test_clean_shutdown_crash 35.47
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3855b3b8-7854-4592-a4ac-e4308ad03424 / got description=None) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3855b3b8-7854-4592-a4ac-e4308ad03424 / got description=None) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 3855b3b8-7854-4592-a4ac-e4308ad03424 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2ed7ed81-2b43-417f-a078-fc0eab9ca004 / got description=3855b3b8-7854-4592-a4ac-e4308ad03424) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:771 Enabling TLS
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:774 Creating replication dns
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:782 Changing auth type
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:803 Stopping master2
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:806 Run the cleanAllRUV task
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:817 Check if master1 crashed
INFO  tests.suites.replication.cleanallruv_test:cleanallruv_test.py:820 Repeat
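A hedged sketch of the preparation the log records before master2 is stopped: switch both instances to TLS and move the agreements to certificate authentication. The enable_tls() helper and the agreement attribute names are assumptions about lib389/389-ds usage, not the test's exact code:

from lib389.agreement import Agreements

def switch_to_client_auth(master1, master2):
    for inst in (master1, master2):
        inst.enable_tls()  # assumed lib389 helper: set up certs and the secure port
    for agmt in Agreements(master1).list():
        agmt.replace_many(
            ('nsDS5ReplicaTransportInfo', 'SSL'),         # replicate over LDAPS
            ('nsDS5ReplicaBindMethod', 'SSLCLIENTAUTH'),  # certificate-based auth
        )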
Passed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_add_modrdn 19.88
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 192703b8-f2ba-4abf-814f-b6dcf5517c9a / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0b174f97-6c93-4384-b8d8-28943aedeef4 / got description=192703b8-f2ba-4abf-814f-b6dcf5517c9a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 73ecc419-de4a-4409-9d98-6af38745d511 / got description=0b174f97-6c93-4384-b8d8-28943aedeef4) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 73ecc419-de4a-4409-9d98-6af38745d511 / got description=0b174f97-6c93-4384-b8d8-28943aedeef4) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 73ecc419-de4a-4409-9d98-6af38745d511 / got description=0b174f97-6c93-4384-b8d8-28943aedeef4) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 73ecc419-de4a-4409-9d98-6af38745d511 / got description=0b174f97-6c93-4384-b8d8-28943aedeef4) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:210 Test create - modrdn INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 27a05278-6439-4f04-88c2-14280765f463 / got description=73ecc419-de4a-4409-9d98-6af38745d511) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 27a05278-6439-4f04-88c2-14280765f463 / got description=73ecc419-de4a-4409-9d98-6af38745d511) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 27a05278-6439-4f04-88c2-14280765f463 / got description=73ecc419-de4a-4409-9d98-6af38745d511) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d01c440b-0b62-4d1c-96b4-cfee0878540b / got description=27a05278-6439-4f04-88c2-14280765f463) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
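The modrdn conflicts themselves are provoked by making competing changes while the masters cannot replicate, then letting them converge; afterwards the test looks for the losing entries. A sketch of the conflict query, using the standard 389-ds conflict marking (`inst` and `suffix` are assumptions):

import ldap

def find_conflicts(inst, suffix):
    # Losing entries are turned into ldapsubentry objects carrying the
    # nsds5ReplConflict attribute that names the surviving entry.
    return inst.search_s(
        suffix, ldap.SCOPE_SUBTREE,
        '(&(objectclass=ldapsubentry)(nsds5ReplConflict=*))',
        ['nsds5ReplConflict'])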
Passed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_conflict_attribute_multi_valued 11.18
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:833 Check foo1 is on M1
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:833 Check foo2 is on M1
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:839 Check foo1 is on M1
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:839 Check foo2 is on M1
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:844 Check M1.uid foo1 is also on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:844 Check M1.uid foo2 is also on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:848 Check M2.uid foo1 is also on M1
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:848 Check M2.uid foo2 is also on M1
Passed suites/replication/conflict_resolve_test.py::TestTwoMasters::test_conflict_attribute_single_valued 11.41
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:921 Check foo1 is on M1
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:927 Check foo1 is on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:932 Check M1.uid foo1 is also on M2
INFO  tests.suites.replication.conflict_resolve_test:conflict_resolve_test.py:936 Check M2.uid foo1 is also on M1
Passed suites/replication/encryption_cl5_test.py::test_algorithm_unhashed 48.83
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5c47b275-f6ee-44f8-b9bd-00333fc6871d / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2a052333-a22e-4a32-8780-68891247e10d / got description=5c47b275-f6ee-44f8-b9bd-00333fc6871d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect a705f808-154b-4a43-82f4-1321d87320fe / got description=2a052333-a22e-4a32-8780-68891247e10d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:47 Configuring changelog encryption:master1 for: AES
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master2/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master2/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master2/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db']
INFO  tests.suites.replication.encryption_cl5_test:encryption_cl5_test.py:64 Running dbscan -f to check unhashed#user#password attr
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-master2/db/userRoot/replication_changelog.db']
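The repeated dbscan runs above dump each master's changelog database and look for the cleartext-password marker; with changelog encryption enabled the value must not be readable. A minimal sketch of that check (the dbscan path and db file match the log; the plain substring test is an illustration, whereas the real test inspects the dump per record):

import subprocess

def changelog_contains(dbfile, needle='unhashed#user#password'):
    # Dump the raw changelog database and search the output.
    out = subprocess.run(['/usr/bin/dbscan', '-f', dbfile],
                         capture_output=True, text=True).stdout
    return needle in out

# e.g. changelog_contains('/var/lib/dirsrv/slapd-master1/db/userRoot/replication_changelog.db')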
Passed suites/replication/multiple_changelogs_test.py::test_multiple_changelogs 12.76
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect bf43b98b-fa1d-4a1b-8327-3b7289a3a593 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 00c8d033-e7c1-4abb-b739-d515eb18a108 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201
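The bootstrap/agreement/retry sequence in these logs is what lib389's ReplicationManager emits when a consumer is joined to a supplier. A minimal sketch of the calls that produce it, assuming master and consumer are already-installed DirSrv instances:

from lib389._constants import DEFAULT_SUFFIX
from lib389.replica import ReplicationManager

repl = ReplicationManager(DEFAULT_SUFFIX)
repl.create_first_master(master)             # enable replication on the supplier
repl.join_consumer(master, consumer)         # bootstrap, create the agreement, total init
repl.wait_for_replication(master, consumer)  # produces the Retry/SUCCESS lines above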
Passed suites/replication/multiple_changelogs_test.py::test_multiple_changelogs_export_import 10.86
No log output captured.
Passed suites/replication/regression_test.py::test_special_symbol_replica_agreement 0.67
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=dc=example,dc=com INFO  lib389:__init__.py:1713 Found entry dn: cn=replication manager,cn=config cn: bind dn pseudo user cn: replication manager objectClass: top objectClass: person sn: bind dn pseudo user userPassword: {PBKDF2_SHA256}AAAIAO0KIBBqR3878oZo0UWHvaGwd2GfXb6TgIwNqMH2EGkEFnUrUpy9ry1IPQcU1QwBbjWIcLDJ5rtHTWWpgcdn2YbJWk8GUgbFgOJGH2Z/SsHA3+cT9YbbSJscXobu8gkmrTsAP828xnDcOsE9K9YF98r4WKw7m9YxKQ4QXG4WcrtUh1nfoaxCkiBndVYdBKk5s4VkAap0vk6euaw5gGWw3zVB0mhS1s/lRHgCmm8AKky3ODywzpoLKZapgu8bdZoVQkv/1WOTAQcpZ38ecqy+gNlkS3NTbxsDf5+X5zzyZQCCQR6dBpPR7osY0UHZGwVZPb4UAhZiqc48sz0SEh/6dgTS8ZLzJEwevdKXgR/nhkYzfVAM9TBzKthIU15l86sSIVppIvyTJmGxeC3ojBizUdlNDZSF+XkkTCf2jnKlNS67 INFO  lib389:agreement.py:1169 Starting total init cn=-3meTo_ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38902,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
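The "Starting total init" line is a reinit of the replica agreement (here one whose cn deliberately starts with a special symbol). A hedged sketch of driving a total init by hand with lib389, assuming standalone1 holds exactly one agreement for the suffix:

from lib389.agreement import Agreements

agmt = Agreements(standalone1).list()[0]  # the agreement logged above
agmt.begin_reinit()                       # start the total update
done, error = agmt.wait_reinit()          # poll until the init finishes
assert done and not error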
Passed suites/replication/regression_test.py::test_double_delete 2.36
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bc257d9f-c09a-4bde-bec2-44c13b27ae88 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 9ae971ba-9b78-4cc1-bd8b-96923f162cc8 / got description=bc257d9f-c09a-4bde-bec2-44c13b27ae88) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  tests.suites.replication.regression_test:regression_test.py:141 Adding a test entry user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.regression_test:regression_test.py:268 Deleting entry uid=testuser,ou=People,dc=example,dc=com from master1 INFO  tests.suites.replication.regression_test:regression_test.py:271 Deleting entry uid=testuser,ou=People,dc=example,dc=com from master2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 1b8b9df7-d33c-427d-88aa-61c1bbafc023 / got description=9ae971ba-9b78-4cc1-bd8b-96923f162cc8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2c65ff04-cb35-40ba-9808-f6dc3d30ac05 / got description=1b8b9df7-d33c-427d-88aa-61c1bbafc023) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/replication/regression_test.py::test_repl_modrdn 6.53
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.regression_test:regression_test.py:314 Add test entries - Add 3 OUs and 2 same users under 2 different OUs INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bfca675a-e2cb-434b-9124-4f3d01434553 / got description=2c65ff04-cb35-40ba-9808-f6dc3d30ac05) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 55f45bc1-2dd8-4718-8bbc-51142f63299c / got description=bfca675a-e2cb-434b-9124-4f3d01434553) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  tests.suites.replication.regression_test:regression_test.py:338 Stop Replication INFO  tests.suites.replication.regression_test:regression_test.py:341 Apply modrdn to M1 - move test user from OU A -> C INFO  tests.suites.replication.regression_test:regression_test.py:344 Apply modrdn on M2 - move test user from OU B -> C INFO  tests.suites.replication.regression_test:regression_test.py:347 Start Replication INFO  tests.suites.replication.regression_test:regression_test.py:350 Wait for some time for repl to resume INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 9840132f-5f01-4726-ba20-3371d0f8381d / got description=55f45bc1-2dd8-4718-8bbc-51142f63299c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8b721515-8db7-45fc-bffa-0a24497b1cb5 / got description=9840132f-5f01-4726-ba20-3371d0f8381d) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  tests.suites.replication.regression_test:regression_test.py:354 Check that there is only one test entry under ou=C on both masters INFO  tests.suites.replication.regression_test:regression_test.py:361 Check that the replication is working fine both ways, M1 <-> M2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect ec80f65a-6123-4511-a229-89db692b7a62 / got description=8b721515-8db7-45fc-bffa-0a24497b1cb5) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect f101313e-7750-4306-921a-4508e55b2724 / got description=ec80f65a-6123-4511-a229-89db692b7a62) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
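The modrdn test's sequence (stop replication, apply a conflicting modrdn on each master, resume, verify convergence) maps onto a small amount of lib389 code. A hedged sketch, assuming a two-master fixture topo_m2, user objects fetched from each master, and ou_c_dn naming the target OU (all placeholders here):

topo_m2.pause_all_replicas()                             # 'Stop Replication'
user_on_m1.rename('uid=testuser', newsuperior=ou_c_dn)   # M1: OU A -> C
user_on_m2.rename('uid=testuser', newsuperior=ou_c_dn)   # M2: OU B -> C
topo_m2.resume_all_replicas()                            # 'Start Replication'
# After resync, both masters should hold a single test entry under ou=C.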
Passed suites/replication/regression_test.py::test_password_repl_error 13.62
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.regression_test:regression_test.py:141 Adding a test entry user
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.regression_test:regression_test.py:385 Clean the error log INFO  tests.suites.replication.regression_test:regression_test.py:388 Set replication loglevel INFO  tests.suites.replication.regression_test:regression_test.py:391 Modifying entry uid=testuser,ou=People,dc=example,dc=com - change userpassword on master 1 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 738086fa-ae97-446a-8f02-864cd3016d19 / got description=f101313e-7750-4306-921a-4508e55b2724) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  tests.suites.replication.regression_test:regression_test.py:398 Restart the servers to flush the logs INFO  tests.suites.replication.regression_test:regression_test.py:403 Check that password works on master 2 INFO  tests.suites.replication.regression_test:regression_test.py:407 Check the error log for the error with uid=testuser,ou=People,dc=example,dc=com INFO  tests.suites.replication.regression_test:regression_test.py:410 Set the default loglevel
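The password test raises the error-log verbosity, restarts the servers to flush buffered log lines, and then greps the error log. A minimal sketch of that pattern, assuming m2 is the second master's DirSrv handle; 8192 is the replication debug log level, and the regex is illustrative only:

m2.config.replace('nsslapd-errorlog-level', '8192')  # replication debugging
# ... change userPassword on master 1 and let it replicate ...
m2.restart()                                         # flush buffered log lines
assert not m2.ds_error_log.match('.*agmt.*error.*')  # no replication errors logged
m2.config.replace('nsslapd-errorlog-level', '0')     # restore the default level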
Passed suites/replication/regression_test.py::test_invalid_agmt 2.30
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8745b4f7-1720-4808-a3c3-75c41d7676af / got description=738086fa-ae97-446a-8f02-864cd3016d19) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7ce716f5-6ff2-4246-a933-7c5df7c03a17 / got description=8745b4f7-1720-4808-a3c3-75c41d7676af) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/replication/regression_test.py::test_fetch_bindDnGroup 22.68
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d54067ac-8350-4168-82fe-d769d48ef8dd / got description=7ce716f5-6ff2-4246-a933-7c5df7c03a17) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/replication/regression_test.py::test_plugin_bind_dn_tracking_and_replication 1.02
No log output captured.
Passed suites/replication/regression_test.py::test_cleanallruv_repl 119.37
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 7fff5bfb-27e0-460c-a9b2-59be6e4e67e1 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect b6738026-d1a3-43e5-806d-d3f1a27558b3 / got description=7fff5bfb-27e0-460c-a9b2-59be6e4e67e1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 67a4cc1f-b328-43f5-94dc-77f1ccd0dce3 / got description=b6738026-d1a3-43e5-806d-d3f1a27558b3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0cd3242c-51d6-45b1-9cb6-723c560912df / got description=67a4cc1f-b328-43f5-94dc-77f1ccd0dce3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.regression_test:regression_test.py:680 Change the error log levels for all masters INFO  tests.suites.replication.regression_test:regression_test.py:684 Get the replication agreements for all 3 masters INFO  tests.suites.replication.regression_test:regression_test.py:689 Modify nsslapd-changelogmaxage=30 and nsslapd-changelogtrim-interval=5 for M1 and M2 INFO  tests.suites.replication.regression_test:regression_test.py:712 Add test users to 3 masters INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:tasks.py:1400 cleanAllRUV task (task-10302020_231716) completed successfully INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
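The trimming step logged above ("Modify nsslapd-changelogmaxage=30 and nsslapd-changelogtrim-interval=5") can be expressed against the changelog configuration entry. A hedged sketch assuming the classic cn=changelog5,cn=config layout that lib389's Changelog5 wraps; note that 389-ds-base 2.0 moves the changelog into each backend, so the entry's DN differs there:

from lib389.replica import Changelog5

cl = Changelog5(master1)
cl.replace_many(('nsslapd-changelogmaxage', '30'),       # trim changes older than 30s
                ('nsslapd-changelogtrim-interval', '5')) # run trimming every 5s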
Passed suites/replication/regression_test.py::test_online_reinit_may_hang 18.08
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 7fc4c198-4a4c-40cc-b2f3-cb0218992ef3 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 79c0bee8-5ebd-4383-a254-671732266771 / got description=7fc4c198-4a4c-40cc-b2f3-cb0218992ef3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-master1/ldif/master1.ldif
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b12af79c-4ab2-43c1-a5bd-d6d96edbd82f / got description=79c0bee8-5ebd-4383-a254-671732266771) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 170930ac-f0be-49eb-a4ce-c91866af1d52 / got description=b12af79c-4ab2-43c1-a5bd-d6d96edbd82f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
Passed suites/replication/regression_test.py::test_moving_entry_make_online_init_fail 1.59
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 726a05b8-16ef-48a1-b62d-377da942ab0a / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 23210f85-ae0b-4754-bb62-d4fd837c7e2e / got description=726a05b8-16ef-48a1-b62d-377da942ab0a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.regression_test:regression_test.py:862 Generating DIT_0 INFO  tests.suites.replication.regression_test:regression_test.py:865 Created entry: ou=OU0, dc=example, dc=com INFO  tests.suites.replication.regression_test:regression_test.py:870 Created entry: ou=OU0, ou=OU0, dc=example, dc=com INFO  tests.suites.replication.regression_test:regression_test.py:191 Create password policy for subtree ou=OU0,dc=example,dc=com INFO  tests.suites.replication.regression_test:regression_test.py:879 Turning tuser0 into a tombstone entry INFO  tests.suites.replication.regression_test:regression_test.py:879 Turning tuser2 into a tombstone entry INFO  tests.suites.replication.regression_test:regression_test.py:879 Turning tuser4 into a tombstone entry INFO  tests.suites.replication.regression_test:regression_test.py:879 Turning tuser6 into a tombstone entry INFO  tests.suites.replication.regression_test:regression_test.py:879 Turning tuser8 into a tombstone entry INFO  tests.suites.replication.regression_test:regression_test.py:882 dc=example,dc=com => ou=OU0,dc=example,dc=com => ou=OU0,ou=OU0,dc=example,dc=com => 10 USERS INFO  tests.suites.replication.regression_test:regression_test.py:884 Generating DIT_1 INFO  tests.suites.replication.regression_test:regression_test.py:887 Created entry: ou=OU1,dc=example,dc=com INFO  tests.suites.replication.regression_test:regression_test.py:891 Created entry: ou=OU1, ou=OU1, dc=example, dc=com INFO  tests.suites.replication.regression_test:regression_test.py:191 Create password policy for subtree ou=OU1,dc=example,dc=com INFO  tests.suites.replication.regression_test:regression_test.py:895 Moving ou=OU0,ou=OU0,dc=example,dc=com to DIT_1 INFO  tests.suites.replication.regression_test:regression_test.py:898 Moving ou=OU0,dc=example,dc=com to DIT_1 INFO  tests.suites.replication.regression_test:regression_test.py:905 Moving USERS to ou=OU0,ou=OU0,ou=OU1,dc=example,dc=com INFO  tests.suites.replication.regression_test:regression_test.py:910 dc=example,dc=com => ou=OU1,dc=example,dc=com => ou=OU0,ou=OU1,dc=example,dc=com => ou=OU0,ou=OU0,ou=OU1,dc=example,dc=com => 10 USERS INFO  tests.suites.replication.regression_test:regression_test.py:912 Run Initialization. INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a4ec3557-4213-467b-b901-874b8475aedc / got description=23210f85-ae0b-4754-bb62-d4fd837c7e2e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  tests.suites.replication.regression_test:regression_test.py:921 m1entry count - 46 INFO  tests.suites.replication.regression_test:regression_test.py:922 m2entry count - 46
Passed suites/replication/repl_agmt_bootstrap_test.py::test_repl_agmt_bootstrap_credentials 15.39
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c4c46e9f-f648-4283-bf7b-960f379db212 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect bb993c23-1406-4f98-8368-0dd614e6ce39 / got description=c4c46e9f-f648-4283-bf7b-960f379db212) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
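This test exercises the agreement's fallback bind settings: when the normal bind fails (for example, because the bind-DN group has not replicated yet), the supplier retries with the bootstrap identity. A hedged sketch of setting those attributes on an existing lib389 agreement object; BOOTSTRAP_DN and BOOTSTRAP_PW are placeholders, not values from this run:

agmt.replace_many(
    ('nsds5ReplicaBootstrapBindDN', BOOTSTRAP_DN),      # fallback identity
    ('nsds5ReplicaBootstrapCredentials', BOOTSTRAP_PW),
    ('nsds5ReplicaBootstrapBindMethod', 'SIMPLE'),
    ('nsds5ReplicaBootstrapTransportInfo', 'LDAP'))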
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5ReplicaType--1-4-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.92
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5Flags--1-2-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.18
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5ReplicaId-0-65536-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.18
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaPurgeDelay--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.22
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsDS5ReplicaBindDnGroupCheckInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.14
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaTombstonePurgeInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.16
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.16
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaReleaseTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.16
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaBackoffMin-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-3] 0.13
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_add[nsds5ReplicaBackoffMax-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.19
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsDS5Flags--1-2-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.17
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaPurgeDelay--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.10
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsDS5ReplicaBindDnGroupCheckInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.12
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaTombstonePurgeInterval--2-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.11
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaProtocolTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.14
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaReleaseTimeout--1-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-1] 0.14
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaBackoffMin-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-3] 0.11
No log output captured.
Passed suites/replication/replica_config_test.py::test_replica_num_modify[nsds5ReplicaBackoffMax-0-9223372036854775807-9999999999999999999999999999999999999999999999999999999999999999999-invalid-6] 0.11
No log output captured.
Passed suites/replication/replica_config_test.py::test_same_attr_yields_same_return_code 0.22
No log output captured.
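Each of the parametrized replica_config cases above feeds one attribute a value below its minimum, a value above its maximum, a 67-digit overflow, and a non-numeric string, and expects the server to reject ADD and MODIFY the same way. A compressed sketch of the pattern, assuming replica is a lib389 Replica object; the nsDS5ReplicaId bounds 1-65535 come straight from the test IDs:

import ldap
import pytest

overflow = '9' * 67  # the long value embedded in the test IDs
for bad in ('0', '65536', overflow, 'invalid'):
    with pytest.raises(ldap.LDAPError):
        replica.replace('nsDS5ReplicaId', bad)  # must be rejected consistently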
Passed suites/replication/ruvstore_test.py::test_ruv_entry_backup 8.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c6096161-7689-4b53-9850-4efdf59bbfc3 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 5a7e0fa8-47bc-4b97-9b66-dc8b973a6a8a / got description=c6096161-7689-4b53-9850-4efdf59bbfc3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-master1/ldif/master1.ldif
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:102 LDAP operations add, modify, modrdn and delete INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:58 Adding user to master1 INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:61 Modify RDN of user: uid=rep2lusr,ou=People,dc=example,dc=com INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:68 Deleting user: uid=ruvusr,ou=people,dc=example,dc=com INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:106 Stopping the server instance to run db2ldif task to create backup file INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:110 Starting the server after backup INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:113 Checking if backup file contains RUV and required attributes INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:51 Attribute found in RUV: objectClass INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:51 Attribute found in RUV: nsUniqueId INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:51 Attribute found in RUV: nsds50ruv INFO  tests.suites.replication.ruvstore_test:ruvstore_test.py:51 Attribute found in RUV: nsruvReplicaLastModified
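The backup step in this log is an offline db2ldif with replication data, which is what keeps the RUV tombstone (nsds50ruv, nsruvReplicaLastModified) in the exported LDIF. A minimal sketch of that export, assuming DEFAULT_SUFFIX is imported from lib389._constants and the output path is illustrative:

master1.stop()                                   # db2ldif runs offline here
master1.db2ldif(bename='userRoot', suffixes=[DEFAULT_SUFFIX],
                excludeSuffixes=None, encrypt=False,
                repl_data=True,                  # keep the RUV in the export
                outputfile='/tmp/master1.ldif')
master1.start()
# The exported LDIF should now contain nsds50ruv and nsruvReplicaLastModified.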
Passed suites/replication/series_of_repl_bugs_test.py::test_deletions_are_not_replicated 10.91
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 78d40432-bfab-4292-8770-bbc3dbf93791 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8ec0b327-e3ed-4f75-a850-d067b6d29605 / got description=78d40432-bfab-4292-8770-bbc3dbf93791) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b790a2db-dd4c-4604-af78-509dea4f4b3a / got description=8ec0b327-e3ed-4f75-a850-d067b6d29605) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d1dc3c6a-3225-45a4-a9d4-255a35c31683 / got description=b790a2db-dd4c-4604-af78-509dea4f4b3a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/replication/series_of_repl_bugs_test.py::test_error_20 1.09
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5e2ed1c1-f20e-4993-9869-f58a11f7137b / got description=d1dc3c6a-3225-45a4-a9d4-255a35c31683) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/replication/series_of_repl_bugs_test.py::test_segfaults 0.06
No log output captured.
Passed suites/replication/series_of_repl_bugs_test.py::test_adding_deleting 0.11
No log output captured.
Passed suites/replication/series_of_repl_bugs_test.py::test_deleting_twice 2.13
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect eca2a710-e332-40f1-9709-7087dd99ad57 / got description=5e2ed1c1-f20e-4993-9869-f58a11f7137b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect dd4dde0c-690d-4832-8d3d-7f23fd769046 / got description=eca2a710-e332-40f1-9709-7087dd99ad57) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/replication/series_of_repl_bugs_test.py::test_rename_entry 2.32
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 411fa545-e372-4d93-996d-5520eeef11f9 / got description=dd4dde0c-690d-4832-8d3d-7f23fd769046) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 59b7ee34-cd56-4b83-8ad7-e715fc554a4a / got description=411fa545-e372-4d93-996d-5520eeef11f9) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/replication/series_of_repl_bugs_test.py::test_userpassword_attribute 3.11
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8f2af0f5-33bc-457a-b9ee-49ff2848e8cd / got description=59b7ee34-cd56-4b83-8ad7-e715fc554a4a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c7ff7576-eff0-4ed3-9bc1-4f6a20d5d08e / got description=8f2af0f5-33bc-457a-b9ee-49ff2848e8cd) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed suites/replication/series_of_repl_bugs_test.py::test_tombstone_modrdn 0.16
No log output captured.
Passed suites/replication/single_master_test.py::test_mail_attr_repl 14.77
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect ef48e66f-5d82-4958-aab8-84ce23eaa5f5 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.single_master_test:single_master_test.py:67 Check that replication is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect c5e5252f-0f1b-42cf-b472-4b86c5760d72 / got description=ef48e66f-5d82-4958-aab8-84ce23eaa5f5) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  tests.suites.replication.single_master_test:single_master_test.py:83 Back up /var/lib/dirsrv/slapd-consumer1/db/userRoot/mail.db to /tmp/mail.db INFO  tests.suites.replication.single_master_test:single_master_test.py:87 Remove 'mail' attr from master INFO  tests.suites.replication.single_master_test:single_master_test.py:90 Wait for the replication to happen INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6da78659-349b-4253-86c4-e3972d9dc93c / got description=c5e5252f-0f1b-42cf-b472-4b86c5760d72) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6da78659-349b-4253-86c4-e3972d9dc93c / got description=c5e5252f-0f1b-42cf-b472-4b86c5760d72) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6da78659-349b-4253-86c4-e3972d9dc93c / got description=c5e5252f-0f1b-42cf-b472-4b86c5760d72) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 6da78659-349b-4253-86c4-e3972d9dc93c / got description=c5e5252f-0f1b-42cf-b472-4b86c5760d72) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  tests.suites.replication.single_master_test:single_master_test.py:94 Restore /tmp/mail.db to /var/lib/dirsrv/slapd-consumer1/db/userRoot/mail.db INFO  tests.suites.replication.single_master_test:single_master_test.py:98 Make a search for mail attribute in attempt to crash server INFO  tests.suites.replication.single_master_test:single_master_test.py:101 Make sure that server hasn't crashed INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 4d4c2006-9e42-4573-8cb6-c2c1c98a2580 / got description=6da78659-349b-4253-86c4-e3972d9dc93c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 4d4c2006-9e42-4573-8cb6-c2c1c98a2580 / got description=6da78659-349b-4253-86c4-e3972d9dc93c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 4d4c2006-9e42-4573-8cb6-c2c1c98a2580 / got description=6da78659-349b-4253-86c4-e3972d9dc93c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 4d4c2006-9e42-4573-8cb6-c2c1c98a2580 / got description=6da78659-349b-4253-86c4-e3972d9dc93c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working
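The backup/restore dance in the log above can be sketched as follows; the db path is taken from the log, while user, master, and consumer are assumed fixtures:

import shutil
import ldap

MAIL_DB = '/var/lib/dirsrv/slapd-consumer1/db/userRoot/mail.db'

shutil.copy(MAIL_DB, '/tmp/mail.db')        # back up the consumer's mail index
user.remove_all('mail')                     # drop the attribute on the master
repl.wait_for_replication(master, consumer)
shutil.copy('/tmp/mail.db', MAIL_DB)        # put the now-stale index back
# A search against the restored index must not crash the consumer.
consumer.search_s(DEFAULT_SUFFIX, ldap.SCOPE_SUBTREE, '(mail=*)')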
Passed suites/replication/single_master_test.py::test_lastupdate_attr_before_init 0.72
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38902, 'ldap-secureport': 63602, 'server-id': 'standalone2', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38901 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:38902 was created
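Reading the agreement's last-update attributes before an init can be done through the Agreements collection; the attribute names are the standard replication status ones, and the instance fixture is assumed:

from lib389.agreement import Agreements

agmt = Agreements(m1).list()[0]
for attr in ('nsds5replicaLastUpdateStart',
             'nsds5replicaLastUpdateEnd',
             'nsds5replicaLastUpdateStatus'):
    print(attr, agmt.get_attr_val_utf8(attr))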
Passed suites/replication/tls_client_auth_repl_test.py::test_ssl_transport 8.36
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect bb0bd439-bba3-4565-8319-05a741f43fbf / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8b7f9d6d-6d54-4b80-b3d4-429ba2ea3840 / got description=bb0bd439-bba3-4565-8319-05a741f43fbf) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 8cf20571-74a7-44d7-90f7-31fb00949f91 / got description=8b7f9d6d-6d54-4b80-b3d4-429ba2ea3840) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 4a4854fd-3143-4e4d-8692-5495029aac68 / got description=8cf20571-74a7-44d7-90f7-31fb00949f91) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 96e7e902-c9cb-4bff-84a6-99c389d3533e / got description=4a4854fd-3143-4e4d-8692-5495029aac68) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 800b34fe-9014-4fb3-a45f-5249899420b3 / got description=96e7e902-c9cb-4bff-84a6-99c389d3533e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect fb449d07-be82-4516-bb16-88e79516e561 / got description=800b34fe-9014-4fb3-a45f-5249899420b3) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 60a207e6-9791-49f8-9274-a659a185bf48 / got description=fb449d07-be82-4516-bb16-88e79516e561) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect fef8c415-009e-4342-a856-2258923fe786 / got description=60a207e6-9791-49f8-9274-a659a185bf48) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect d870a37f-de25-4365-b263-c64696a62021 / got description=fef8c415-009e-4342-a856-2258923fe786) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 99c85da0-955f-445f-801a-9f2c8ec401b7 / got description=d870a37f-de25-4365-b263-c64696a62021) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 31d0cd63-2da7-425f-8c21-b44c609961a4 / got description=99c85da0-955f-445f-801a-9f2c8ec401b7) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to 
ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 4966c9b9-9a6c-4131-b75c-83fe9c7d8083 / got description=31d0cd63-2da7-425f-8c21-b44c609961a4) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working
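Switching an existing agreement from plain LDAP to the secure transport, as this test does, amounts to changing its transport attributes; a sketch with agmt obtained as in the earlier example and the secure port taken from the log:

# Point the agreement at the peer's secure port and TLS transport.
agmt.replace_many(
    ('nsDS5ReplicaPort', '63702'),
    ('nsDS5ReplicaTransportInfo', 'SSL'),
)
repl.test_replication(m1, m2)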
Passed suites/replication/tls_client_auth_repl_test.py::test_extract_pemfiles 4.13
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.tls_client_auth_repl_test:tls_client_auth_repl_test.py:168 Check that nsslapd-extract-pemfiles is on INFO  tests.suites.replication.tls_client_auth_repl_test:tls_client_auth_repl_test.py:172 Set nsslapd-extract-pemfiles = '{}' and check replication works INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect cb8040f8-f10c-4f83-afb9-c1a4f7ddbac1 / got description=4966c9b9-9a6c-4131-b75c-83fe9c7d8083) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 1365e354-3ba0-4174-a071-50859976d833 / got description=cb8040f8-f10c-4f83-afb9-c1a4f7ddbac1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working INFO  tests.suites.replication.tls_client_auth_repl_test:tls_client_auth_repl_test.py:172 Set nsslapd-extract-pemfiles = '{}' and check replication works INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 280e2e09-3f13-400c-b4f8-14ee13610430 / got description=1365e354-3ba0-4174-a071-50859976d833) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 7c73cbb4-53aa-43d8-83f4-88675b534f3b / got description=280e2e09-3f13-400c-b4f8-14ee13610430) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working
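The toggle being tested reduces to one cn=config switch; a sketch assuming the same m1/m2/repl fixtures:

for value in ('on', 'off'):
    m1.config.set('nsslapd-extract-pemfiles', value)
    m1.restart()
    # Replication over TLS should keep working with either setting.
    repl.test_replication(m1, m2)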
Passed suites/replication/tombstone_fixup_test.py::test_precise_tombstone_purging 23.61
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:567 Export task export_10302020_232440 for file /var/lib/dirsrv/slapd-master1/ldif/export.ldif completed successfully INFO  lib389.utils:tombstone_fixup_test.py:77 Import replication LDIF file... INFO  lib389:tasks.py:498 Import task import_10302020_232444 for file /var/lib/dirsrv/slapd-master1/ldif/export.ldif completed successfully INFO  lib389:tasks.py:937 tombstone fixup task fixupTombstone_10302020_232447 for backend userRoot completed successfully INFO  lib389:tasks.py:937 tombstone fixup task fixupTombstone_10302020_232449 for backend userRoot completed successfully INFO  lib389.utils:tombstone_fixup_test.py:116 Wait for tombstone purge interval to pass... INFO  lib389.utils:tombstone_fixup_test.py:123 Wait for tombstone purge interval to pass again...
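The export/import/fixup sequence in the log maps onto the classic Tasks helpers; the helper names and the TASK_WAIT import path are as I recall them from lib389 and should be treated as assumptions:

from lib389.properties import TASK_WAIT

ldif = '/var/lib/dirsrv/slapd-master1/ldif/export.ldif'   # path from the log
inst.tasks.exportLDIF(suffix=DEFAULT_SUFFIX, output_file=ldif, args={TASK_WAIT: True})
inst.tasks.importLDIF(suffix=DEFAULT_SUFFIX, input_file=ldif, args={TASK_WAIT: True})
# Rebuild tombstone bookkeeping so precise purging can be verified.
inst.tasks.fixupTombstones(bename='userRoot', args={TASK_WAIT: True})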
Passed suites/replication/tombstone_test.py::test_purge_success 0.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  Tombstone:tombstone.py:165 Reviving nsuniqueid=b12b9f86-1b2811eb-ae88a24e-1ea071ad,uid=testuser,ou=people,dc=example,dc=com -> uid=testuser,ou=people,dc=example,dc=com
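The revive shown in the log comes from the Tombstones collection; inst is an assumed instance fixture:

from lib389.tombstone import Tombstones

ts = Tombstones(inst, DEFAULT_SUFFIX).list()
assert len(ts) == 1
ts[0].revive()   # turns the tombstone back into the live uid=testuser entry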
Passed suites/replication/wait_for_async_feature_test.py::test_not_int_value 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 6334bf28-edd5-4dcf-9e25-bbd110816986 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 49b8f503-95fb-458d-8afb-081ac251f6b4 / got description=6334bf28-edd5-4dcf-9e25-bbd110816986) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
Passed suites/replication/wait_for_async_feature_test.py::test_multi_value 0.37
No log output captured.
Passed suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr0] 0.01
No log output captured.
Passed suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr1] 0.00
No log output captured.
Passed suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr2] 0.01
No log output captured.
Passed suites/replication/wait_for_async_feature_test.py::test_value_check[waitfor_async_attr3] 0.01
No log output captured.
Passed suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr0] 20.04
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:63 Add 100 nested entries under replicated suffix on master1 INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:71 Delete created entries
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:170 Set Replication Debugging loglevel for the errorlog INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:180 Gather all sync attempts within Counter dict, group by timestamp INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:201 Take the most common timestamp and assert it has appeared in the range from 4 to 11 times DEBUG  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:204 4 <= 9 <= 11
Passed suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr1] 20.08
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:63 Add 100 nested entries under replicated suffix on master1 INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:71 Delete created entries
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:170 Set Replication Debugging loglevel for the errorlog INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:180 Gather all sync attempts within Counter dict, group by timestamp INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:201 Take the most common timestamp and assert it has appeared in the range from 0 to 2 times DEBUG  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:204 0 <= 1 <= 2
Passed suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr2] 20.16
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:63 Add 100 nested entries under replicated suffix on master1 INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:71 Delete created entries
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:170 Set Replication Debugging loglevel for the errorlog INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:180 Gather all sync attempts within Counter dict, group by timestamp INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:201 Take the most common timestamp and assert it has appeared in the range from 4 to 11 times DEBUG  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:204 4 <= 8 <= 11
Passed suites/replication/wait_for_async_feature_test.py::test_behavior_with_value[waitfor_async_attr3] 20.14
-------------------------------Captured log setup-------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:63 Add 100 nested entries under replicated suffix on master1 INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:71 Delete created entries
-------------------------------Captured log call--------------------------------
INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:170 Set Replication Debugging loglevel for the errorlog INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:180 Gather all sync attempts within Counter dict, group by timestamp INFO  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:201 Take the most common timestamp and assert it has appeared in the range from 4 to 11 times DEBUG  tests.suites.replication.wait_for_async_feature_test:wait_for_async_feature_test.py:204 4 <= 6 <= 11
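All four parametrizations above drive a single agreement attribute; a sketch with agmt obtained as in the earlier Agreements example (the value is illustrative):

# Valid integers are accepted; the behavior tests above then count how
# sync attempts cluster per timestamp for each configured setting.
agmt.replace('nsDS5ReplicaWaitForAsyncResults', '100')
assert agmt.get_attr_val_utf8('nsDS5ReplicaWaitForAsyncResults') == '100'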
Passed suites/resource_limits/fdlimits_test.py::test_fd_limits 0.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.resource_limits.fdlimits_test:fdlimits_test.py:69 Test PASSED
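A sketch of the file-descriptor limit check; that nsslapd-maxdescriptors is the attribute under test is my assumption, and the value is illustrative:

before = inst.config.get_attr_val_utf8('nsslapd-maxdescriptors')
inst.config.set('nsslapd-maxdescriptors', '9000')
inst.restart()
assert inst.config.get_attr_val_utf8('nsslapd-maxdescriptors') == '9000'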
Passed suites/rewriters/adfilter_test.py::test_adfilter_objectCategory 5.94
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/rewriters/basic_test.py::test_rewriters_container 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/rewriters/basic_test.py::test_foo_filter_rewriter 4.56
No log output captured.
Passed suites/roles/basic_test.py::test_filterrole 0.85
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/roles/basic_test.py::test_managedrole 0.20
No log output captured.
Passed suites/roles/basic_test.py::test_nestedrole 0.29
No log output captured.
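The three role flavors above use parallel collections in lib389.idm.role; a filtered-role sketch (the cn and filter are illustrative):

from lib389.idm.role import FilteredRoles

roles = FilteredRoles(inst, DEFAULT_SUFFIX)
role = roles.create(properties={
    'cn': 'filtered_role',
    'nsRoleFilter': '(ou=sales)',
})
# Entries matching the filter now report this role in their nsRole attribute.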
Passed suites/sasl/allowed_mechs_test.py::test_basic_feature 38.79
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stdout call------------------------------
['EXTERNAL', 'GSS-SPNEGO', 'GSSAPI', 'DIGEST-MD5', 'CRAM-MD5', 'LOGIN', 'PLAIN', 'ANONYMOUS']
-------------------------------Captured log call--------------------------------
INFO  lib389:allowed_mechs_test.py:75 Test we have some of the default mechanisms INFO  lib389:allowed_mechs_test.py:83 Edit mechanisms to allow just PLAIN INFO  lib389:allowed_mechs_test.py:91 Restart server and make sure we still have correct allowed mechs INFO  lib389:allowed_mechs_test.py:100 Edit mechanisms to allow just PLAIN and EXTERNAL INFO  lib389:allowed_mechs_test.py:108 Edit mechanisms to allow just PLAIN and GSSAPI INFO  lib389:allowed_mechs_test.py:126 Edit mechanisms to allow just PLAIN, GSSAPI, and ANONYMOUS INFO  lib389:allowed_mechs_test.py:146 Edit mechanisms to allow just PLAIN and ANONYMOUS INFO  lib389:allowed_mechs_test.py:165 Reset allowed mechanisms INFO  lib389:allowed_mechs_test.py:169 Check that we have the original set of mechanisms INFO  lib389:allowed_mechs_test.py:174 Check that we have the original set of mechanisms after a restart
Passed suites/sasl/allowed_mechs_test.py::test_config_set_few_mechs 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:allowed_mechs_test.py:198 Set nsslapd-allowed-sasl-mechanisms to 'PLAIN GSSAPI' INFO  lib389:allowed_mechs_test.py:201 Verify nsslapd-allowed-sasl-mechanisms has the values
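Both SASL tests above reduce to setting and reading one cn=config attribute:

inst.config.set('nsslapd-allowed-sasl-mechanisms', 'PLAIN GSSAPI')
assert inst.config.get_attr_val_utf8('nsslapd-allowed-sasl-mechanisms') == 'PLAIN GSSAPI'
# EXTERNAL remains available even when it is not in the configured list.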
Passed suites/sasl/plain_test.py::test_basic_feature 11.84
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/sasl/regression_test.py::test_openldap_no_nss_crypto 38.76
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f4a1b651-7b73-4724-93f6-e52365d35e08 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 7247489b-f1f5-4d17-922f-f00048cc02e7 / got description=f4a1b651-7b73-4724-93f6-e52365d35e08) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.sasl.regression_test:regression_test.py:133 Ticket 47536 - Allow usage of OpenLDAP libraries that don't use NSS for crypto INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 207d43cc-8e2b-4457-bb7d-f2d581f40c87 / got description=7247489b-f1f5-4d17-922f-f00048cc02e7) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  tests.suites.sasl.regression_test:regression_test.py:35 ######################### Adding 5 entries to master1 ###################### INFO  tests.suites.sasl.regression_test:regression_test.py:35 ######################### Adding 5 entries to master2 ###################### INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect 79b17c40-4cb0-4a74-a675-c3bbfa50098f / got description=207d43cc-8e2b-4457-bb7d-f2d581f40c87) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect b7ede189-8788-4f62-b721-2afc42bb0b97 / got description=79b17c40-4cb0-4a74-a675-c3bbfa50098f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working INFO  tests.suites.sasl.regression_test:regression_test.py:146 ##### Searching for entries on master1... INFO  tests.suites.sasl.regression_test:regression_test.py:150 ##### Searching for entries on master2... INFO  tests.suites.sasl.regression_test:regression_test.py:92 ######################### Relocate PEM files on master1 ###################### INFO  tests.suites.sasl.regression_test:regression_test.py:100 ##### restart master1 INFO  tests.suites.sasl.regression_test:regression_test.py:47 ######################### Check PEM files (/dev/shm/MyCA, /dev/shm/MyServerCert1, /dev/shm/MyServerKey1) in /dev/shm ###################### INFO  tests.suites.sasl.regression_test:regression_test.py:53 /dev/shm/MyCA.pem is successfully generated. INFO  tests.suites.sasl.regression_test:regression_test.py:66 /dev/shm/MyServerCert1.pem is successfully generated. INFO  tests.suites.sasl.regression_test:regression_test.py:79 /dev/shm/MyServerKey1.pem is successfully generated. 
INFO  tests.suites.sasl.regression_test:regression_test.py:35 ######################### Adding 5 entries to master1 ###################### INFO  tests.suites.sasl.regression_test:regression_test.py:35 ######################### Adding 5 entries to master2 ###################### INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is NOT working (expect cc19f718-f2eb-421f-b07c-2dfd658ac43c / got description=b7ede189-8788-4f62-b721-2afc42bb0b97) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 65f72182-3799-4a26-82d2-60569f37278d / got description=cc19f718-f2eb-421f-b07c-2dfd658ac43c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is NOT working (expect 65f72182-3799-4a26-82d2-60569f37278d / got description=cc19f718-f2eb-421f-b07c-2dfd658ac43c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702 to ldaps://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701 is working INFO  tests.suites.sasl.regression_test:regression_test.py:162 ##### Searching for entries on master1... INFO  tests.suites.sasl.regression_test:regression_test.py:166 ##### Searching for entries on master2... INFO  lib389:tasks.py:567 Export task export_10302020_233236 for file /var/lib/dirsrv/slapd-master1/ldif/master1.ldif completed successfully INFO  tests.suites.sasl.regression_test:regression_test.py:173 Ticket 47536 - PASSED
Passed suites/schema/eduperson_test.py::test_account_locking 0.84
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.eduperson_test:eduperson_test.py:88 Test PASSED
Passed suites/schema/schema_reload_test.py::test_schema_reload_with_searches 1.56
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:49 Test the searches still work as expected during schema reload tasks
Passed suites/schema/schema_reload_test.py::test_invalid_schema 2.19
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:234 Test schema-reload task with invalid schema INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:237 Create valid schema file (98user.ldif)... INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:252 Create invalid schema file (99user.ldif)... INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:269 Run the schema-reload task, it should fail... INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:276 Check cn=schema to verify the invalid schema was not added INFO  tests.suites.schema.schema_reload_test:schema_reload_test.py:283 The invalid schema is not present on the server
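The reload itself is a single Tasks call; the schemadir path is illustrative, and the assumption that the helper returns the task exit code when waited on should be verified against the lib389 version in use:

rc = inst.tasks.schemaReload(schemadir='/tmp/schema_test', args={TASK_WAIT: True})
# With a malformed 99user.ldif in schemadir the task fails and
# cn=schema is left untouched, as the log above verifies.
assert rc != 0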
Passed suites/schema/schema_replication_test.py::test_schema_replication_one 9.46
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect b785bb7e-1f9a-4dc5-bc26-ec7b516387c5 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:182 test_schema_replication_init topology_m1c1 <lib389.topologies.TopologyMain object at 0x7f61c3135ee0> (master <lib389.DirSrv object at 0x7f61c32b0340>, consumer <lib389.DirSrv object at 0x7f61c2f9f400>)
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Extra OC Schema is pushed - no error INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:222 test_schema_replication_one topology_m1c1 <lib389.topologies.TopologyMain object at 0x7f61c3135ee0> (master <lib389.DirSrv object at 0x7f61c32b0340>, consumer <lib389.DirSrv object at 0x7f61c2f9f400>) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive 0 (expected 1) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'1' (expected 2) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:233 test_schema_replication_one master_schema_csn=b'5f9cdb19000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:234 test_schema_replication_one consumer_schema_csn=b'5f9cdb19000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [38] 389-Directory/2.0.0.0 B2020.305.0012 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [95] localhost.localdomain:39001 (/etc/dirsrv/slapd-master1) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [200] [30/Oct/2020:23:33:20.408493554 -0400] - INFO - main - 389-Directory/2.0.0.0 B2020.305.0012 starting up DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [308] [30/Oct/2020:23:33:20.412919381 -0400] - INFO - main - Setting the maximum file descriptor limit to: 524288 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [416] [30/Oct/2020:23:33:21.246037340 -0400] - INFO - PBKDF2_SHA256 - Based on CPU performance, chose 2048 rounds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [530] [30/Oct/2020:23:33:21.256435177 -0400] - INFO - bdb_config_upgrade_dse_info - create config entry from old config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [632] [30/Oct/2020:23:33:21.269266373 -0400] - NOTICE - bdb_start_autotune - found 7980868k physical memory DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [728] [30/Oct/2020:23:33:21.273887832 -0400] - NOTICE - bdb_start_autotune - found 7314356k available DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [835] [30/Oct/2020:23:33:21.279934394 -0400] - NOTICE - bdb_start_autotune - cache autosizing: db cache: 498804k DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 
_pattern_errorlog: [938] [30/Oct/2020:23:33:21.285573785 -0400] - NOTICE - bdb_start_autotune - total cache size: 408620441 B; DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1074] [30/Oct/2020:23:33:21.533620072 -0400] - INFO - slapd_daemon - slapd started. Listening on All Interfaces port 39001 for LDAP requests DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1199] [30/Oct/2020:23:33:21.540366767 -0400] - INFO - slapd_daemon - Listening on /var/run/slapd-master1.socket for LDAPI requests DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1358] [30/Oct/2020:23:33:21.573080804 -0400] - INFO - postop_modify_config_dse - The change of nsslapd-securePort will not take effect until the server is restarted DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1475] [30/Oct/2020:23:33:21.586201376 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1653] [30/Oct/2020:23:33:22.403916144 -0400] - INFO - op_thread_cleanup - slapd shutting down - signaling operation threads - op stack size 2 max work q size 1 max work q stack size 1 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1783] [30/Oct/2020:23:33:22.412326930 -0400] - INFO - slapd_daemon - slapd shutting down - closing down internal subsystems and plugins DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1886] [30/Oct/2020:23:33:22.424413862 -0400] - INFO - bdb_pre_close - Waiting for 4 database threads to stop DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [1983] [30/Oct/2020:23:33:23.984658304 -0400] - INFO - bdb_pre_close - All database threads now stopped DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2094] [30/Oct/2020:23:33:24.026628203 -0400] - INFO - ldbm_back_instance_set_destructor - Set of instances destroyed DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2255] [30/Oct/2020:23:33:24.038303082 -0400] - INFO - connection_post_shutdown_cleanup - slapd shutting down - freed 1 work q stack objects - freed 2 op stack objects DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2325] [30/Oct/2020:23:33:24.045351882 -0400] - INFO - main - slapd stopped. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2429] [30/Oct/2020:23:33:25.502825480 -0400] - INFO - main - 389-Directory/2.0.0.0 B2020.305.0012 starting up DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2537] [30/Oct/2020:23:33:25.516711792 -0400] - INFO - main - Setting the maximum file descriptor limit to: 524288 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2645] [30/Oct/2020:23:33:26.383408968 -0400] - INFO - PBKDF2_SHA256 - Based on CPU performance, chose 2048 rounds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2762] [30/Oct/2020:23:33:26.390503207 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2864] [30/Oct/2020:23:33:26.400895640 -0400] - NOTICE - bdb_start_autotune - found 7980868k physical memory DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [2960] [30/Oct/2020:23:33:26.407076386 -0400] - NOTICE - bdb_start_autotune - found 7314024k available DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3067] [30/Oct/2020:23:33:26.412216094 -0400] - NOTICE - bdb_start_autotune - cache autosizing: db cache: 498804k DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3197] [30/Oct/2020:23:33:26.416943995 -0400] - NOTICE - bdb_start_autotune - cache autosizing: userRoot entry cache (1 total): 1376256k DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3323] [30/Oct/2020:23:33:26.422980940 -0400] - NOTICE - bdb_start_autotune - cache autosizing: userRoot dn cache (1 total): 196608k DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3427] [30/Oct/2020:23:33:26.426346184 -0400] - NOTICE - bdb_start_autotune - total cache size: 1834683801 B; DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3563] [30/Oct/2020:23:33:26.534028189 -0400] - INFO - slapd_daemon - slapd started. Listening on All Interfaces port 39001 for LDAP requests DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3688] [30/Oct/2020:23:33:26.542309899 -0400] - INFO - slapd_daemon - Listening on /var/run/slapd-master1.socket for LDAPI requests DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [3876] [30/Oct/2020:23:33:37.505036436 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding the replication changelog RUV, this may take several minutes... DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4056] [30/Oct/2020:23:33:37.509203213 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding replication changelog RUV complete. Result 0 (Success) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4244] [30/Oct/2020:23:33:37.513481891 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding the replication changelog RUV, this may take several minutes... 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4424] [30/Oct/2020:23:33:37.520921135 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding replication changelog RUV complete. Result 0 (Success) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4595] [30/Oct/2020:23:33:38.071757308 -0400] - INFO - NSMMReplicationPlugin - repl5_tot_run - Beginning total update of replica "agmt="cn=temp_201" (ci-vm-10-0-139-146:39201)". DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4779] [30/Oct/2020:23:33:38.077634722 -0400] - NOTICE - NSMMReplicationPlugin - replica_subentry_check - Need to create replication keep alive entry <cn=repl keep alive 1,dc=example,dc=com> DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4924] [30/Oct/2020:23:33:38.081412559 -0400] - INFO - NSMMReplicationPlugin - replica_subentry_create - add dn: cn=repl keep alive 1,dc=example,dc=com DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4941] objectclass: top DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4967] objectclass: ldapsubentry DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [4997] objectclass: extensibleObject DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5019] cn: repl keep alive 1 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5206] [30/Oct/2020:23:33:40.661080423 -0400] - INFO - NSMMReplicationPlugin - repl5_tot_run - Finished total update of replica "agmt="cn=temp_201" (ci-vm-10-0-139-146:39201)". Sent 16 entries. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5206] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 5206
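A note for reading the captured logs in this suite: every 20-hex-digit value such as b'5f9cdb44000000010000' is a 389-ds CSN built from four fixed-width fields (timestamp, sequence number, replica id, sub-sequence number). A minimal decoding sketch for following the logs; the helper name parse_csn is ours, not part of lib389:

from datetime import datetime, timezone

def parse_csn(csn):
    """Split a 389-ds CSN into its four fixed-width hex fields."""
    csn = csn.decode() if isinstance(csn, bytes) else csn
    ts, seq, rid, subseq = csn[0:8], csn[8:12], csn[12:16], csn[16:20]
    return {"time": datetime.fromtimestamp(int(ts, 16), tz=timezone.utc),
            "seqnum": int(seq, 16),
            "rid": int(rid, 16),
            "subseq": int(subseq, 16)}

# b'5f9cdb2e000000000000' -> 2020-10-31 03:34:06 UTC, seq 0, rid 0, subseq 0
print(parse_csn(b"5f9cdb2e000000000000"))

The schema CSNs compared by the tests below always carry replica id 0, while the entry CSNs in the RUV lines carry the replica id of the originating master (here 1).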
Passed suites/schema/schema_replication_test.py::test_schema_replication_two 11.56
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Extra OC Schema is pushed - (ticket 47721 allows to learn missing def) INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'2' (expected 3) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'3' (expected 4) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:289 test_schema_replication_two master_schema_csn=b'5f9cdb25000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:290 test_schema_replication_two consumer_schema_csn=b'5f9cdb25000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 5207 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5420] 30/Oct/2020:23:33:57.305851854 -0400] - ERR - NSMMReplicationPlugin - update_consumer_schema - [S] Schema agmt="cn=201" (ci-vm-10-0-139-146:39201) must not be overwritten (set replication log for additional info) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 5420
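The master_schema_csn / consumer_schema_csn values logged above are the nsSchemaCSN operational attribute of each server's cn=schema entry. A hedged python-ldap sketch of reading them back by hand; the URIs and credentials are placeholders for this topology:

import ldap

def get_schema_csn(uri, binddn="cn=Directory Manager", password="password"):
    """Fetch nsSchemaCSN from cn=schema of the server at uri."""
    conn = ldap.initialize(uri)
    conn.simple_bind_s(binddn, password)
    entry = conn.search_s("cn=schema", ldap.SCOPE_BASE,
                          "(objectClass=*)", ["nsSchemaCSN"])[0]
    conn.unbind_s()
    return entry[1].get("nsSchemaCSN", [None])[0]

# Schema is considered replicated once the two values match.
assert get_schema_csn("ldap://localhost:39001") == \
       get_schema_csn("ldap://localhost:39201")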
Passed suites/schema/schema_replication_test.py::test_schema_replication_three 9.47
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Extra OC Schema is pushed - no error INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'4' (expected 5) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'5' (expected 6) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:340 test_schema_replication_three master_schema_csn=b'5f9cdb2e000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:341 test_schema_replication_three consumer_schema_csn=b'5f9cdb2e000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 5421 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5421] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 5421
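The "Extra OC Schema is pushed" scenarios above boil down to adding one objectClasses value to the supplier's cn=schema and letting the next replication session push it. A minimal python-ldap sketch of that trigger; the OID and name are invented for illustration:

import ldap

# Any syntactically valid, unique OID will do for a test objectclass.
OC_DEF = (b"( 2.25.999999.1 NAME 'exampleNewOC' DESC 'illustration only' "
          b"SUP top AUXILIARY MAY ( description ) X-ORIGIN 'user defined' )")

conn = ldap.initialize("ldap://localhost:39001")
conn.simple_bind_s("cn=Directory Manager", "password")
# Extending cn=schema bumps the supplier's nsSchemaCSN; on the next
# session the supplier compares it with the consumer's (the
# conn_push_schema lines in these logs) and pushes the schema if newer.
conn.modify_s("cn=schema", [(ldap.MOD_ADD, "objectClasses", [OC_DEF])])
conn.unbind_s()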
Passed suites/schema/schema_replication_test.py::test_schema_replication_four 9.52
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Same OC - extra MUST: Schema is pushed - no error INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'6' (expected 7) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'7' (expected 8) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:389 test_schema_replication_four master_schema_csn=b'5f9cdb38000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:390 test_schema_replication_four consumer_schema_csn=b'5f9cdb38000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 5422 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5422] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 5422
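The next log shows the supplier-side comparison (schema_oc_compare_strict, schema_oc_superset_check) concluding that the consumer's consumerNewOCA, which additionally requires telexNumber, is a superset of the local definition, so the supplier learns the extended definition instead of overwriting it. A simplified Python model of that decision, not the server's actual C implementation:

def oc_is_superset(remote_must, local_must):
    """True if the remote objectclass requires every local MUST attribute."""
    return {a.lower() for a in local_must} <= {a.lower() for a in remote_must}

# consumerNewOCA as logged below: the remote side also requires
# telexNumber, so the remote definition wins and is learned locally.
local_def  = ["postalAddress", "preferredLocale"]
remote_def = ["postalAddress", "preferredLocale", "telexNumber"]
assert oc_is_superset(remote_def, local_def)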
Passed suites/schema/schema_replication_test.py::test_schema_replication_five 11.90
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Same OC - extra MUST: Schema is pushed - (fix for 47721) INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'8' (expected 9) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'9' (expected 10) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:452 test_schema_replication_five master_schema_csn=b'5f9cdb44000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:453 test_schema_replication_five consumer_schema_csn=b'5f9cdb44000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 5423 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5528] /Oct/2020:23:34:28.186343387 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5650] [30/Oct/2020:23:34:28.246842124 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb400002:1604115264:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5771] [30/Oct/2020:23:34:28.252201303 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb440000:1604115268:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [5934] [30/Oct/2020:23:34:28.257702182 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb44000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [6114] [30/Oct/2020:23:34:28.265033682 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0c0000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [6359] [30/Oct/2020:23:34:28.273479488 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [6534] [30/Oct/2020:23:34:28.278069626 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb44000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [6672] [30/Oct/2020:23:34:28.282151619 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [6802] 
[30/Oct/2020:23:34:28.286059894 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [6940] [30/Oct/2020:23:34:28.291277830 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [7068] [30/Oct/2020:23:34:28.296315913 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [7183] [30/Oct/2020:23:34:28.300306178 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [7358] [30/Oct/2020:23:34:28.306690529 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [7541] [30/Oct/2020:23:34:28.312809133 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [7713] [30/Oct/2020:23:34:28.317367729 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [7881] [30/Oct/2020:23:34:28.322973912 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [8063] [30/Oct/2020:23:34:28.327053089 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [8247] [30/Oct/2020:23:34:28.331650962 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb44000000000000 / remotecsn:5f9cdb38000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [8411] [30/Oct/2020:23:34:28.477163414 -0400] - DEBUG - schema_oc_compare_strict - Attribute telexNumber is not required in 'consumerNewOCA' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [8565] [30/Oct/2020:23:34:28.498145263 -0400] - DEBUG - schema_oc_superset_check - Remote consumerNewOCA schema objectclasses is a superset of the received one. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [8729] [30/Oct/2020:23:34:28.504240056 -0400] - DEBUG - schema_oc_compare_strict - Attribute telexNumber is not required in 'consumerNewOCA' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [8879] [30/Oct/2020:23:34:28.510355777 -0400] - DEBUG - schema_list_oc2learn - Add that unknown/extended objectclass consumerNewOCA (1.2.3.4.5.6.7.8.9.10.1) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [9165] [30/Oct/2020:23:34:28.515802896 -0400] - DEBUG - schema_oc_to_string - Replace (old[251]=( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP 'person' AUXILIARY MUST ( postalAddress $ preferredLocale ) MAY ( postalCode $ street ) X-ORIGIN 'blahblahblah' )) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [9478] [30/Oct/2020:23:34:28.522974157 -0400] - DEBUG - supplier_get_new_definitions - supplier takes objectclass: ( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP person AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) MAY ( postalCode $ street ) X-ORIGIN 'user defined' ) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [9780] [30/Oct/2020:23:34:28.552811689 -0400] - DEBUG - modify_schema_prepare_mods - MOD[1] del (objectclasses): ( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP 'person' AUXILIARY MUST ( postalAddress $ preferredLocale ) MAY ( postalCode $ street ) X-ORIGIN 'blahblahblah' ) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10091] [30/Oct/2020:23:34:28.558067605 -0400] - DEBUG - modify_schema_prepare_mods - MOD[0] add (objectclasses): ( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP person AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) MAY ( postalCode $ street ) X-ORIGIN 'user defined' ) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10199] [30/Oct/2020:23:34:28.562425198 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10322] [30/Oct/2020:23:34:28.575866734 -0400] - DEBUG - modify_schema_internal_mod - Successfully learn objectclasses definitions DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10536] [30/Oct/2020:23:34:28.581144531 -0400] - ERR - NSMMReplicationPlugin - update_consumer_schema - [S] Schema agmt="cn=201" (ci-vm-10-0-139-146:39201) must not be overwritten (set replication log for additional info) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 10536
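The MOD[1] del / MOD[0] add pair prepared by modify_schema_prepare_mods above is, in LDAP terms, a single modify on cn=schema that swaps the stale objectClasses value for the learned one. A rough by-hand equivalent with python-ldap; bind details are placeholders and both definitions are copied from the log:

import ldap

OLD_OC = (b"( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' "
          b"SUP 'person' AUXILIARY MUST ( postalAddress $ preferredLocale ) "
          b"MAY ( postalCode $ street ) X-ORIGIN 'blahblahblah' )")
NEW_OC = (b"( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' "
          b"SUP person AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) "
          b"MAY ( postalCode $ street ) X-ORIGIN 'user defined' )")

conn = ldap.initialize("ldap://localhost:39001")
conn.simple_bind_s("cn=Directory Manager", "password")
# Delete the old value and add the learned one in a single modify.
conn.modify_s("cn=schema", [(ldap.MOD_DELETE, "objectClasses", [OLD_OC]),
                            (ldap.MOD_ADD, "objectClasses", [NEW_OC])])
conn.unbind_s()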
Passed suites/schema/schema_replication_test.py::test_schema_replication_six 9.88
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Same OC - extra MUST: Schema is pushed - no error INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'10' (expected 11) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'11' (expected 12) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:508 test_schema_replication_six master_schema_csn=b'5f9cdb4e000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:509 test_schema_replication_six consumer_schema_csn=b'5f9cdb4e000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 10537 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10673] 30/Oct/2020:23:34:28.585401951 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] schema definitions may have been learned DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10807] [30/Oct/2020:23:34:28.850511553 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb42000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [10992] [30/Oct/2020:23:34:28.854631811 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - Schema checking successful: ok to push the schema (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [11107] [30/Oct/2020:23:34:29.083922185 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb440002:1604115268:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [11229] [30/Oct/2020:23:34:29.089996594 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb440002:1604115268:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [11350] [30/Oct/2020:23:34:29.094445104 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb450000:1604115269:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [11531] [30/Oct/2020:23:34:29.098944033 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [11687] [30/Oct/2020:23:34:29.104978448 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [11900] [30/Oct/2020:23:34:29.109724252 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): 
{replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb40000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [12081] [30/Oct/2020:23:34:29.114078148 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [12237] [30/Oct/2020:23:34:29.118967050 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [12450] [30/Oct/2020:23:34:29.123009186 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb44000000010000 5f9cdb44 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [12612] [30/Oct/2020:23:34:29.127585497 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6be700 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [12839] [30/Oct/2020:23:34:29.132524721 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6be700 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [13135] [30/Oct/2020:23:34:29.139560076 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb44000000010000) csnBuf (5f9cdb40000000010000) csnConsumerMax (5f9cdb40000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [13248] [30/Oct/2020:23:34:29.143737495 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb40000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [13439] [30/Oct/2020:23:34:29.147888076 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb40000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [13595] [30/Oct/2020:23:34:29.157187034 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [13707] [30/Oct/2020:23:34:29.161419498 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [13839] [30/Oct/2020:23:34:29.165431834 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [14059] [30/Oct/2020:23:34:29.169357991 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb44000000010000) 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [14191] [30/Oct/2020:23:34:29.182317148 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [14389] [30/Oct/2020:23:34:29.188672237 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb44000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [14704] [30/Oct/2020:23:34:29.193290764 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb44000000010000) csnMax (5f9cdb44000000010000) csnBuf (5f9cdb44000000010000) csnConsumerMax (5f9cdb44000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [14828] [30/Oct/2020:23:34:29.196912281 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15012] [30/Oct/2020:23:34:29.201020413 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15124] [30/Oct/2020:23:34:29.204871829 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15257] [30/Oct/2020:23:34:29.211517145 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15387] [30/Oct/2020:23:34:29.219942798 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 19, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15520] [30/Oct/2020:23:34:29.224895567 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15653] [30/Oct/2020:23:34:29.231046702 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15786] [30/Oct/2020:23:34:29.236434401 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [15919] [30/Oct/2020:23:34:29.245346594 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16052] [30/Oct/2020:23:34:29.257557858 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16185] [30/Oct/2020:23:34:29.277900086 -0400] - DEBUG - 
NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16298] [30/Oct/2020:23:34:29.311732446 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 19 19 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16406] [30/Oct/2020:23:34:29.315827019 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16515] [30/Oct/2020:23:34:29.323906797 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16686] [30/Oct/2020:23:34:29.328141378 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now disabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [16958] [30/Oct/2020:23:34:29.332062871 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [17122] [30/Oct/2020:23:34:29.344467048 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [17292] [30/Oct/2020:23:34:29.349056785 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [17466] [30/Oct/2020:23:34:29.354040446 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [17641] [30/Oct/2020:23:34:29.358385901 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [17813] [30/Oct/2020:23:34:29.364340747 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [17988] [30/Oct/2020:23:34:29.369094928 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [18153] [30/Oct/2020:23:34:29.432267945 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_stop - agmt="cn=201" (ci-vm-10-0-139-146:39201): Protocol stopped after 0 seconds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [18281] [30/Oct/2020:23:34:29.439821427 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: 
{replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [18466] [30/Oct/2020:23:34:29.452157905 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb44000000010000 5f9cdb44 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [18641] [30/Oct/2020:23:34:29.461618026 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [18749] [30/Oct/2020:23:34:29.465448396 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [18857] [30/Oct/2020:23:34:34.488786580 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [19027] [30/Oct/2020:23:34:34.505642667 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now enabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [19202] [30/Oct/2020:23:34:34.515176444 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [19377] [30/Oct/2020:23:34:34.519239728 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [19549] [30/Oct/2020:23:34:34.524698038 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: start -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [19718] [30/Oct/2020:23:34:34.529070099 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Trying non-secure slapi_ldap_init_ext DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [20246] [30/Oct/2020:23:34:34.536487227 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - binddn = cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com, passwd = {AES-TUhNR0NTcUdTSWIzRFFFRkRUQm1NRVVHQ1NxR1NJYjNEUUVGRERBNEJDUmtaVFV5WlRsbVl5MWtORGhrTTJJdw0KTlMxaFl6VTNNamcwT1Mxak1XVm1aVEEyTlFBQ0FRSUNBU0F3Q2dZSUtvWklodmNOQWdjd0hRWUpZSVpJQVdVRA0KQkFFcUJCQm5MNVhSZWZLb2IwdEZOWDcwWklqZw==}sN+DIEdLHPeXyJgvw17eEySjTpZdrQ10sPM+rT39tkvclGGF7OHrp9/uJmsKBd6t DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [20291] Imc7SLxVIB5pzs0VGR2mbWzANQ+AAE2PgoJE5Vl5OWU= DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [20466] [30/Oct/2020:23:34:34.545435101 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [20588] [30/Oct/2020:23:34:34.554928781 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb450000:1604115269:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [20709] [30/Oct/2020:23:34:34.560735116 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb4a0000:1604115274:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [20877] [30/Oct/2020:23:34:34.566473193 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [21059] [30/Oct/2020:23:34:34.571412640 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [21243] [30/Oct/2020:23:34:34.576760933 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb44000000000000 / remotecsn:5f9cdb42000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [21377] [30/Oct/2020:23:34:34.828182777 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb44000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [21492] [30/Oct/2020:23:34:34.833511119 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb4a0001:1604115274:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [21673] [30/Oct/2020:23:34:34.838438389 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [21829] [30/Oct/2020:23:34:34.844153257 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [22042] [30/Oct/2020:23:34:34.849381099 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb44000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [22223] [30/Oct/2020:23:34:34.853012101 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [22379] [30/Oct/2020:23:34:34.856693544 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [22592] [30/Oct/2020:23:34:34.860516441 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb44000000010000 5f9cdb44 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [22888] [30/Oct/2020:23:34:34.866426268 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax () csnMax (5f9cdb44000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb44000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [23012] [30/Oct/2020:23:34:34.872223282 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [23284] [30/Oct/2020:23:34:34.881509561 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=0 sent=0 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [23433] [30/Oct/2020:23:34:34.885556860 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No changes to send DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [23597] [30/Oct/2020:23:34:34.892741232 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [23767] [30/Oct/2020:23:34:34.897107716 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [23941] [30/Oct/2020:23:34:34.900887428 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [24063] [30/Oct/2020:23:34:36.524656186 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb4a0001:1604115274:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [24184] [30/Oct/2020:23:34:36.537798085 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb4c0000:1604115276:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [24347] [30/Oct/2020:23:34:36.542454014 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb4c000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [24527] [30/Oct/2020:23:34:36.547287463 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0c4000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [24772] [30/Oct/2020:23:34:36.553123961 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 
_pattern_errorlog: [24947] [30/Oct/2020:23:34:36.561541993 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb4c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25085] [30/Oct/2020:23:34:36.566855355 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25215] [30/Oct/2020:23:34:36.574693319 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25353] [30/Oct/2020:23:34:36.579022714 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25481] [30/Oct/2020:23:34:36.585135204 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25596] [30/Oct/2020:23:34:36.591241253 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25771] [30/Oct/2020:23:34:36.600760367 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [25954] [30/Oct/2020:23:34:36.613852725 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [26126] [30/Oct/2020:23:34:36.618417292 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [26294] [30/Oct/2020:23:34:36.624626217 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [26476] [30/Oct/2020:23:34:36.629197770 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [26591] [30/Oct/2020:23:34:36.634121404 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb4c0002:1604115276:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [26772] [30/Oct/2020:23:34:36.639296040 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [26928] [30/Oct/2020:23:34:36.648618716 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [27141] [30/Oct/2020:23:34:36.652469844 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb44000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [27322] [30/Oct/2020:23:34:36.657231443 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [27478] [30/Oct/2020:23:34:36.667631402 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [27691] [30/Oct/2020:23:34:36.672973250 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4c000000010000 5f9cdb4c DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [27853] [30/Oct/2020:23:34:36.681829880 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8500 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [28080] [30/Oct/2020:23:34:36.712394255 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8500 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [28376] [30/Oct/2020:23:34:36.717113253 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb4c000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb44000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [28489] [30/Oct/2020:23:34:36.721337257 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb44000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [28680] [30/Oct/2020:23:34:36.725540752 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb44000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [28836] [30/Oct/2020:23:34:36.734344101 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [29056] [30/Oct/2020:23:34:36.739101894 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb4c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [29168] [30/Oct/2020:23:34:36.752616919 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [29366] [30/Oct/2020:23:34:36.762203167 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [29681] [30/Oct/2020:23:34:36.773473913 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb4c000000010000) csnMax (5f9cdb4c000000010000) csnBuf (5f9cdb4c000000010000) csnConsumerMax (5f9cdb4c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [29805] [30/Oct/2020:23:34:36.784676990 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [29989] [30/Oct/2020:23:34:36.790597671 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30101] [30/Oct/2020:23:34:36.795484608 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30233] [30/Oct/2020:23:34:36.800638600 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30366] [30/Oct/2020:23:34:36.809022472 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30496] [30/Oct/2020:23:34:36.814600361 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 10, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30629] [30/Oct/2020:23:34:36.821749302 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30762] [30/Oct/2020:23:34:36.826627290 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [30895] [30/Oct/2020:23:34:36.836658038 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31028] [30/Oct/2020:23:34:36.845451775 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31161] [30/Oct/2020:23:34:36.864710320 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31294] [30/Oct/2020:23:34:36.885408071 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31407] [30/Oct/2020:23:34:36.900752535 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 10 10 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31516] [30/Oct/2020:23:34:36.922056447 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31788] [30/Oct/2020:23:34:36.929650580 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [31952] [30/Oct/2020:23:34:36.942161303 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32122] [30/Oct/2020:23:34:36.947066735 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32296] [30/Oct/2020:23:34:36.952192935 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32404] [30/Oct/2020:23:34:37.933575282 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32512] [30/Oct/2020:23:34:38.178649769 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32634] [30/Oct/2020:23:34:38.198377760 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb4c0002:1604115276:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32755] [30/Oct/2020:23:34:38.204925644 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb4e0000:1604115278:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [32918] [30/Oct/2020:23:34:38.209506200 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb4e000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [33098] [30/Oct/2020:23:34:38.218477142 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0cc000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [33343] [30/Oct/2020:23:34:38.223734727 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [33518] [30/Oct/2020:23:34:38.228869369 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb4e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [33656] [30/Oct/2020:23:34:38.236625817 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [33786] [30/Oct/2020:23:34:38.242265895 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [33924] [30/Oct/2020:23:34:38.247478577 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [34052] [30/Oct/2020:23:34:38.251935168 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [34167] [30/Oct/2020:23:34:38.257296000 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [34342] [30/Oct/2020:23:34:38.266309708 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [34525] [30/Oct/2020:23:34:38.270741655 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [34697] [30/Oct/2020:23:34:38.275072274 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [34865] [30/Oct/2020:23:34:38.280968756 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [35047] [30/Oct/2020:23:34:38.285307220 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [35231] [30/Oct/2020:23:34:38.290132371 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb4e000000000000 / remotecsn:5f9cdb44000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [35365] [30/Oct/2020:23:34:38.521776432 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb44000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [35550] [30/Oct/2020:23:34:38.527277975 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - Schema checking successful: ok to push the schema (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [35665] [30/Oct/2020:23:34:38.862177953 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb4e0002:1604115278:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [35846] [30/Oct/2020:23:34:38.868920454 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [36002] [30/Oct/2020:23:34:38.875578454 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [36215] [30/Oct/2020:23:34:38.880521499 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4c000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [36396] [30/Oct/2020:23:34:38.888387026 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - 
_cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [36552] [30/Oct/2020:23:34:38.894221294 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [36765] [30/Oct/2020:23:34:38.910071284 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4e000000010000 5f9cdb4e DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [36927] [30/Oct/2020:23:34:38.915512412 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8500 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [37154] [30/Oct/2020:23:34:38.922555018 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8500 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [37450] [30/Oct/2020:23:34:38.935065176 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb4e000000010000) csnBuf (5f9cdb4c000000010000) csnConsumerMax (5f9cdb4c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [37563] [30/Oct/2020:23:34:38.941888326 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb4c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [37754] [30/Oct/2020:23:34:38.946891136 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb4c000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [37910] [30/Oct/2020:23:34:38.957771769 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [38022] [30/Oct/2020:23:34:38.963236101 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [38154] [30/Oct/2020:23:34:38.973613678 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [38374] [30/Oct/2020:23:34:38.986538058 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb4e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [38506] [30/Oct/2020:23:34:38.992847375 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [38704] [30/Oct/2020:23:34:39.000218243 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb4e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39019] [30/Oct/2020:23:34:39.004590782 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb4e000000010000) csnMax (5f9cdb4e000000010000) csnBuf (5f9cdb4e000000010000) csnConsumerMax (5f9cdb4e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39143] [30/Oct/2020:23:34:39.009359249 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39327] [30/Oct/2020:23:34:39.016338825 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39439] [30/Oct/2020:23:34:39.024562771 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39572] [30/Oct/2020:23:34:39.033190513 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39702] [30/Oct/2020:23:34:39.043683432 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 17, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39835] [30/Oct/2020:23:34:39.048382341 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [39968] [30/Oct/2020:23:34:39.061522754 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40101] [30/Oct/2020:23:34:39.069422954 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40234] [30/Oct/2020:23:34:39.085422069 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40367] [30/Oct/2020:23:34:39.098508363 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40500] [30/Oct/2020:23:34:39.119678752 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40613] [30/Oct/2020:23:34:39.133326573 -0400] - DEBUG - 
NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 17 17 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40722] [30/Oct/2020:23:34:39.157698698 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [40994] [30/Oct/2020:23:34:39.167386350 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [41158] [30/Oct/2020:23:34:39.177497241 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [41328] [30/Oct/2020:23:34:39.184855274 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [41502] [30/Oct/2020:23:34:39.199001689 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [41610] [30/Oct/2020:23:34:39.205920654 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [41725] [30/Oct/2020:23:34:39.215517151 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [41833] [30/Oct/2020:23:34:39.273707075 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. 
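
The number pairs in the repl5_inc_waitfor_async_results lines above ("0 17", then "17 17") appear to be the last result id read back by the result thread and the last message id sent; the supplier polls until the two match or a timeout expires. A toy model of that wait loop (an assumption for illustration, not the server code):

    import time

    def wait_for_async_results(last_sent, read_last_acked, timeout=30.0):
        """Poll until the results read back catch up with what was sent."""
        deadline = time.monotonic() + timeout
        acked = read_last_acked()          # e.g. starts at 0 with 17 sent
        while acked < last_sent and time.monotonic() < deadline:
            time.sleep(0.1)
            acked = read_last_acked()
        return acked >= last_sent          # True once "17 17" is reached
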
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42004] [30/Oct/2020:23:34:39.278735699 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now disabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42179] [30/Oct/2020:23:34:39.283302799 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42354] [30/Oct/2020:23:34:39.291058925 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42526] [30/Oct/2020:23:34:39.301453047 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42701] [30/Oct/2020:23:34:39.313219888 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42866] [30/Oct/2020:23:34:39.383446971 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_stop - agmt="cn=201" (ci-vm-10-0-139-146:39201): Protocol stopped after 0 seconds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [42994] [30/Oct/2020:23:34:39.392101520 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [43179] [30/Oct/2020:23:34:39.398129104 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4e000000010000 5f9cdb4e DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [43354] [30/Oct/2020:23:34:39.406159463 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [43462] [30/Oct/2020:23:34:39.410523172 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [43570] [30/Oct/2020:23:34:44.431444933 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. 
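
The two "Database RUV" lines above are the replica update vector printed when the protocol stops. The same data can be read over LDAP from the RUV tombstone entry under the suffix, as in this hedged python-ldap sketch (the URI comes from the log; the credentials are illustrative assumptions):

    import ldap

    conn = ldap.initialize("ldap://localhost.localdomain:39001")
    conn.simple_bind_s("cn=Directory Manager", "password")  # assumed credentials
    res = conn.search_s(
        "dc=example,dc=com",
        ldap.SCOPE_SUBTREE,
        "(&(nsuniqueid=ffffffff-ffffffff-ffffffff-ffffffff)(objectclass=nstombstone))",
        ["nsds50ruv"],
    )
    for dn, attrs in res:
        for val in attrs.get("nsds50ruv", []):
            # e.g. b"{replica 1 ldap://localhost.localdomain:39001} <min CSN> <max CSN>"
            print(val.decode())
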
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [43740] [30/Oct/2020:23:34:44.442255168 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now enabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [43915] [30/Oct/2020:23:34:44.448054283 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [44090] [30/Oct/2020:23:34:44.457881657 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [44262] [30/Oct/2020:23:34:44.463109356 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: start -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [44431] [30/Oct/2020:23:34:44.471690901 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Trying non-secure slapi_ldap_init_ext DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [44959] [30/Oct/2020:23:34:44.475846794 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - binddn = cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com, passwd = {AES-TUhNR0NTcUdTSWIzRFFFRkRUQm1NRVVHQ1NxR1NJYjNEUUVGRERBNEJDUmtaVFV5WlRsbVl5MWtORGhrTTJJdw0KTlMxaFl6VTNNamcwT1Mxak1XVm1aVEEyTlFBQ0FRSUNBU0F3Q2dZSUtvWklodmNOQWdjd0hRWUpZSVpJQVdVRA0KQkFFcUJCQm5MNVhSZWZLb2IwdEZOWDcwWklqZw==}sN+DIEdLHPeXyJgvw17eEySjTpZdrQ10sPM+rT39tkvclGGF7OHrp9/uJmsKBd6t DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45004] Imc7SLxVIB5pzs0VGR2mbWzANQ+AAE2PgoJE5Vl5OWU= DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45179] [30/Oct/2020:23:34:44.482551309 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45301] [30/Oct/2020:23:34:44.488901424 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb4e0002:1604115278:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45422] [30/Oct/2020:23:34:44.498001785 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb540000:1604115284:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45590] [30/Oct/2020:23:34:44.506156600 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45772] [30/Oct/2020:23:34:44.510966273 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [45956] [30/Oct/2020:23:34:44.517228529 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb4e000000000000 / remotecsn:5f9cdb44000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [46090] [30/Oct/2020:23:34:44.798716357 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb4e000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [46205] [30/Oct/2020:23:34:44.805233611 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb540001:1604115284:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [46386] [30/Oct/2020:23:34:44.818675442 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [46542] [30/Oct/2020:23:34:44.824836272 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [46755] [30/Oct/2020:23:34:44.829324936 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4e000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [46936] [30/Oct/2020:23:34:44.833593435 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [47092] [30/Oct/2020:23:34:44.840673810 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [47305] [30/Oct/2020:23:34:44.857381669 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4e000000010000 5f9cdb4e DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [47601] [30/Oct/2020:23:34:44.862571550 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax () csnMax (5f9cdb4e000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb4e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [47725] [30/Oct/2020:23:34:44.867633218 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [47997] 
[30/Oct/2020:23:34:44.872161283 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=0 sent=0 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [48146] [30/Oct/2020:23:34:44.879139827 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No changes to send DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [48310] [30/Oct/2020:23:34:44.891269709 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [48480] [30/Oct/2020:23:34:44.895456416 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [48654] [30/Oct/2020:23:34:44.899408642 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [48776] [30/Oct/2020:23:34:46.458216585 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb540001:1604115284:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [48897] [30/Oct/2020:23:34:46.462855120 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb560000:1604115286:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [49060] [30/Oct/2020:23:34:46.467663973 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb56000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [49240] [30/Oct/2020:23:34:46.472119483 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0ce000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [49485] [30/Oct/2020:23:34:46.481767506 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [49660] [30/Oct/2020:23:34:46.486372967 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb56000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [49798] [30/Oct/2020:23:34:46.490710411 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [49928] [30/Oct/2020:23:34:46.494778137 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb56000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [50066] [30/Oct/2020:23:34:46.501432626 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [50194] [30/Oct/2020:23:34:46.510747785 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [50309] [30/Oct/2020:23:34:46.518860494 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [50484] [30/Oct/2020:23:34:46.530988175 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [50667] [30/Oct/2020:23:34:46.535581925 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [50839] [30/Oct/2020:23:34:46.557323280 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [51007] [30/Oct/2020:23:34:46.567358727 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
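
The ruv_add_csn_inprogress / csnplCommitALL / ruv_update_ruv sequence above is the pending-CSN bookkeeping: a CSN joins a pending list when its operation starts, and the RUV max CSN rolls forward only once every earlier pending CSN has committed. A toy model of that invariant (an illustration, not the server code):

    class PendingCSNList:
        def __init__(self):
            self.pending = {}   # csn -> committed flag
            self.maxcsn = None  # newest CSN rolled into the RUV

        def insert(self, csn):
            self.pending[csn] = False       # ruv_add_csn_inprogress

        def commit(self, csn):
            self.pending[csn] = True        # csnplCommitALL
            # CSNs are fixed-width hex, so string order is CSN order; roll
            # the RUV forward through the committed prefix (ruv_update_ruv).
            for c in sorted(self.pending):
                if not self.pending[c]:
                    break
                self.maxcsn = c
                del self.pending[c]
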
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [51189] [30/Oct/2020:23:34:46.573014731 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [51304] [30/Oct/2020:23:34:46.583632086 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb560002:1604115286:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [51485] [30/Oct/2020:23:34:46.594520966 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [51641] [30/Oct/2020:23:34:46.601861573 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [51854] [30/Oct/2020:23:34:46.611654173 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb4e000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [52035] [30/Oct/2020:23:34:46.617899403 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [52191] [30/Oct/2020:23:34:46.623795176 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [52404] [30/Oct/2020:23:34:46.639147722 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb56000000010000 5f9cdb56 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [52566] [30/Oct/2020:23:34:46.644705935 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8700 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [52793] [30/Oct/2020:23:34:46.650705290 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8700 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [53089] [30/Oct/2020:23:34:46.655863317 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb56000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb4e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [53202] [30/Oct/2020:23:34:46.658759200 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb4e000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [53393] [30/Oct/2020:23:34:46.662237251 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb4e000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [53549] [30/Oct/2020:23:34:46.666801268 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [53661] [30/Oct/2020:23:34:46.671126407 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [53793] [30/Oct/2020:23:34:46.674883208 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [54013] [30/Oct/2020:23:34:46.678314545 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb56000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [54145] [30/Oct/2020:23:34:46.681710882 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [54343] [30/Oct/2020:23:34:46.685457864 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [54658] [30/Oct/2020:23:34:46.689971800 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb56000000010000) csnMax (5f9cdb56000000010000) csnBuf (5f9cdb56000000010000) csnConsumerMax (5f9cdb56000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [54782] [30/Oct/2020:23:34:46.699602335 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [54966] [30/Oct/2020:23:34:46.702936622 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55078] [30/Oct/2020:23:34:46.707241869 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55211] [30/Oct/2020:23:34:46.711755472 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55341] [30/Oct/2020:23:34:46.715187761 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 11, (null) DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55474] [30/Oct/2020:23:34:46.719123477 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55607] [30/Oct/2020:23:34:46.726616705 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55740] [30/Oct/2020:23:34:46.732442499 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [55873] [30/Oct/2020:23:34:46.739894552 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56006] [30/Oct/2020:23:34:46.752062780 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56139] [30/Oct/2020:23:34:46.772369585 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56272] [30/Oct/2020:23:34:46.808655471 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56385] [30/Oct/2020:23:34:46.812059756 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 11 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56494] [30/Oct/2020:23:34:46.876230149 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56766] [30/Oct/2020:23:34:46.880293913 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [56930] [30/Oct/2020:23:34:46.887558336 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57100] [30/Oct/2020:23:34:46.891316363 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57274] [30/Oct/2020:23:34:46.895866141 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57274] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 
57274
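
The _pattern_errorlog lines that make up the bulk of these captures come from a test helper that scans the server error log from a saved byte offset, echoes each line prefixed with its offset, and records where the scan stopped ("end at offset 57274" above) so that the next check can resume from the following byte. A rough sketch of such a helper (the name and details are illustrative, not the suite's exact code):

    import re

    def pattern_errorlog(path, pattern, start_offset=0):
        """Scan an error log from start_offset; return the last matching
        line and the offset where the scan ended."""
        found = None
        with open(path, "rb") as f:  # binary mode keeps offsets byte-accurate
            f.seek(start_offset)
            offset = start_offset
            for raw in f:
                offset += len(raw)
                if re.search(pattern, raw.decode("utf-8", "replace")):
                    found = raw
        return found, offset
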
Passed suites/schema/schema_replication_test.py::test_schema_replication_seven 9.75
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Same OC - extra MAY: Schema is pushed - no error INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'12' (expected 13) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'13' (expected 14) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:562 test_schema_replication_seven master_schema_csn=b'5f9cdb58000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:563 ctest_schema_replication_seven consumer_schema_csn=b'5f9cdb58000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 57275 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57382] 30/Oct/2020:23:34:47.952565373 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57490] [30/Oct/2020:23:34:48.060352681 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57612] [30/Oct/2020:23:34:48.085526056 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb560002:1604115286:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57733] [30/Oct/2020:23:34:48.096029888 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb580000:1604115288:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [57896] [30/Oct/2020:23:34:48.101180977 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb58000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [58076] [30/Oct/2020:23:34:48.106768206 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0d6000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [58321] [30/Oct/2020:23:34:48.114359802 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [58496] [30/Oct/2020:23:34:48.120402197 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb58000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [58634] [30/Oct/2020:23:34:48.125008091 -0400] - 
DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [58764] [30/Oct/2020:23:34:48.129003625 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [58902] [30/Oct/2020:23:34:48.132887424 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [59030] [30/Oct/2020:23:34:48.139797885 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [59145] [30/Oct/2020:23:34:48.146448937 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [59320] [30/Oct/2020:23:34:48.152883387 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [59503] [30/Oct/2020:23:34:48.157860627 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [59675] [30/Oct/2020:23:34:48.161456869 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [59843] [30/Oct/2020:23:34:48.172698270 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60025] [30/Oct/2020:23:34:48.178747511 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60209] [30/Oct/2020:23:34:48.182647050 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb58000000000000 / remotecsn:5f9cdb4e000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60372] [30/Oct/2020:23:34:48.355900657 -0400] - DEBUG - schema_oc_compare_strict - Attribute postOfficeBox is not allowed in 'masterNewOCA' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60506] [30/Oct/2020:23:34:48.448267212 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb4e000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60691] [30/Oct/2020:23:34:48.451784344 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - Schema checking successful: ok to push the schema (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60806] [30/Oct/2020:23:34:48.788975127 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb580002:1604115288:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [60987] [30/Oct/2020:23:34:48.814634543 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [61143] [30/Oct/2020:23:34:48.823385871 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [61356] [30/Oct/2020:23:34:48.827376020 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb56000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [61537] [30/Oct/2020:23:34:48.833587405 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [61693] [30/Oct/2020:23:34:48.838762846 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [61906] [30/Oct/2020:23:34:48.853670436 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb58000000010000 5f9cdb58 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [62068] [30/Oct/2020:23:34:48.860471404 -0400] - DEBUG - agmt="cn=201" 
(ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8700 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [62295] [30/Oct/2020:23:34:48.866118441 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8700 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [62591] [30/Oct/2020:23:34:48.870927340 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb58000000010000) csnBuf (5f9cdb56000000010000) csnConsumerMax (5f9cdb56000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [62704] [30/Oct/2020:23:34:48.875816143 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb56000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [62895] [30/Oct/2020:23:34:48.882626029 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb56000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [63051] [30/Oct/2020:23:34:48.905211700 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [63271] [30/Oct/2020:23:34:48.909925113 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb58000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [63383] [30/Oct/2020:23:34:48.914198913 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [63581] [30/Oct/2020:23:34:48.918229242 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb58000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [63896] [30/Oct/2020:23:34:48.923458828 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb58000000010000) csnMax (5f9cdb58000000010000) csnBuf (5f9cdb58000000010000) csnConsumerMax (5f9cdb58000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64020] [30/Oct/2020:23:34:48.927622200 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64204] [30/Oct/2020:23:34:48.934035934 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64316] 
[30/Oct/2020:23:34:48.939341294 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64448] [30/Oct/2020:23:34:48.943596975 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64581] [30/Oct/2020:23:34:48.951527594 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64711] [30/Oct/2020:23:34:48.959344700 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 18, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64844] [30/Oct/2020:23:34:48.963780423 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [64977] [30/Oct/2020:23:34:48.969570464 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [65110] [30/Oct/2020:23:34:48.976710292 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [65243] [30/Oct/2020:23:34:48.985694382 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [65376] [30/Oct/2020:23:34:48.997600973 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [65509] [30/Oct/2020:23:34:49.024594991 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [65622] [30/Oct/2020:23:34:49.045626340 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 18 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [65731] [30/Oct/2020:23:34:49.061824486 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66003] [30/Oct/2020:23:34:49.068506375 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66167] [30/Oct/2020:23:34:49.075735038 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66337] [30/Oct/2020:23:34:49.079633727 -0400] - DEBUG - NSMMReplicationPlugin - 
conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66511] [30/Oct/2020:23:34:49.083499201 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66619] [30/Oct/2020:23:34:49.159005127 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66790] [30/Oct/2020:23:34:49.163336221 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now disabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [66965] [30/Oct/2020:23:34:49.167106256 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [67140] [30/Oct/2020:23:34:49.170310827 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [67312] [30/Oct/2020:23:34:49.173821637 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [67487] [30/Oct/2020:23:34:49.181563942 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [67652] [30/Oct/2020:23:34:49.267264711 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_stop - agmt="cn=201" (ci-vm-10-0-139-146:39201): Protocol stopped after 0 seconds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [67780] [30/Oct/2020:23:34:49.271978991 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [67965] [30/Oct/2020:23:34:49.275704431 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb58000000010000 5f9cdb58 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [68140] [30/Oct/2020:23:34:49.280845719 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [68248] [30/Oct/2020:23:34:49.285787349 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [68356] [30/Oct/2020:23:34:54.305646294 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. 
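
The conn_push_schema lines earlier in this capture show the supplier-side schema check: compare the local schema CSN with the consumer's (the nsSchemaCSN attribute of cn=schema), re-read the remote value, and push only when the local schema is newer, while schema_oc_compare_strict reports objectclass-level differences such as postOfficeBox being absent from the remote masterNewOCA. A hedged sketch of the CSN comparison (URIs follow the log; credentials are illustrative assumptions):

    import ldap

    def get_schema_csn(uri):
        conn = ldap.initialize(uri)
        conn.simple_bind_s("cn=Directory Manager", "password")  # assumed
        res = conn.search_s("cn=schema", ldap.SCOPE_BASE,
                            "(objectclass=*)", ["nsSchemaCSN"])
        return res[0][1]["nsSchemaCSN"][0]

    local = get_schema_csn("ldap://localhost.localdomain:39001")  # supplier
    remote = get_schema_csn("ldap://ci-vm-10-0-139-146:39201")    # consumer
    if local > remote:  # fixed-width hex CSNs compare correctly as bytes
        print("ok to push the schema")
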
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [68526] [30/Oct/2020:23:34:54.312312108 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now enabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [68701] [30/Oct/2020:23:34:54.318696929 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [68876] [30/Oct/2020:23:34:54.327875431 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [69048] [30/Oct/2020:23:34:54.334831914 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: start -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [69217] [30/Oct/2020:23:34:54.346056889 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Trying non-secure slapi_ldap_init_ext DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [69745] [30/Oct/2020:23:34:54.349940366 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - binddn = cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com, passwd = {AES-TUhNR0NTcUdTSWIzRFFFRkRUQm1NRVVHQ1NxR1NJYjNEUUVGRERBNEJDUmtaVFV5WlRsbVl5MWtORGhrTTJJdw0KTlMxaFl6VTNNamcwT1Mxak1XVm1aVEEyTlFBQ0FRSUNBU0F3Q2dZSUtvWklodmNOQWdjd0hRWUpZSVpJQVdVRA0KQkFFcUJCQm5MNVhSZWZLb2IwdEZOWDcwWklqZw==}sN+DIEdLHPeXyJgvw17eEySjTpZdrQ10sPM+rT39tkvclGGF7OHrp9/uJmsKBd6t DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [69790] Imc7SLxVIB5pzs0VGR2mbWzANQ+AAE2PgoJE5Vl5OWU= DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [69965] [30/Oct/2020:23:34:54.354735385 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70087] [30/Oct/2020:23:34:54.363466819 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb580002:1604115288:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70208] [30/Oct/2020:23:34:54.367527366 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb5e0000:1604115294:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70376] [30/Oct/2020:23:34:54.372532784 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70558] [30/Oct/2020:23:34:54.376309069 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70742] [30/Oct/2020:23:34:54.386063745 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb58000000000000 / remotecsn:5f9cdb4e000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70876] [30/Oct/2020:23:34:54.625779547 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb58000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [70991] [30/Oct/2020:23:34:54.630657936 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb5e0001:1604115294:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [71172] [30/Oct/2020:23:34:54.635758318 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [71328] [30/Oct/2020:23:34:54.640605961 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [71541] [30/Oct/2020:23:34:54.647729715 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb58000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [71722] [30/Oct/2020:23:34:54.652558823 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [71878] [30/Oct/2020:23:34:54.662385580 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [72091] [30/Oct/2020:23:34:54.669566458 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb58000000010000 5f9cdb58 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [72387] [30/Oct/2020:23:34:54.674111606 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax () csnMax (5f9cdb58000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb58000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [72511] [30/Oct/2020:23:34:54.681571853 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [72783] 
[30/Oct/2020:23:34:54.686199760 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=0 sent=0 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [72932] [30/Oct/2020:23:34:54.694838326 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No changes to send DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [73096] [30/Oct/2020:23:34:54.709070079 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [73266] [30/Oct/2020:23:34:54.713705610 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [73440] [30/Oct/2020:23:34:54.719063728 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [73562] [30/Oct/2020:23:34:56.328831384 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb5e0001:1604115294:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [73683] [30/Oct/2020:23:34:56.333434657 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb600000:1604115296:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [73846] [30/Oct/2020:23:34:56.338055650 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb60000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74026] [30/Oct/2020:23:34:56.346845582 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0d8000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74271] [30/Oct/2020:23:34:56.351555256 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74446] [30/Oct/2020:23:34:56.357556025 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb60000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74584] [30/Oct/2020:23:34:56.368669651 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74714] [30/Oct/2020:23:34:56.373347446 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb60000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74852] [30/Oct/2020:23:34:56.383577749 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [74980] [30/Oct/2020:23:34:56.388303175 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [75095] [30/Oct/2020:23:34:56.392639786 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [75270] [30/Oct/2020:23:34:56.418669476 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [75453] [30/Oct/2020:23:34:56.423405346 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [75625] [30/Oct/2020:23:34:56.427798617 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [75793] [30/Oct/2020:23:34:56.433482560 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [75975] [30/Oct/2020:23:34:56.437104078 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [76090] [30/Oct/2020:23:34:56.440807843 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb600002:1604115296:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [76271] [30/Oct/2020:23:34:56.446411340 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [76427] [30/Oct/2020:23:34:56.452822819 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [76640] [30/Oct/2020:23:34:56.458072291 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb58000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [76821] [30/Oct/2020:23:34:56.461434329 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [76977] [30/Oct/2020:23:34:56.464816574 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [77190] [30/Oct/2020:23:34:56.468674157 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb60000000010000 5f9cdb60 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [77352] [30/Oct/2020:23:34:56.471911666 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8600 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [77579] [30/Oct/2020:23:34:56.475365600 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8600 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [77875] [30/Oct/2020:23:34:56.478905673 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb60000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb58000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [77988] [30/Oct/2020:23:34:56.484564068 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb58000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [78179] [30/Oct/2020:23:34:56.489428929 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb58000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [78335] [30/Oct/2020:23:34:56.492901579 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [78447] [30/Oct/2020:23:34:56.497640309 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [78579] [30/Oct/2020:23:34:56.502944091 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [78799] [30/Oct/2020:23:34:56.508033555 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb60000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [78931] [30/Oct/2020:23:34:56.519502062 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [79129] [30/Oct/2020:23:34:56.524303550 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [79444] [30/Oct/2020:23:34:56.531377764 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb60000000010000) csnMax (5f9cdb60000000010000) csnBuf (5f9cdb60000000010000) csnConsumerMax (5f9cdb60000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [79568] [30/Oct/2020:23:34:56.535509621 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [79752] [30/Oct/2020:23:34:56.539142242 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [79864] [30/Oct/2020:23:34:56.545357283 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [79996] [30/Oct/2020:23:34:56.553703063 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80129] [30/Oct/2020:23:34:56.561532644 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80259] [30/Oct/2020:23:34:56.565270836 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 11, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80392] [30/Oct/2020:23:34:56.568996217 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80525] [30/Oct/2020:23:34:56.573234336 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80658] [30/Oct/2020:23:34:56.582687582 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80791] [30/Oct/2020:23:34:56.591341253 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [80924] [30/Oct/2020:23:34:56.604546247 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [81057] [30/Oct/2020:23:34:56.626209557 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [81170] [30/Oct/2020:23:34:56.654012388 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 11 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [81279] [30/Oct/2020:23:34:56.663254362 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [81551] [30/Oct/2020:23:34:56.670858860 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [81715] [30/Oct/2020:23:34:56.686607055 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [81885] [30/Oct/2020:23:34:56.691136653 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82059] [30/Oct/2020:23:34:56.695185291 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82059] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 82059
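A note on reading the captured log above: the bracketed numbers such as [82059] are byte offsets into the server error log; the suite's _pattern_errorlog helper resumes scanning from the offset where the previous check stopped ("start at offset" / "end at offset"). The values that matter in these lines are the CSNs (e.g. 5f9cdb60000000010000). A CSN packs four hex fields - timestamp, sequence number, replica id, sub-sequence - which is why the _csngen_adjust_local_time entries pair 5f9cdb60 with 1604115296 (the same instant as a Unix timestamp). Below is a minimal decoding sketch in Python; the field layout is the only assumption beyond what the log itself shows.

# Minimal sketch: decode a 389-ds CSN as printed in the error log above.
# Assumes the standard layout: 8 hex digits of Unix time, a 4-digit
# sequence number, a 4-digit replica id, and a 4-digit sub-sequence.
from datetime import datetime, timezone

def decode_csn(csn):
    ts = int(csn[0:8], 16)         # seconds since the epoch
    seq = int(csn[8:12], 16)       # change sequence within that second
    rid = int(csn[12:16], 16)      # replica id that issued the change
    subseq = int(csn[16:20], 16)   # sub-sequence for compound operations
    return datetime.fromtimestamp(ts, tz=timezone.utc), seq, rid, subseq

# Example from the RUV lines above (23:34:56 -0400 == 03:34:56 UTC):
# decode_csn('5f9cdb60000000010000')
# -> (datetime(2020, 10, 31, 3, 34, 56, tzinfo=utc), 0, 1, 0)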
Passed suites/schema/schema_replication_test.py::test_schema_replication_eight 12.23
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Same OC - extra MAY: Schema is pushed (fix for 47721) INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'14' (expected 15) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'15' (expected 16) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:625 test_schema_replication_eight master_schema_csn=b'5f9cdb64000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:626 test_schema_replication_eight consumer_schema_csn=b'5f9cdb64000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 82060 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82167] [30/Oct/2020:23:35:00.099913867 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82275] [30/Oct/2020:23:35:00.217488078 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82397] [30/Oct/2020:23:35:00.235263006 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb600002:1604115296:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82518] [30/Oct/2020:23:35:00.255486717 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb640000:1604115300:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82681] [30/Oct/2020:23:35:00.260008250 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb64000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [82861] [30/Oct/2020:23:35:00.263976061 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0e0000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83106] [30/Oct/2020:23:35:00.268150878 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83281] [30/Oct/2020:23:35:00.271804358 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb64000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83419] [30/Oct/2020:23:35:00.275329496 -0400] - 
DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83549] [30/Oct/2020:23:35:00.278856585 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83687] [30/Oct/2020:23:35:00.283039122 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83815] [30/Oct/2020:23:35:00.286422686 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [83930] [30/Oct/2020:23:35:00.291333581 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [84105] [30/Oct/2020:23:35:00.297506951 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [84288] [30/Oct/2020:23:35:00.302761215 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [84460] [30/Oct/2020:23:35:00.308155066 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [84628] [30/Oct/2020:23:35:00.314460132 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [84810] [30/Oct/2020:23:35:00.319008907 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [84994] [30/Oct/2020:23:35:00.325337935 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb64000000000000 / remotecsn:5f9cdb58000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [85157] [30/Oct/2020:23:35:00.493605966 -0400] - DEBUG - schema_oc_compare_strict - Attribute postOfficeBox is not allowed in 'masterNewOCC' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [85322] [30/Oct/2020:23:35:00.498602636 -0400] - DEBUG - schema_oc_compare_strict - Attribute postOfficeBox is not allowed in 'consumerNewOCA' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [85476] [30/Oct/2020:23:35:00.505481025 -0400] - DEBUG - schema_oc_superset_check - Remote consumerNewOCA schema objectclasses is a superset of the received one. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [85639] [30/Oct/2020:23:35:00.513017093 -0400] - DEBUG - schema_oc_compare_strict - Attribute postOfficeBox is not allowed in 'masterNewOCC' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [85804] [30/Oct/2020:23:35:00.528036556 -0400] - DEBUG - schema_oc_compare_strict - Attribute postOfficeBox is not allowed in 'consumerNewOCA' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [85954] [30/Oct/2020:23:35:00.537315738 -0400] - DEBUG - schema_list_oc2learn - Add that unknown/extended objectclass consumerNewOCA (1.2.3.4.5.6.7.8.9.10.1) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [86254] [30/Oct/2020:23:35:00.556118969 -0400] - DEBUG - schema_oc_to_string - Replace (old[265]=( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP 'person' AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) MAY ( postalCode $ street ) X-ORIGIN 'blahblahblah' )) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [86583] [30/Oct/2020:23:35:00.560857724 -0400] - DEBUG - supplier_get_new_definitions - supplier takes objectclass: ( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP person AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) MAY ( postalCode $ street $ postOfficeBox ) X-ORIGIN 'user defined' ) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [86899] [30/Oct/2020:23:35:00.579972921 -0400] - DEBUG - modify_schema_prepare_mods - MOD[1] del (objectclasses): ( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP 'person' AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) MAY ( postalCode $ street ) X-ORIGIN 'blahblahblah' ) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: 
[87226] [30/Oct/2020:23:35:00.583852539 -0400] - DEBUG - modify_schema_prepare_mods - MOD[0] add (objectclasses): ( 1.2.3.4.5.6.7.8.9.10.1 NAME 'consumerNewOCA' DESC 'To test ticket 47490' SUP person AUXILIARY MUST ( postalAddress $ preferredLocale $ telexNumber ) MAY ( postalCode $ street $ postOfficeBox ) X-ORIGIN 'user defined' ) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [87334] [30/Oct/2020:23:35:00.590606667 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [87457] [30/Oct/2020:23:35:00.596930221 -0400] - DEBUG - modify_schema_internal_mod - Successfully learn objectclasses definitions DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [87671] [30/Oct/2020:23:35:00.601803082 -0400] - ERR - NSMMReplicationPlugin - update_consumer_schema - [S] Schema agmt="cn=201" (ci-vm-10-0-139-146:39201) must not be overwritten (set replication log for additional info) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 _pattern_errorlog: end at offset 87671
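test_schema_replication_eight passes because master_schema_csn and consumer_schema_csn converge on the same value (b'5f9cdb64000000000000' above); both are read from the nsSchemaCSN attribute of the cn=schema entry on each server, which is also the CSN the conn_push_schema messages compare. A minimal sketch of that check with plain python-ldap follows; it is illustrative only - the suite itself goes through lib389, and the URIs and credentials below are placeholders.

# Minimal sketch: compare the schema CSN on supplier and consumer.
# The suite uses lib389 for this; plain python-ldap is shown here to
# keep the example self-contained. URIs/credentials are placeholders.
import ldap

def schema_csn(uri, binddn, password):
    conn = ldap.initialize(uri)
    conn.simple_bind_s(binddn, password)
    # nsSchemaCSN is an operational attribute, so request it explicitly.
    res = conn.search_s('cn=schema', ldap.SCOPE_BASE,
                        '(objectClass=*)', ['nsSchemaCSN'])
    conn.unbind_s()
    return res[0][1]['nsSchemaCSN'][0]

# supplier = schema_csn('ldap://localhost:39001', 'cn=Directory Manager', PASSWORD)
# consumer = schema_csn('ldap://localhost:39201', 'cn=Directory Manager', PASSWORD)
# assert supplier == consumer   # schema was pushed (or learned) successfully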
Passed suites/schema/schema_replication_test.py::test_schema_replication_nine 9.75
-------------------------------Captured log call--------------------------------
INFO  lib389:schema_replication_test.py:41 ############################################### INFO  lib389:schema_replication_test.py:42 ####### INFO  lib389:schema_replication_test.py:43 ####### Same OC - extra MAY: Schema is pushed - no error INFO  lib389:schema_replication_test.py:44 ####### INFO  lib389:schema_replication_test.py:45 ################################################### DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'16' (expected 17) INFO  lib389:agreement.py:1194 Pausing replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=201,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:152 trigger_update: receive b'17' (expected 18) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:683 test_schema_replication_nine master_schema_csn=b'5f9cdb6e000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:684 test_schema_replication_nine consumer_schema_csn=b'5f9cdb6e000000000000' DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:55 _pattern_errorlog: start at offset 87672 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [87808] [30/Oct/2020:23:35:00.609746939 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] schema definitions may have been learned DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [87971] [30/Oct/2020:23:35:00.787004766 -0400] - DEBUG - schema_oc_compare_strict - Attribute postOfficeBox is not allowed in 'masterNewOCC' of the remote consumer schema DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88105] [30/Oct/2020:23:35:00.883633711 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb61000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88290] [30/Oct/2020:23:35:00.888621928 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - Schema checking successful: ok to push the schema (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88405] [30/Oct/2020:23:35:01.101789933 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb640002:1604115300:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88527] [30/Oct/2020:23:35:01.109365611 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb640002:1604115300:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88648] [30/Oct/2020:23:35:01.115109813 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb650000:1604115301:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88829] [30/Oct/2020:23:35:01.120025365 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [88985] [30/Oct/2020:23:35:01.124502277 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" 
(ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [89198] [30/Oct/2020:23:35:01.128606428 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb60000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [89379] [30/Oct/2020:23:35:01.132604184 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [89535] [30/Oct/2020:23:35:01.137497249 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [89748] [30/Oct/2020:23:35:01.143367424 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb64000000010000 5f9cdb64 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [89910] [30/Oct/2020:23:35:01.148589576 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8600 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [90137] [30/Oct/2020:23:35:01.156599872 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8600 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [90433] [30/Oct/2020:23:35:01.161075582 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb64000000010000) csnBuf (5f9cdb60000000010000) csnConsumerMax (5f9cdb60000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [90546] [30/Oct/2020:23:35:01.166529408 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb60000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [90737] [30/Oct/2020:23:35:01.170706677 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb60000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [90893] [30/Oct/2020:23:35:01.175057693 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [91005] [30/Oct/2020:23:35:01.182224524 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [91137] [30/Oct/2020:23:35:01.187323522 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [91357] [30/Oct/2020:23:35:01.192938527 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb64000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [91489] [30/Oct/2020:23:35:01.200089832 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [91687] [30/Oct/2020:23:35:01.204461428 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb64000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92002] [30/Oct/2020:23:35:01.209172914 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb64000000010000) csnMax (5f9cdb64000000010000) csnBuf (5f9cdb64000000010000) csnConsumerMax (5f9cdb64000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92126] [30/Oct/2020:23:35:01.215488825 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92310] [30/Oct/2020:23:35:01.219313120 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92422] [30/Oct/2020:23:35:01.224194308 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92555] [30/Oct/2020:23:35:01.232628098 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92685] [30/Oct/2020:23:35:01.240525928 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 20, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92818] [30/Oct/2020:23:35:01.249443474 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [92951] [30/Oct/2020:23:35:01.255087864 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93084] [30/Oct/2020:23:35:01.266418289 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93217] [30/Oct/2020:23:35:01.279141657 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93350] [30/Oct/2020:23:35:01.292743915 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93458] [30/Oct/2020:23:35:01.304222975 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93629] [30/Oct/2020:23:35:01.311206581 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now disabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93762] [30/Oct/2020:23:35:01.316099292 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93875] [30/Oct/2020:23:35:01.332819871 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 20 20 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [93984] [30/Oct/2020:23:35:01.352293280 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [94256] [30/Oct/2020:23:35:01.356690747 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [94420] [30/Oct/2020:23:35:01.366546499 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [94590] [30/Oct/2020:23:35:01.374688837 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [94764] [30/Oct/2020:23:35:01.380178724 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [94939] [30/Oct/2020:23:35:01.384463305 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [95111] [30/Oct/2020:23:35:01.391830940 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [95286] [30/Oct/2020:23:35:01.397380549 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 
_pattern_errorlog: [95451] [30/Oct/2020:23:35:01.416327964 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_stop - agmt="cn=201" (ci-vm-10-0-139-146:39201): Protocol stopped after 0 seconds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [95579] [30/Oct/2020:23:35:01.420961797 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [95764] [30/Oct/2020:23:35:01.425694000 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb64000000010000 5f9cdb64 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [95939] [30/Oct/2020:23:35:01.429669662 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96047] [30/Oct/2020:23:35:01.435249564 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96155] [30/Oct/2020:23:35:06.451836281 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96325] [30/Oct/2020:23:35:06.686138305 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now enabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96500] [30/Oct/2020:23:35:06.691066725 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96675] [30/Oct/2020:23:35:06.695510561 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [96847] [30/Oct/2020:23:35:06.700154441 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: start -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [97016] [30/Oct/2020:23:35:06.705053266 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Trying non-secure slapi_ldap_init_ext DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [97544] [30/Oct/2020:23:35:06.711266021 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - binddn = cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com, passwd = {AES-TUhNR0NTcUdTSWIzRFFFRkRUQm1NRVVHQ1NxR1NJYjNEUUVGRERBNEJDUmtaVFV5WlRsbVl5MWtORGhrTTJJdw0KTlMxaFl6VTNNamcwT1Mxak1XVm1aVEEyTlFBQ0FRSUNBU0F3Q2dZSUtvWklodmNOQWdjd0hRWUpZSVpJQVdVRA0KQkFFcUJCQm5MNVhSZWZLb2IwdEZOWDcwWklqZw==}sN+DIEdLHPeXyJgvw17eEySjTpZdrQ10sPM+rT39tkvclGGF7OHrp9/uJmsKBd6t DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 
_pattern_errorlog: [97589] Imc7SLxVIB5pzs0VGR2mbWzANQ+AAE2PgoJE5Vl5OWU= DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [97764] [30/Oct/2020:23:35:06.719279892 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [97886] [30/Oct/2020:23:35:06.725661056 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb650000:1604115301:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98007] [30/Oct/2020:23:35:06.730840114 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb6a0000:1604115306:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98175] [30/Oct/2020:23:35:06.736683069 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98357] [30/Oct/2020:23:35:06.741398760 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98541] [30/Oct/2020:23:35:06.746184181 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb64000000000000 / remotecsn:5f9cdb61000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98675] [30/Oct/2020:23:35:06.978384079 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb64000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98790] [30/Oct/2020:23:35:06.983287281 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb6a0001:1604115306:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [98971] [30/Oct/2020:23:35:06.988251636 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [99127] [30/Oct/2020:23:35:06.992401627 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [99340] [30/Oct/2020:23:35:06.996796574 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb64000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [99521] [30/Oct/2020:23:35:07.003359342 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [99677] [30/Oct/2020:23:35:07.008130913 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 
5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [99890] [30/Oct/2020:23:35:07.011897942 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb64000000010000 5f9cdb64 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [100186] [30/Oct/2020:23:35:07.016162304 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax () csnMax (5f9cdb64000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb64000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [100310] [30/Oct/2020:23:35:07.020272690 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [100582] [30/Oct/2020:23:35:07.026471528 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=0 sent=0 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [100731] [30/Oct/2020:23:35:07.030519886 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No changes to send DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [100895] [30/Oct/2020:23:35:07.038017419 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [101065] [30/Oct/2020:23:35:07.043197435 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [101239] [30/Oct/2020:23:35:07.047725078 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [101361] [30/Oct/2020:23:35:08.700884453 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb6a0001:1604115306:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [101482] [30/Oct/2020:23:35:08.711090109 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb6c0000:1604115308:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [101645] [30/Oct/2020:23:35:08.717307234 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb6c000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [101825] [30/Oct/2020:23:35:08.721748276 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0e4000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102070] [30/Oct/2020:23:35:08.727148218 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102245] [30/Oct/2020:23:35:08.731433362 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb6c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102383] [30/Oct/2020:23:35:08.736658023 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102513] [30/Oct/2020:23:35:08.740771515 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102651] [30/Oct/2020:23:35:08.745051528 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102779] [30/Oct/2020:23:35:08.750313038 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [102894] [30/Oct/2020:23:35:08.754826531 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [103069] [30/Oct/2020:23:35:08.764382442 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [103252] [30/Oct/2020:23:35:08.771680127 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [103424] [30/Oct/2020:23:35:08.777937598 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [103592] [30/Oct/2020:23:35:08.784787328 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [103774] [30/Oct/2020:23:35:08.789260169 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [103889] [30/Oct/2020:23:35:08.793989766 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb6c0002:1604115308:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [104070] [30/Oct/2020:23:35:08.798226674 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [104226] [30/Oct/2020:23:35:08.802834944 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [104439] [30/Oct/2020:23:35:08.807900301 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb64000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [104620] [30/Oct/2020:23:35:08.812907073 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [104776] [30/Oct/2020:23:35:08.817670881 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [104989] [30/Oct/2020:23:35:08.821812099 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6c000000010000 5f9cdb6c DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [105151] [30/Oct/2020:23:35:08.826098639 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8800 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [105378] [30/Oct/2020:23:35:08.832668438 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8800 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [105674] [30/Oct/2020:23:35:08.840241073 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb6c000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb64000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [105787] [30/Oct/2020:23:35:08.856213871 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb64000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [105978] [30/Oct/2020:23:35:08.870098719 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb64000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [106134] [30/Oct/2020:23:35:08.874684631 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [106246] [30/Oct/2020:23:35:08.882233588 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [106466] [30/Oct/2020:23:35:08.887752747 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb6c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [106598] [30/Oct/2020:23:35:08.893889858 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [106796] [30/Oct/2020:23:35:08.899023087 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [106928] [30/Oct/2020:23:35:08.908293255 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [107243] [30/Oct/2020:23:35:08.913715353 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb6c000000010000) csnMax (5f9cdb6c000000010000) csnBuf (5f9cdb6c000000010000) csnConsumerMax (5f9cdb6c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [107367] [30/Oct/2020:23:35:08.919177534 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [107551] [30/Oct/2020:23:35:08.926686869 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [107663] [30/Oct/2020:23:35:08.931216949 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [107796] [30/Oct/2020:23:35:08.942755385 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [107926] [30/Oct/2020:23:35:08.947070990 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 11, (null) DEBUG 
 tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108059] [30/Oct/2020:23:35:08.958608082 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108192] [30/Oct/2020:23:35:08.967756521 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108325] [30/Oct/2020:23:35:08.976111885 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108458] [30/Oct/2020:23:35:08.985276791 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108591] [30/Oct/2020:23:35:08.999907097 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108724] [30/Oct/2020:23:35:09.022851730 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108837] [30/Oct/2020:23:35:09.043890949 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 11 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [108945] [30/Oct/2020:23:35:09.053753213 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [109060] [30/Oct/2020:23:35:09.062815347 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [109169] [30/Oct/2020:23:35:09.069856110 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [109441] [30/Oct/2020:23:35:09.080283887 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [109605] [30/Oct/2020:23:35:09.105592515 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [109775] [30/Oct/2020:23:35:09.112536807 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [109949] [30/Oct/2020:23:35:09.122283564 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110057] [30/Oct/2020:23:35:10.220824209 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110165] [30/Oct/2020:23:35:10.317446404 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110287] [30/Oct/2020:23:35:10.337423561 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb6c0002:1604115308:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110408] [30/Oct/2020:23:35:10.343718408 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb6e0000:1604115310:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110571] [30/Oct/2020:23:35:10.349621745 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb6e000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110751] [30/Oct/2020:23:35:10.354750174 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0ec000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [110996] [30/Oct/2020:23:35:10.364037771 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111171] [30/Oct/2020:23:35:10.375834532 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb6e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111309] [30/Oct/2020:23:35:10.387124951 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111439] [30/Oct/2020:23:35:10.391090821 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111577] [30/Oct/2020:23:35:10.395234293 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111705] [30/Oct/2020:23:35:10.402249875 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111820] [30/Oct/2020:23:35:10.409145895 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [111995] [30/Oct/2020:23:35:10.417629551 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [112178] [30/Oct/2020:23:35:10.422977827 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [112350] [30/Oct/2020:23:35:10.427782062 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [112518] [30/Oct/2020:23:35:10.433146923 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [112700] [30/Oct/2020:23:35:10.443117434 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [112884] [30/Oct/2020:23:35:10.448776235 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb6e000000000000 / remotecsn:5f9cdb64000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113018] [30/Oct/2020:23:35:10.722714757 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb64000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113203] [30/Oct/2020:23:35:10.764084698 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - Schema checking successful: ok to push the schema (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113318] [30/Oct/2020:23:35:11.120152550 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb6e0002:1604115310:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113440] [30/Oct/2020:23:35:11.127036758 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb6e0002:1604115310:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113561] [30/Oct/2020:23:35:11.134297229 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb6f0000:1604115311:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113742] [30/Oct/2020:23:35:11.139619859 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [113898] [30/Oct/2020:23:35:11.144797864 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [114111] [30/Oct/2020:23:35:11.149068284 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6c000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [114292] [30/Oct/2020:23:35:11.154899153 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [114448] [30/Oct/2020:23:35:11.160109434 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [114661] [30/Oct/2020:23:35:11.166813575 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6e000000010000 5f9cdb6e DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [114823] [30/Oct/2020:23:35:11.171918649 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8800 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [115050] [30/Oct/2020:23:35:11.176478159 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8800 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [115346] [30/Oct/2020:23:35:11.180771335 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb6e000000010000) csnBuf (5f9cdb6c000000010000) csnConsumerMax (5f9cdb6c000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [115459] [30/Oct/2020:23:35:11.185907609 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb6c000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [115650] [30/Oct/2020:23:35:11.190190672 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb6c000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [115806] [30/Oct/2020:23:35:11.194394094 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [116026] [30/Oct/2020:23:35:11.200151762 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb6e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [116224] [30/Oct/2020:23:35:11.208094432 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" 
(ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb6e000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [116336] [30/Oct/2020:23:35:11.212946680 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [116651] [30/Oct/2020:23:35:11.219324124 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb6e000000010000) csnMax (5f9cdb6e000000010000) csnBuf (5f9cdb6e000000010000) csnConsumerMax (5f9cdb6e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [116775] [30/Oct/2020:23:35:11.233116572 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [116959] [30/Oct/2020:23:35:11.242340307 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117071] [30/Oct/2020:23:35:11.246680448 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117204] [30/Oct/2020:23:35:11.251531558 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117334] [30/Oct/2020:23:35:11.258881078 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 18, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117467] [30/Oct/2020:23:35:11.263444211 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117600] [30/Oct/2020:23:35:11.273649021 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117733] [30/Oct/2020:23:35:11.281703599 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117866] [30/Oct/2020:23:35:11.293102724 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [117999] [30/Oct/2020:23:35:11.306363234 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [118132] [30/Oct/2020:23:35:11.326787178 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [118245] 
[30/Oct/2020:23:35:11.351690072 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 18 18 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [118354] [30/Oct/2020:23:35:11.365392994 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [118626] [30/Oct/2020:23:35:11.371157814 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [118790] [30/Oct/2020:23:35:11.392648515 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [118960] [30/Oct/2020:23:35:11.403024776 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [119134] [30/Oct/2020:23:35:11.408362232 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [119242] [30/Oct/2020:23:35:11.426192237 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [119413] [30/Oct/2020:23:35:11.434609572 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now disabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [119588] [30/Oct/2020:23:35:11.442119836 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [119763] [30/Oct/2020:23:35:11.452235228 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [119935] [30/Oct/2020:23:35:11.458839438 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [120110] [30/Oct/2020:23:35:11.465258819 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [120275] [30/Oct/2020:23:35:11.542310378 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_stop - agmt="cn=201" (ci-vm-10-0-139-146:39201): Protocol stopped after 0 seconds DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: 
[120403] [30/Oct/2020:23:35:11.548313796 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [120588] [30/Oct/2020:23:35:11.554359775 -0400] - DEBUG - NSMMReplicationPlugin - Database RUV: {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6e000000010000 5f9cdb6e DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [120763] [30/Oct/2020:23:35:11.560064505 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [120871] [30/Oct/2020:23:35:11.564621076 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [120979] [30/Oct/2020:23:35:16.589133375 -0400] - DEBUG - replication - copy_operation_parameters - replica is null. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [121149] [30/Oct/2020:23:35:16.594803098 -0400] - DEBUG - NSMMReplicationPlugin - agmt_set_enabled_from_entry: agreement is now enabled (agmt="cn=201" (ci-vm-10-0-139-146:39201)) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [121324] [30/Oct/2020:23:35:16.600910344 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [121499] [30/Oct/2020:23:35:16.605677716 -0400] - DEBUG - NSMMReplicationPlugin - close_connection_internal - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Disconnected from the consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [121671] [30/Oct/2020:23:35:16.610907156 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: start -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [121840] [30/Oct/2020:23:35:16.616646820 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Trying non-secure slapi_ldap_init_ext DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [122368] [30/Oct/2020:23:35:16.623225521 -0400] - DEBUG - NSMMReplicationPlugin - conn_connect - agmt="cn=201" (ci-vm-10-0-139-146:39201) - binddn = cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com, passwd = {AES-TUhNR0NTcUdTSWIzRFFFRkRUQm1NRVVHQ1NxR1NJYjNEUUVGRERBNEJDUmtaVFV5WlRsbVl5MWtORGhrTTJJdw0KTlMxaFl6VTNNamcwT1Mxak1XVm1aVEEyTlFBQ0FRSUNBU0F3Q2dZSUtvWklodmNOQWdjd0hRWUpZSVpJQVdVRA0KQkFFcUJCQm5MNVhSZWZLb2IwdEZOWDcwWklqZw==}sN+DIEdLHPeXyJgvw17eEySjTpZdrQ10sPM+rT39tkvclGGF7OHrp9/uJmsKBd6t DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [122413] Imc7SLxVIB5pzs0VGR2mbWzANQ+AAE2PgoJE5Vl5OWU= DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [122588] [30/Oct/2020:23:35:16.635936075 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - 
agmt="cn=201" (ci-vm-10-0-139-146:39201) - No linger to cancel on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [122710] [30/Oct/2020:23:35:16.642719505 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb6f0000:1604115311:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [122831] [30/Oct/2020:23:35:16.648038625 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb740000:1604115316:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [122999] [30/Oct/2020:23:35:16.656558060 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [123181] [30/Oct/2020:23:35:16.661661882 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [123365] [30/Oct/2020:23:35:16.671206115 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Checking consumer schema localcsn:5f9cdb6e000000000000 / remotecsn:5f9cdb64000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [123499] [30/Oct/2020:23:35:16.937594933 -0400] - DEBUG - NSMMReplicationPlugin - conn_push_schema - [S] Reread remotecsn:5f9cdb6e000000000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [123614] [30/Oct/2020:23:35:16.942356266 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb740001:1604115316:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [123795] [30/Oct/2020:23:35:16.949182424 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [123951] [30/Oct/2020:23:35:16.953860233 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [124164] [30/Oct/2020:23:35:16.960708674 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6e000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [124345] [30/Oct/2020:23:35:16.965079776 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [124501] [30/Oct/2020:23:35:16.970780257 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [124714] [30/Oct/2020:23:35:16.976442055 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 
ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6e000000010000 5f9cdb6e DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [125010] [30/Oct/2020:23:35:16.982820649 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax () csnMax (5f9cdb6e000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb6e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [125134] [30/Oct/2020:23:35:16.987978868 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [125406] [30/Oct/2020:23:35:16.994112823 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=0 sent=0 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [125555] [30/Oct/2020:23:35:16.998915816 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No changes to send DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [125719] [30/Oct/2020:23:35:17.008412641 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [125889] [30/Oct/2020:23:35:17.015581371 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [126063] [30/Oct/2020:23:35:17.020850719 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [126185] [30/Oct/2020:23:35:18.610437584 -0400] - DEBUG - _csngen_adjust_local_time - gen state before 5f9cdb740001:1604115316:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [126306] [30/Oct/2020:23:35:18.619525203 -0400] - DEBUG - _csngen_adjust_local_time - gen state after 5f9cdb760000:1604115318:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [126469] [30/Oct/2020:23:35:18.624884073 -0400] - DEBUG - NSMMReplicationPlugin - ruv_add_csn_inprogress - Successfully inserted csn 5f9cdb76000000010000 into pending list DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [126649] [30/Oct/2020:23:35:18.629079912 -0400] - DEBUG - NSMMReplicationPlugin - purge_entry_state_information - From entry cn=test_entry,dc=example,dc=com up to CSN 5f93a0ee000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [126894] [30/Oct/2020:23:35:18.633075394 -0400] - DEBUG - NSMMReplicationPlugin - write_changelog_and_ruv - Writing change for cn=test_entry,dc=example,dc=com (uniqid: d5d44709-1b2911eb-b8fbcfd7-92dab025, optype: 8) to changelog csn 
5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127069] [30/Oct/2020:23:35:18.636627753 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - cl5WriteOperationTxn - Successfully written entry with csn (5f9cdb76000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127207] [30/Oct/2020:23:35:18.640355453 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: committing all csns for csn 5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127337] [30/Oct/2020:23:35:18.645110820 -0400] - DEBUG - NSMMReplicationPlugin - csnplCommitALL: processing data csn 5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127475] [30/Oct/2020:23:35:18.649246272 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Successfully committed csn 5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127603] [30/Oct/2020:23:35:18.653481931 -0400] - DEBUG - NSMMReplicationPlugin - ruv_update_ruv - Rolled up to csn 5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127718] [30/Oct/2020:23:35:18.657821919 -0400] - DEBUG - replication - multimaster_mmr_postop - error 0 for operation 561. DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [127893] [30/Oct/2020:23:35:18.677714525 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [128076] [30/Oct/2020:23:35:18.693658936 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: wait_for_changes -> ready_to_acquire_replica DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [128248] [30/Oct/2020:23:35:18.698367744 -0400] - DEBUG - NSMMReplicationPlugin - conn_cancel_linger - agmt="cn=201" (ci-vm-10-0-139-146:39201) - Canceling linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [128416] [30/Oct/2020:23:35:18.707056158 -0400] - DEBUG - NSMMReplicationPlugin - acquire_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Replica was successfully acquired. 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [128598] [30/Oct/2020:23:35:18.713651234 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: ready_to_acquire_replica -> sending_updates DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [128713] [30/Oct/2020:23:35:18.717016028 -0400] - DEBUG - csngen_adjust_time - gen state before 5f9cdb760002:1604115318:0:0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [128894] [30/Oct/2020:23:35:18.724987140 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Consumer RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [129050] [30/Oct/2020:23:35:18.727934261 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [129263] [30/Oct/2020:23:35:18.733907438 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb6e000000010000 00000000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [129444] [30/Oct/2020:23:35:18.739203789 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - _cl5PositionCursorForReplay - (agmt="cn=201" (ci-vm-10-0-139-146:39201)): Supplier RUV: DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [129600] [30/Oct/2020:23:35:18.743698683 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replicageneration} 5f9cdb11000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [129813] [30/Oct/2020:23:35:18.746909684 -0400] - DEBUG - NSMMReplicationPlugin - agmt="cn=201" (ci-vm-10-0-139-146:39201): {replica 1 ldap://localhost.localdomain:39001} 5f9cdb11000100010000 5f9cdb76000000010000 5f9cdb76 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [129975] [30/Oct/2020:23:35:18.753100726 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - found thread private buffer cache 0x7f328f6a8900 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [130202] [30/Oct/2020:23:35:18.756442236 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_buffer - _pool is 0x7f32bd1fb2e0 _pool->pl_busy_lists is 0x7f328f704ae0 _pool->pl_busy_lists->bl_buffers is 0x7f328f6a8900 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [130498] [30/Oct/2020:23:35:18.760341615 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_initial_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 0) - csnPrevMax () csnMax (5f9cdb76000000010000) csnBuf (00000000000000000000) csnConsumerMax (5f9cdb6e000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [130611] [30/Oct/2020:23:35:18.765041902 -0400] - DEBUG - clcache_initial_anchorcsn - anchor is now: 5f9cdb6e000000010000 DEBUG  
tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [130802] [30/Oct/2020:23:35:18.768995102 -0400] - DEBUG - NSMMReplicationPlugin - changelog program - agmt="cn=201" (ci-vm-10-0-139-146:39201): CSN 5f9cdb6e000000010000 found, position set for replay DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [130958] [30/Oct/2020:23:35:18.773545848 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_get_next_change - load=1 rec=1 csn=5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [131070] [30/Oct/2020:23:35:18.777275704 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Starting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [131202] [30/Oct/2020:23:35:18.782273943 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [131422] [30/Oct/2020:23:35:18.786611642 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Sending modify operation (dn="cn=test_entry,dc=example,dc=com" csn=5f9cdb76000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [131554] [30/Oct/2020:23:35:18.790678115 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [131752] [30/Oct/2020:23:35:18.797564502 -0400] - DEBUG - NSMMReplicationPlugin - replay_update - agmt="cn=201" (ci-vm-10-0-139-146:39201): Consumer successfully sent operation with csn 5f9cdb76000000010000 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [131884] [30/Oct/2020:23:35:18.808537112 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [132199] [30/Oct/2020:23:35:18.813546522 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_adjust_anchorcsn - agmt="cn=201" (ci-vm-10-0-139-146:39201) - (cscb 0 - state 1) - csnPrevMax (5f9cdb76000000010000) csnMax (5f9cdb76000000010000) csnBuf (5f9cdb76000000010000) csnConsumerMax (5f9cdb76000000010000) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [132323] [30/Oct/2020:23:35:18.822887022 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_load_buffer - rc=-30988 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [132507] [30/Oct/2020:23:35:18.839328869 -0400] - DEBUG - NSMMReplicationPlugin - send_updates - agmt="cn=201" (ci-vm-10-0-139-146:39201): No more updates to send (cl5GetNextOperationToReplay) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [132619] [30/Oct/2020:23:35:18.843333470 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 0 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [132752] [30/Oct/2020:23:35:18.848047830 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 
DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [132882] [30/Oct/2020:23:35:18.853067333 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Result 3, 0, 0, 11, (null) DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133015] [30/Oct/2020:23:35:18.857368160 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133148] [30/Oct/2020:23:35:18.864596528 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133281] [30/Oct/2020:23:35:18.870785873 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133414] [30/Oct/2020:23:35:18.881428854 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133547] [30/Oct/2020:23:35:18.894147759 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133680] [30/Oct/2020:23:35:18.919422541 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain - Read result for message_id 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133793] [30/Oct/2020:23:35:18.948618194 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_waitfor_async_results - 11 11 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [133902] [30/Oct/2020:23:35:18.955679716 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_result_threadmain exiting DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [134174] [30/Oct/2020:23:35:18.963056568 -0400] - DEBUG - agmt="cn=201" (ci-vm-10-0-139-146:39201) - clcache_return_buffer - session end: state=5 load=1 sent=1 skipped=0 skipped_new_rid=0 skipped_csn_gt_cons_maxcsn=0 skipped_up_to_date=0 skipped_csn_gt_ruv=0 skipped_csn_covered=0 DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [134338] [30/Oct/2020:23:35:18.970664096 -0400] - DEBUG - NSMMReplicationPlugin - release_replica - agmt="cn=201" (ci-vm-10-0-139-146:39201): Successfully released consumer DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [134508] [30/Oct/2020:23:35:18.977818704 -0400] - DEBUG - NSMMReplicationPlugin - conn_start_linger -agmt="cn=201" (ci-vm-10-0-139-146:39201) - Beginning linger on the connection DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [134682] [30/Oct/2020:23:35:18.989043399 -0400] - DEBUG - NSMMReplicationPlugin - repl5_inc_run - agmt="cn=201" (ci-vm-10-0-139-146:39201): State: sending_updates -> wait_for_changes DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:62 _pattern_errorlog: [134682] DEBUG  tests.suites.schema.schema_replication_test:schema_replication_test.py:67 
_pattern_errorlog: end at offset 134682 INFO  tests.suites.schema.schema_replication_test:schema_replication_test.py:693 Testcase PASSED
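The _pattern_errorlog entries above are produced by a helper that tails the server error log from a saved byte offset, echoing each line it scans and reporting where it stopped ("end at offset 134682") so the next check can resume there. A minimal sketch of that idea in Python; the function name, signature, and trace format are assumptions for illustration, not the suite's actual implementation:

import re

def scan_errorlog(path, pattern, start_offset=0, debug=print):
    # Hypothetical offset-based error-log scanner: resume at the offset
    # returned by the previous call, echo each scanned line, and report
    # whether `pattern` was seen.
    regex = re.compile(pattern)
    matched = False
    with open(path, errors="replace") as f:
        f.seek(start_offset)
        while True:
            pos = f.tell()
            line = f.readline()
            if not line:
                break
            # Mirrors the "[offset] <log line>" trace seen above.
            debug("_pattern_errorlog: [%d] %s" % (pos, line.rstrip()))
            if regex.search(line):
                matched = True
        end_offset = f.tell()
    debug("_pattern_errorlog: end at offset %d" % end_offset)
    return matched, end_offset

A caller would keep the returned offset between assertions, so each replication/schema-push cycle is scanned only once instead of re-reading the whole log.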
Passed suites/schema/schema_test.py::test_schema_comparewithfiles 0.19
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.schema.schema_test:schema_test.py:125 Running test_schema_comparewithfiles...
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /etc/dirsrv/slapd-standalone1/schema/99user.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10mep-plugin.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60autofs.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10automember-plugin.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/25java-object.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/00core.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/30ns-common.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60pam-plugin.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60posix-winsync-plugin.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/28pilot.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/02common.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-directory.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10dna-plugin.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-admin.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/20subscriber.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/10rfc2307compat.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60samba3.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60trust.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60mozilla.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60pureftpd.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/05rfc4524.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60nss-ldap.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-certificate.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60eduperson.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/05rfc2927.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60acctpolicy.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-web.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60sudo.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60rfc3712.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-mail.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/01core389.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/05rfc4523.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/50ns-value.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60sabayon.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/06inetorgperson.ldif as a schema file - skipping
WARNING  tests.suites.schema.schema_test:schema_test.py:137 Unable to parse /usr/share/dirsrv/schema/60rfc2739.ldif as a schema file - skipping
INFO  tests.suites.schema.schema_test:schema_test.py:166 test_schema_comparewithfiles: PASSED
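Note: the warnings above are the test skipping files it cannot parse; the comparison then runs against the schema the server actually loaded. A minimal python-ldap sketch of reading that live schema for such a comparison (the URI and bind credentials are illustrative assumptions, not values from this run):

import ldap

# Hypothetical connection details, for illustration only.
conn = ldap.initialize("ldap://localhost:389")
conn.simple_bind_s("cn=Directory Manager", "password")

# In 389-ds the live schema is published on the cn=schema subentry.
results = conn.search_s("cn=schema", ldap.SCOPE_BASE,
                        attrlist=["attributeTypes", "objectClasses"])
dn, attrs = results[0]
print(dn, len(attrs.get("attributeTypes", [])), "attribute types loaded")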
Passed suites/setup_ds/dscreate_test.py::test_setup_ds_minimal_dry 0.18
-------------------------------Captured log setup-------------------------------
DEBUG  lib389:dscreate_test.py:42 Instance allocated DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
-------------------------------Captured log call--------------------------------
DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone INFO  LogCapture.SetupDs:setup.py:670 NOOP: Dry run requested DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
Passed suites/setup_ds/dscreate_test.py::test_setup_ds_minimal 25.06
-------------------------------Captured log setup-------------------------------
DEBUG  lib389:dscreate_test.py:42 Instance allocated DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
-------------------------------Captured log call--------------------------------
DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None DEBUG  lib389:__init__.py:566 Allocate <class 'lib389.DirSrv'> with /var/run/slapd-standalone.socket DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with localhost:54321 DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with localhost:54321 DEBUG  lib389:nss_ssl.py:197 nss cmd: /usr/bin/certutil -N -d /etc/dirsrv/slapd-standalone -f /etc/dirsrv/slapd-standalone/pwdfile.txt DEBUG  lib389:nss_ssl.py:199 nss output: DEBUG  lib389.nss_ssl:nss_ssl.py:314 nss cmd: /usr/bin/certutil -L -n Self-Signed-CA -d /etc/dirsrv/ssca/ DEBUG  lib389:nss_ssl.py:559 CSR subject -> CN=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com,givenName=f014ac92-df6e-4bba-a53f-0d5fdd512162,O=testing,L=389ds,ST=Queensland,C=AU DEBUG  lib389:nss_ssl.py:560 CSR alt_names -> ['ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com'] DEBUG  lib389:nss_ssl.py:592 nss cmd: /usr/bin/certutil -R --keyUsage digitalSignature,nonRepudiation,keyEncipherment,dataEncipherment --nsCertType sslClient,sslServer --extKeyUsage clientAuth,serverAuth -s CN=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com,givenName=f014ac92-df6e-4bba-a53f-0d5fdd512162,O=testing,L=389ds,ST=Queensland,C=AU -8 ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com -g 4096 -d /etc/dirsrv/slapd-standalone -z /etc/dirsrv/slapd-standalone/noise.txt -f /etc/dirsrv/slapd-standalone/pwdfile.txt -a -o /etc/dirsrv/slapd-standalone/Server-Cert.csr DEBUG  lib389.nss_ssl:nss_ssl.py:618 nss cmd: /usr/bin/certutil -C -d /etc/dirsrv/ssca/ -f /etc/dirsrv/ssca//pwdfile.txt -v 24 -a -i /etc/dirsrv/slapd-standalone/Server-Cert.csr -o /etc/dirsrv/slapd-standalone/Server-Cert.crt -c Self-Signed-CA DEBUG  lib389:nss_ssl.py:242 nss cmd: /usr/bin/openssl rehash /etc/dirsrv/slapd-standalone DEBUG  lib389:nss_ssl.py:646 nss cmd: /usr/bin/certutil -A -n Self-Signed-CA -t CT,, -a -i /etc/dirsrv/slapd-standalone/ca.crt -d /etc/dirsrv/slapd-standalone -f /etc/dirsrv/slapd-standalone/pwdfile.txt DEBUG  lib389:nss_ssl.py:661 nss cmd: /usr/bin/certutil -A -n Server-Cert -t ,, -a -i /etc/dirsrv/slapd-standalone/Server-Cert.crt -d /etc/dirsrv/slapd-standalone -f /etc/dirsrv/slapd-standalone/pwdfile.txt DEBUG  lib389:nss_ssl.py:670 nss cmd: /usr/bin/certutil -V -d /etc/dirsrv/slapd-standalone -n Server-Cert -u YCV DEBUG  lib389.utils:utils.py:284 port 636 already in [389, 636, 3268, 3269, 7389], skipping port relabel DEBUG  lib389.utils:utils.py:315 CMD: semanage port -a -t ldap_port_t -p tcp 54321 ; STDOUT: ; STDERR: DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389:__init__.py:1077 systemd status -> True DEBUG  lib389:__init__.py:937 open(): Connecting to uri ldapi://%2Fvar%2Frun%2Fslapd-standalone.socket DEBUG  lib389:__init__.py:945 Using dirsrv ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:954 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:967 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:975 Using certificate policy 1 DEBUG  lib389:__init__.py:976 ldap.OPT_X_TLS_REQUIRE_CERT = 1 DEBUG  lib389:__init__.py:1009 open(): Using root autobind ... 
DEBUG  lib389:__init__.py:1030 open(): bound as cn=Directory Manager DEBUG  lib389:__init__.py:1692 Retrieving entry with [('',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: vendorVersion: 389-Directory/2.0.0.0 B2020.305.0012 ] DEBUG  lib389:__init__.py:937 open(): Connecting to uri ldapi://%2Fvar%2Frun%2Fslapd-standalone.socket DEBUG  lib389:__init__.py:945 Using dirsrv ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:954 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:967 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:975 Using certificate policy 1 DEBUG  lib389:__init__.py:976 ldap.OPT_X_TLS_REQUIRE_CERT = 1 DEBUG  lib389:__init__.py:1009 open(): Using root autobind ... DEBUG  lib389:__init__.py:1030 open(): bound as cn=Directory Manager DEBUG  lib389:__init__.py:1692 Retrieving entry with [('',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: vendorVersion: 389-Directory/2.0.0.0 B2020.305.0012 ] DEBUG  Config:_mapped_object.py:434 cn=config set REPLACE: ('nsslapd-secureport', '636') DEBUG  Config:_mapped_object.py:434 cn=config set REPLACE: ('nsslapd-security', 'on') DEBUG  Config:_mapped_object.py:434 cn=config set REPLACE: ('nsslapd-rootpw', '********') DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389:__init__.py:1146 systemd status -> True DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389:__init__.py:1077 systemd status -> True DEBUG  lib389:__init__.py:937 open(): Connecting to uri ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321/ DEBUG  lib389:__init__.py:945 Using dirsrv ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:954 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:967 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:975 Using certificate policy 1 DEBUG  lib389:__init__.py:976 ldap.OPT_X_TLS_REQUIRE_CERT = 1 DEBUG  lib389:__init__.py:1030 open(): bound as cn=Directory Manager DEBUG  lib389:__init__.py:1692 Retrieving entry with [('',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: vendorVersion: 389-Directory/2.0.0.0 B2020.305.0012 ] DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389:__init__.py:1146 systemd status -> True DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389:__init__.py:1077 systemd status -> True DEBUG  lib389:__init__.py:937 open(): Connecting to uri ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321/ DEBUG  lib389:__init__.py:945 Using dirsrv ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:954 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:967 Using external ca certificate /etc/dirsrv/slapd-standalone DEBUG  lib389:__init__.py:975 Using certificate policy 1 DEBUG  lib389:__init__.py:976 ldap.OPT_X_TLS_REQUIRE_CERT = 1 DEBUG  lib389:__init__.py:1030 open(): bound as cn=Directory Manager DEBUG  lib389:__init__.py:1692 Retrieving entry with [('',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: vendorVersion: 389-Directory/2.0.0.0 B2020.305.0012 ] DEBUG  lib389.remove_ds:remove.py:38 Removing instance standalone DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config nsslapd-bakdir: /var/lib/dirsrv/slapd-standalone/bak ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config',)] DEBUG  lib389:__init__.py:1702 
Retrieved entry [dn: cn=config nsslapd-certdir: /etc/dirsrv/slapd-standalone ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config,cn=ldbm database,cn=plugins,cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config,cn=ldbm database,cn=plugins,cn=config nsslapd-directory: /var/lib/dirsrv/slapd-standalone/db ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=bdb,cn=config,cn=ldbm database,cn=plugins,cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=bdb,cn=config,cn=ldbm database,cn=plugins,cn=config nsslapd-db-home-directory: /var/lib/dirsrv/slapd-standalone/db ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config,cn=ldbm database,cn=plugins,cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config,cn=ldbm database,cn=plugins,cn=config nsslapd-directory: /var/lib/dirsrv/slapd-standalone/db ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config,cn=ldbm database,cn=plugins,cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config,cn=ldbm database,cn=plugins,cn=config nsslapd-directory: /var/lib/dirsrv/slapd-standalone/db ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config nsslapd-ldifdir: /var/lib/dirsrv/slapd-standalone/ldif ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config nsslapd-lockdir: /var/lock/dirsrv/slapd-standalone ] DEBUG  lib389:__init__.py:1692 Retrieving entry with [('cn=config',)] DEBUG  lib389:__init__.py:1702 Retrieved entry [dn: cn=config nsslapd-instancedir: /usr/lib64/dirsrv/slapd-standalone ] DEBUG  lib389.remove_ds:remove.py:67 Checking for instance marker at /etc/dirsrv/slapd-standalone/dse.ldif DEBUG  lib389.remove_ds:remove.py:72 Found instance marker at /etc/dirsrv/slapd-standalone/dse.ldif! Proceeding to remove ... DEBUG  lib389.remove_ds:remove.py:76 Stopping instance standalone DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389:__init__.py:1146 systemd status -> True DEBUG  lib389.remove_ds:remove.py:79 Found instance marker at /etc/dirsrv/slapd-standalone/dse.ldif! Proceeding to remove ... 
DEBUG  lib389.remove_ds:remove.py:83 Stopping instance standalone DEBUG  lib389:__init__.py:1173 systemd status -> True DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lib/dirsrv/slapd-standalone/bak DEBUG  lib389.remove_ds:remove.py:92 Removing /etc/dirsrv/slapd-standalone DEBUG  lib389.remove_ds:remove.py:92 Removing /etc/dirsrv/slapd-standalone DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lib/dirsrv/slapd-standalone/db DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lib/dirsrv/slapd-standalone/db DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lib/dirsrv/slapd-standalone/db/../ DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lib/dirsrv/slapd-standalone/changelogdb DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lib/dirsrv/slapd-standalone/ldif DEBUG  lib389.remove_ds:remove.py:92 Removing /var/lock/dirsrv/slapd-standalone DEBUG  lib389.remove_ds:remove.py:92 Removing /var/log/dirsrv/slapd-standalone DEBUG  lib389.remove_ds:remove.py:92 Removing /usr/lib64/dirsrv/slapd-standalone DEBUG  lib389.remove_ds:remove.py:92 Removing /etc/sysconfig/dirsrv-standalone DEBUG  lib389.remove_ds:remove.py:101 Removing the systemd symlink DEBUG  lib389.remove_ds:remove.py:108 CMD: systemctl disable dirsrv@standalone ; STDOUT: ; STDERR: Removed /etc/systemd/system/multi-user.target.wants/dirsrv@standalone.service. DEBUG  lib389.remove_ds:remove.py:110 Removing /etc/tmpfiles.d/dirsrv-standalone.conf DEBUG  lib389.remove_ds:remove.py:119 Removing the port labels DEBUG  lib389.remove_ds:remove.py:149 Moving /etc/dirsrv/slapd-standalone to /etc/dirsrv/slapd-standalone.removed DEBUG  lib389.remove_ds:remove.py:159 Complete
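The dscreate tests above exercise the same allocate/create/remove cycle the captured log records. A rough sketch of driving that cycle from lib389 directly, under the assumption that the classic DirSrv allocate()/create()/open()/delete() API applies (argument values are illustrative):

from lib389 import DirSrv
from lib389.properties import (SER_HOST, SER_PORT, SER_SERVERID_PROP,
                               SER_CREATION_SUFFIX)

# Sketch only: mirrors the "Allocate <class 'lib389.DirSrv'>" lines above.
instance = DirSrv(verbose=True)
instance.allocate({
    SER_HOST: "localhost",
    SER_PORT: 54321,                         # port used by this test run
    SER_SERVERID_PROP: "standalone",
    SER_CREATION_SUFFIX: "dc=example,dc=com",
})
instance.create()    # performs the installation
instance.open()      # binds over LDAPI, as the log shows
instance.delete()    # tears the instance back down (remove.py steps above)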
Passed suites/setup_ds/dscreate_test.py::test_setup_ds_inf_minimal 0.00
-------------------------------Captured log setup-------------------------------
DEBUG  lib389:dscreate_test.py:42 Instance allocated DEBUG  lib389:__init__.py:554 Allocate <class 'lib389.DirSrv'> with None DEBUG  lib389:__init__.py:577 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:602 Allocate <class 'lib389.DirSrv'> with ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:54321 DEBUG  lib389:__init__.py:742 list instance not found in /etc/dirsrv/slapd-standalone/dse.ldif: standalone
Passed suites/setup_ds/remove_test.py::test_basic[True] 4.11
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/setup_ds/remove_test.py::test_basic[False] 4.68
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr1-ldap.MOD_ADD-exp_values0-vucsn] 0.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b28bd6e1-a117-4f38-8e46-9ce3765df185 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 98ab4f93-332a-46ce-90d6-96d08bc7855c / got description=b28bd6e1-a117-4f38-8e46-9ce3765df185) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:99 Add user: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:107 Check if list of description attrs present for: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:110 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: state1test
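The mmt_state checks above read the per-value CSNs (vucsn, adcsn, vdcsn) the server keeps for replicated entries. A small python-ldap sketch of pulling that state, assuming it is exposed through the operational attribute nscpentrywsi as in these suites (DN, URI and credentials are hypothetical):

import ldap

conn = ldap.initialize("ldap://localhost:39001")  # hypothetical master URI
conn.simple_bind_s("cn=Directory Manager", "password")

# nscpentrywsi is operational, so it must be requested explicitly.
dn = "uid=state1test,ou=People,dc=example,dc=com"  # hypothetical entry DN
res = conn.search_s(dn, ldap.SCOPE_BASE, attrlist=["nscpentrywsi"])
state = res[0][1].get("nscpentrywsi", [])  # key casing assumed
csns = [v for v in state
        if b"vucsn" in v or b"adcsn" in v or b"vdcsn" in v]
print(csns)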
Passed suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr2-ldap.MOD_ADD-exp_values1-vucsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:99 Add user: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:107 Check if list of description attrs present for: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:110 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: state1test
Passed suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr3-ldap.MOD_ADD-exp_values2-vucsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:99 Add user: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:107 Check if list of description attrs present for: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:110 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: state1test
Passed suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr4-ldap.MOD_REPLACE-exp_values3-adcsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:99 Add user: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:107 Check if list of description attrs present for: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:110 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: state1test
Passed suites/state/mmt_state_test.py::test_check_desc_attr_state[description-Test1usr4-ldap.MOD_DELETE-exp_values4-vdcsn] 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:99 Add user: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:107 Check if list of description attrs present for: state1test INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:110 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: state1test
Passed suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestCN1-ldap.MOD_ADD-exp_values0-vucsn] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:153 Add user: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:166 Check if list of cn attrs present for: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:168 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: TestCNusr1
Passed suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestCN2-ldap.MOD_ADD-exp_values1-vucsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:153 Add user: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:166 Check if list of cn attrs present for: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:168 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: TestCNusr1
Passed suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestnewCN3-ldap.MOD_REPLACE-exp_values2-adcsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:153 Add user: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:166 Check if list of cn attrs present for: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:168 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: TestCNusr1
Passed suites/state/mmt_state_test.py::test_check_cn_attr_state[cn-TestnewCN3-ldap.MOD_DELETE-None-None] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:153 Add user: TestCNusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:168 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: TestCNusr1
Passed suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-Chinese-ldap.MOD_REPLACE-exp_values0-vucsn] 0.07
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:206 Add user: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:219 Check if list of cn attrs present for: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:221 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: Langusr1
Passed suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-French-ldap.MOD_ADD-None-None] 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:206 Add user: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:221 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: Langusr1
Passed suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-German-ldap.MOD_REPLACE-exp_values2-adcsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:206 Add user: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:219 Check if list of cn attrs present for: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:221 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: Langusr1
Passed suites/state/mmt_state_test.py::test_check_single_value_attr_state[preferredlanguage-German-ldap.MOD_DELETE-exp_values3-vdcsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:206 Add user: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:219 Check if list of cn attrs present for: Langusr1 INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:221 Checking for operational attributes INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: Langusr1
Passed suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower1-ldap.MOD_ADD-exp_values0-vucsn] 0.06
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:276 Add user: roomoffice1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:285 Check if list of roomnumber;office attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:287 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: roomoffice1usr
Passed suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower2-ldap.MOD_ADD-exp_values1-vucsn] 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:276 Add user: roomoffice1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:285 Check if list of roomnumber;office attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:287 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: roomoffice1usr
Passed suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower3-ldap.MOD_ADD-exp_values2-vucsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:276 Add user: roomoffice1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:285 Check if list of roomnumber;office attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:287 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: roomoffice1usr
Passed suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower4-ldap.MOD_REPLACE-exp_values3-adcsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:276 Add user: roomoffice1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:285 Check if list of roomnumber;office attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:287 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: roomoffice1usr
Passed suites/state/mmt_state_test.py::test_check_subtype_attr_state[roomnumber;office-Tower4-ldap.MOD_DELETE-exp_values4-vucsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:276 Add user: roomoffice1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:285 Check if list of roomnumber;office attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:287 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: roomoffice1usr
Passed suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef1-ldap.MOD_ADD-exp_values0-vucsn] 0.05
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:342 Add user: testJpeg1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:351 Check if list of jpeg attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:353 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: testJpeg1usr
Passed suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef2-ldap.MOD_ADD-exp_values1-vucsn] 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:342 Add user: testJpeg1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:351 Check if list of jpeg attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:353 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: testJpeg1usr
Passed suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef3-ldap.MOD_ADD-exp_values2-vucsn] 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:342 Add user: testJpeg1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:351 Check if list of jpeg attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:353 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: testJpeg1usr
Passed suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef2-ldap.MOD_REPLACE-exp_values3-adcsn] 0.03
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:342 Add user: testJpeg1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:351 Check if list of jpeg attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:353 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: testJpeg1usr
Passed suites/state/mmt_state_test.py::test_check_jpeg_attr_state[jpegphoto-thedeadbeef2-ldap.MOD_DELETE-exp_values4-vdcsn] 0.04
-------------------------------Captured log call--------------------------------
INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:342 Add user: testJpeg1usr INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:351 Check if list of jpeg attributes are present for a given entry INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:353 Checking if operational attributes are present for cn INFO  tests.suites.state.mmt_state_test:mmt_state_test.py:36 Checking if operational attrs vucsn, adcsn and vdcsn present for: testJpeg1usr
Passed suites/syncrepl_plugin/basic_test.py::test_syncrepl_basic 5.52
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/syncrepl_plugin/basic_test.py::test_sync_repl_mep 26.13
------------------------------Captured stdout call------------------------------
syncrepl_poll: LDAP error (%s) {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}
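The literal "(%s)" in the captured stdout above suggests the test printed its error message without interpolating the placeholder. A hypothetical reconstruction of the difference:

# Hypothetical reconstruction: print() was likely given the error dict as a
# second argument instead of %-formatting it, so "(%s)" appears literally.
err = {'result': -1, 'desc': "Can't contact LDAP server", 'ctrls': []}
print("syncrepl_poll: LDAP error (%s)", err)   # reproduces the output above
print("syncrepl_poll: LDAP error (%s)" % err)  # interpolated form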
Passed suites/syncrepl_plugin/openldap_test.py::test_syncrepl_openldap 5.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/syntax/acceptance_test.py::test_valid 5.72
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:54 Clean the error log INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:58 Attempting to add task entry... INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:65 Found 0 invalid entries - Success
Passed suites/syntax/acceptance_test.py::test_invalid_uidnumber 6.58
-------------------------------Captured log call--------------------------------
INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:91 Clean the error log INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:98 Attempting to add task entry... INFO  lib389.topologies.tests.suites.syntax.acceptance_test:acceptance_test.py:105 Found an invalid entry with wrong uidNumber - Success
Passed suites/syntax/mr_test.py::test_sss_mr 6.41
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.syntax.mr_test:mr_test.py:42 Creating LDIF... INFO  tests.suites.syntax.mr_test:mr_test.py:47 Importing LDIF... INFO  tests.suites.syntax.mr_test:mr_test.py:52 Search using server side sorting using undefined mr in the attr... INFO  tests.suites.syntax.mr_test:mr_test.py:62 Test PASSED
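The mr_test search above requests server-side sorting with a matching rule undefined in the attribute's schema. A minimal python-ldap sketch of issuing such a sorted search (URI, credentials and the ordering rule string are illustrative assumptions):

import ldap
from ldap.controls.sss import SSSRequestControl

conn = ldap.initialize("ldap://localhost:38901")  # hypothetical URI
conn.simple_bind_s("cn=Directory Manager", "password")

# Ask the server to sort on cn with an explicit ordering rule OID.
sss = SSSRequestControl(criticality=True, ordering_rules=["cn:2.5.13.3"])
msgid = conn.search_ext("dc=example,dc=com", ldap.SCOPE_SUBTREE,
                        "(objectClass=*)", serverctrls=[sss])
rtype, rdata, rmsgid, ctrls = conn.result3(msgid)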
Passed suites/tls/cipher_test.py::test_long_cipher_list 17.80
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/tls/ssl_version_test.py::test_ssl_version_range 66.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.tls.ssl_version_test:ssl_version_test.py:36 default min: TLS1.2 max: TLS1.3
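The reported range is read from the server's encryption settings. A sketch of fetching the same values with lib389, assuming the Encryption config object over cn=encryption,cn=config:

from lib389.config import Encryption

def show_tls_range(inst):
    """Print the TLS floor/ceiling for an already-opened DirSrv handle."""
    enc = Encryption(inst)
    print(enc.get_attr_val_utf8('sslVersionMin'),   # TLS1.2 in this run
          enc.get_attr_val_utf8('sslVersionMax'))   # TLS1.3 in this run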
Passed suites/tls/tls_cert_namespace_test.py::test_pem_cert_in_private_namespace 8.69
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:54 Enable TLS INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:57 Checking PrivateTmp value INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:59 Command used : systemctl show -p PrivateTmp dirsrv@standalone1.service INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:63 Check files in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:68 Check that Self-Signed-CA.pem is present in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:68 Check that Server-Cert-Key.pem is present in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:68 Check that Server-Cert.pem is present in private /tmp INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:71 Check instance cert directory INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:75 Check that Self-Signed-CA.pem is not present in /etc/dirsrv/slapd-standalone1/ directory INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:75 Check that Server-Cert-Key.pem is not present in /etc/dirsrv/slapd-standalone1/ directory INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:75 Check that Server-Cert.pem is not present in /etc/dirsrv/slapd-standalone1/ directory
Passed suites/tls/tls_cert_namespace_test.py::test_cert_category_authority 12.26
-------------------------------Captured log call--------------------------------
INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:106 Enable TLS INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:109 Get certificate path INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:115 Check that Self-Signed-CA.pem is present INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:119 Trust the certificate INFO  tests.suites.tls.tls_cert_namespace_test:tls_cert_namespace_test.py:122 Search if our certificate has category: authority
Passed suites/tls/tls_check_crl_test.py::test_tls_check_crl 11.48
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed suites/tls/tls_ldaps_only_test.py::test_tls_ldaps_only 23.92
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47560_test.py::test_ticket47560 17.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:877 fixupMemberOf task fixupmemberof_10302020_234308 for basedn dc=example,dc=com completed successfully
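The fixup task logged above can be driven with the same task pattern lib389 uses elsewhere in this run (compare the CleanAllRUVTask create/wait usage in the cleanallruv teardown). MemberOfFixupTask and its 'basedn' property are assumptions based on that pattern:

from lib389._constants import DEFAULT_SUFFIX
from lib389.tasks import MemberOfFixupTask

def run_memberof_fixup(inst):
    # Create the fixup task entry and block until the server finishes it.
    task = MemberOfFixupTask(inst).create(properties={
        'basedn': DEFAULT_SUFFIX,
    })
    task.wait()
    return task.get_exit_code()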
Passed tickets/ticket47573_test.py::test_ticket47573_init 0.03
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect b3ea7a7e-6d8a-4c6b-b35f-840cefa829c2 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:123 test_ticket47573_init topology_m1c1 <lib389.topologies.TopologyMain object at 0x7f61c3103d60> (master <lib389.DirSrv object at 0x7f61c314e970>, consumer <lib389.DirSrv object at 0x7f61c314eee0>)
Passed tickets/ticket47573_test.py::test_ticket47573_one 1.28
-------------------------------Captured log call--------------------------------
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:147 test_ticket47573_one topology_m1c1 <lib389.topologies.TopologyMain object at 0x7f61c3103d60> (master <lib389.DirSrv object at 0x7f61c314e970>, consumer <lib389.DirSrv object at 0x7f61c314eee0>)
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:113 trigger_schema_push: receive 0 (expected 1)
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:161 test_ticket47573_one master_schema_csn=b'5f9cdd75000000000000'
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:162 test_ticket47573_one consumer_schema_csn=b'5f9cdd75000000000000'
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:47 _pattern_errorlog: start at offset 0
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [38] 389-Directory/2.0.0.0 B2020.305.0012
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [95] localhost.localdomain:39001 (/etc/dirsrv/slapd-master1)
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [96]
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [200] [30/Oct/2020:23:43:24.267057146 -0400] - INFO - main - 389-Directory/2.0.0.0 B2020.305.0012 starting up
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [308] [30/Oct/2020:23:43:24.272595234 -0400] - INFO - main - Setting the maximum file descriptor limit to: 524288
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [416] [30/Oct/2020:23:43:25.157799311 -0400] - INFO - PBKDF2_SHA256 - Based on CPU performance, chose 2048 rounds
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [530] [30/Oct/2020:23:43:25.175143565 -0400] - INFO - bdb_config_upgrade_dse_info - create config entry from old config
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [632] [30/Oct/2020:23:43:25.188811544 -0400] - NOTICE - bdb_start_autotune - found 7980868k physical memory
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [728] [30/Oct/2020:23:43:25.194673254 -0400] - NOTICE - bdb_start_autotune - found 7309916k available
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [835] [30/Oct/2020:23:43:25.198288207 -0400] - NOTICE - bdb_start_autotune - cache autosizing: db cache: 498804k
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [938] [30/Oct/2020:23:43:25.203322525 -0400] - NOTICE - bdb_start_autotune - total cache size: 408620441 B;
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1074] [30/Oct/2020:23:43:25.480047322 -0400] - INFO - slapd_daemon - slapd started. Listening on All Interfaces port 39001 for LDAP requests
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1199] [30/Oct/2020:23:43:25.495128909 -0400] - INFO - slapd_daemon - Listening on /var/run/slapd-master1.socket for LDAPI requests
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1358] [30/Oct/2020:23:43:25.536226945 -0400] - INFO - postop_modify_config_dse - The change of nsslapd-securePort will not take effect until the server is restarted
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1475] [30/Oct/2020:23:43:25.548263957 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1653] [30/Oct/2020:23:43:26.342857354 -0400] - INFO - op_thread_cleanup - slapd shutting down - signaling operation threads - op stack size 1 max work q size 1 max work q stack size 1
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1783] [30/Oct/2020:23:43:26.352343067 -0400] - INFO - slapd_daemon - slapd shutting down - closing down internal subsystems and plugins
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1886] [30/Oct/2020:23:43:26.363913698 -0400] - INFO - bdb_pre_close - Waiting for 4 database threads to stop
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [1983] [30/Oct/2020:23:43:27.928756325 -0400] - INFO - bdb_pre_close - All database threads now stopped
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2094] [30/Oct/2020:23:43:27.963060991 -0400] - INFO - ldbm_back_instance_set_destructor - Set of instances destroyed
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2255] [30/Oct/2020:23:43:27.970652171 -0400] - INFO - connection_post_shutdown_cleanup - slapd shutting down - freed 1 work q stack objects - freed 1 op stack objects
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2325] [30/Oct/2020:23:43:27.977259073 -0400] - INFO - main - slapd stopped.
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2429] [30/Oct/2020:23:43:29.392404708 -0400] - INFO - main - 389-Directory/2.0.0.0 B2020.305.0012 starting up
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2537] [30/Oct/2020:23:43:29.402197517 -0400] - INFO - main - Setting the maximum file descriptor limit to: 524288
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2645] [30/Oct/2020:23:43:30.389657141 -0400] - INFO - PBKDF2_SHA256 - Based on CPU performance, chose 2048 rounds
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2762] [30/Oct/2020:23:43:30.398097882 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2864] [30/Oct/2020:23:43:30.413827421 -0400] - NOTICE - bdb_start_autotune - found 7980868k physical memory
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [2960] [30/Oct/2020:23:43:30.417552839 -0400] - NOTICE - bdb_start_autotune - found 7310892k available
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3067] [30/Oct/2020:23:43:30.423570680 -0400] - NOTICE - bdb_start_autotune - cache autosizing: db cache: 498804k
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3197] [30/Oct/2020:23:43:30.427158776 -0400] - NOTICE - bdb_start_autotune - cache autosizing: userRoot entry cache (1 total): 1376256k
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3323] [30/Oct/2020:23:43:30.432265928 -0400] - NOTICE - bdb_start_autotune - cache autosizing: userRoot dn cache (1 total): 196608k
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3427] [30/Oct/2020:23:43:30.440381130 -0400] - NOTICE - bdb_start_autotune - total cache size: 1834683801 B;
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3563] [30/Oct/2020:23:43:30.556743050 -0400] - INFO - slapd_daemon - slapd started. Listening on All Interfaces port 39001 for LDAP requests
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3688] [30/Oct/2020:23:43:30.565087743 -0400] - INFO - slapd_daemon - Listening on /var/run/slapd-master1.socket for LDAPI requests
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [3876] [30/Oct/2020:23:43:41.125668085 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding the replication changelog RUV, this may take several minutes...
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4056] [30/Oct/2020:23:43:41.130155324 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding replication changelog RUV complete. Result 0 (Success)
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4244] [30/Oct/2020:23:43:41.135770660 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding the replication changelog RUV, this may take several minutes...
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4424] [30/Oct/2020:23:43:41.142968413 -0400] - NOTICE - NSMMReplicationPlugin - changelog program - _cl5ConstructRUV - Rebuilding replication changelog RUV complete. Result 0 (Success)
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4595] [30/Oct/2020:23:43:42.352346749 -0400] - INFO - NSMMReplicationPlugin - repl5_tot_run - Beginning total update of replica "agmt="cn=temp_201" (ci-vm-10-0-139-146:39201)".
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4779] [30/Oct/2020:23:43:42.359087297 -0400] - NOTICE - NSMMReplicationPlugin - replica_subentry_check - Need to create replication keep alive entry <cn=repl keep alive 1,dc=example,dc=com>
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4924] [30/Oct/2020:23:43:42.364919784 -0400] - INFO - NSMMReplicationPlugin - replica_subentry_create - add dn: cn=repl keep alive 1,dc=example,dc=com
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4941] objectclass: top
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4967] objectclass: ldapsubentry
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [4997] objectclass: extensibleObject
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [5019] cn: repl keep alive 1
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [5206] [30/Oct/2020:23:43:45.010353357 -0400] - INFO - NSMMReplicationPlugin - repl5_tot_run - Finished total update of replica "agmt="cn=temp_201" (ci-vm-10-0-139-146:39201)". Sent 16 entries.
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [5206]
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:59 _pattern_errorlog: end at offset 5206
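The master/consumer schema CSNs compared above are published as nsSchemaCSN on cn=schema, so convergence can be checked with a plain read. A sketch, with hypothetical URIs:

import ldap

def get_schema_csn(uri):
    """Fetch nsSchemaCSN from cn=schema (anonymous read; URI hypothetical)."""
    conn = ldap.initialize(uri)
    res = conn.search_s("cn=schema", ldap.SCOPE_BASE, attrlist=["nsSchemaCSN"])
    return res[0][1]["nsSchemaCSN"][0]

# Schema replication has converged once both sides report the same CSN.
assert get_schema_csn("ldap://localhost:39001") == get_schema_csn("ldap://localhost:39201")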
Passed tickets/ticket47573_test.py::test_ticket47573_two 1.33
-------------------------------Captured log call--------------------------------
DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:113 trigger_schema_push: receive b'1' (expected 2) DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:192 test_ticket47573_two master_schema_csn=b'5f9cdd77000000000000' DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:193 test_ticket47573_two consumer_schema_csn=b'5f9cdd77000000000000' DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:47 _pattern_errorlog: start at offset 5207 DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:54 _pattern_errorlog: [5207] DEBUG  tests.tickets.ticket47573_test:ticket47573_test.py:59 _pattern_errorlog: end at offset 5207
Passed tickets/ticket47573_test.py::test_ticket47573_three 1.01
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47573_test:ticket47573_test.py:228 Testcase PASSED
Passed tickets/ticket47619_test.py::test_ticket47619_init 6.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect cd01e91d-e1b2-4128-b2fd-4a49f58fe6ec / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47619_test.py:48 test_ticket47619_init topology_m1c1 <lib389.topologies.TopologyMain object at 0x7f61c31619d0> INFO  lib389:ticket47619_test.py:61 test_ticket47619_init: 100 entries ADDed other_entry[0..99]
Passed tickets/ticket47619_test.py::test_ticket47619_create_index 3.66
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog INFO  lib389:backend.py:80 List backend with suffix=cn=changelog
Passed tickets/ticket47619_test.py::test_ticket47619_reindex 16.49
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234438 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234440 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234443 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234445 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234447 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234449 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234451 completed successfully INFO  lib389:tasks.py:798 Index task index_attrs_10302020_234453 completed successfully
Passed tickets/ticket47619_test.py::test_ticket47619_check_indexed_search 0.21
No log output captured.
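Note: the index tasks logged above are ordinary reindex tasks under cn=tasks,cn=config. A hedged sketch of how such a task is started through lib389's task helper, assuming inst is an already-connected DirSrv instance:

from lib389.properties import TASK_WAIT

# Reindex one attribute of the suffix and block until the task reports
# completion (this is what produces the "completed successfully" lines).
inst.tasks.reindex(suffix='dc=example,dc=com',
                   attrname='objectclass',
                   args={TASK_WAIT: True})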
Passed tickets/ticket47640_test.py::test_ticket47640 0.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47640_test:ticket47640_test.py:65 Add operation correctly rejected. INFO  tests.tickets.ticket47640_test:ticket47640_test.py:75 Test complete
Passed tickets/ticket47653MMR_test.py::test_ticket47653_init 0.24
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 73bc73f0-e527-4aa2-8ae1-268f27098901 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d6894e2e-209a-4d9f-8ca1-0752d942a591 / got description=73bc73f0-e527-4aa2-8ae1-268f27098901) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47653MMR_test.py:72 Add OCticket47653 that allows 'member' attribute INFO  lib389:ticket47653MMR_test.py:77 Add cn=bind_entry, dc=example,dc=com
Passed tickets/ticket47653MMR_test.py::test_ticket47653_add 5.30
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47653MMR_test.py:114 ######################### ADD ###################### INFO  lib389:ticket47653MMR_test.py:117 Bind as cn=bind_entry, dc=example,dc=com INFO  lib389:ticket47653MMR_test.py:147 Try to add cn=test_entry, dc=example,dc=com (aci is missing): dn: cn=test_entry, dc=example,dc=com cn: test_entry member: cn=bind_entry, dc=example,dc=com objectclass: top objectclass: person objectclass: OCticket47653 postalAddress: here postalCode: 1234 sn: test_entry INFO  lib389:ticket47653MMR_test.py:151 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:ticket47653MMR_test.py:155 Bind as cn=Directory Manager and add the ADD SELFDN aci INFO  lib389:ticket47653MMR_test.py:168 Bind as cn=bind_entry, dc=example,dc=com INFO  lib389:ticket47653MMR_test.py:173 Try to add cn=test_entry, dc=example,dc=com (member is missing) INFO  lib389:ticket47653MMR_test.py:181 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:ticket47653MMR_test.py:188 Try to add cn=test_entry, dc=example,dc=com (with several member values) INFO  lib389:ticket47653MMR_test.py:191 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:ticket47653MMR_test.py:195 Try to add cn=test_entry, dc=example,dc=com (should be successful) INFO  lib389:ticket47653MMR_test.py:206 Try to retrieve cn=test_entry, dc=example,dc=com from Master2 INFO  lib389:ticket47653MMR_test.py:218 Update cn=test_entry, dc=example,dc=com on M2
Passed tickets/ticket47653MMR_test.py::test_ticket47653_modify 4.24
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47653MMR_test.py:248 Bind as cn=bind_entry, dc=example,dc=com INFO  lib389:ticket47653MMR_test.py:251 ######################### MODIFY ###################### INFO  lib389:ticket47653MMR_test.py:255 Try to modify cn=test_entry, dc=example,dc=com (aci is missing) INFO  lib389:ticket47653MMR_test.py:259 Exception (expected): INSUFFICIENT_ACCESS INFO  lib389:ticket47653MMR_test.py:263 Bind as cn=Directory Manager and add the WRITE SELFDN aci INFO  lib389:ticket47653MMR_test.py:277 M1: Bind as cn=bind_entry, dc=example,dc=com INFO  lib389:ticket47653MMR_test.py:282 M1: Try to modify cn=test_entry, dc=example,dc=com. It should succeed INFO  lib389:ticket47653MMR_test.py:286 M1: Bind as cn=Directory Manager INFO  lib389:ticket47653MMR_test.py:289 M1: Check the update of cn=test_entry, dc=example,dc=com INFO  lib389:ticket47653MMR_test.py:295 M2: Bind as cn=Directory Manager INFO  lib389:ticket47653MMR_test.py:297 M2: Try to retrieve cn=test_entry, dc=example,dc=com INFO  lib389:ticket47653MMR_test.py:311 M2: Update cn=test_entry, dc=example,dc=com (bound as cn=bind_entry, dc=example,dc=com) INFO  lib389:ticket47653MMR_test.py:329 M1: Bind as cn=Directory Manager INFO  lib389:ticket47653MMR_test.py:331 M1: Check cn=test_entry, dc=example,dc=com (postalCode=1929)
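Note: both ticket47653 cases revolve around SELFDN ACIs, where the bound entry may add or write an entry only if it is named in that entry's member attribute. A hedged sketch of the ADD SELFDN aci the log mentions; the aci text follows the 389-ds userattr "...#selfdn" syntax, and the connection details are placeholders:

import ldap

SUFFIX = 'dc=example,dc=com'
ADD_SELFDN_ACI = (
    '(target = "ldap:///cn=*,%s")(targetattr = "*")'
    '(version 3.0; acl "SELFDN add"; allow (add) '
    'userattr = "member#selfdn";)' % SUFFIX)

conn = ldap.initialize('ldap://localhost:39001')  # placeholder
conn.simple_bind_s('cn=Directory Manager', 'password')
# Attach the aci to the suffix entry; with it in place an ADD succeeds
# only when the new entry's single member value is the bind DN itself.
conn.modify_s(SUFFIX, [(ldap.MOD_ADD, 'aci', [ADD_SELFDN_ACI.encode()])])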
Passed tickets/ticket47676_test.py::test_ticket47676_init 0.95
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 85d6e9a5-5359-4e14-a891-102f302022de / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 6fe02494-d667-4000-b091-06384a3f6c6b / got description=85d6e9a5-5359-4e14-a891-102f302022de) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47676_test.py:88 Add OCticket47676 that allows 'member' attribute INFO  lib389:ticket47676_test.py:93 Add cn=bind_entry, dc=example,dc=com
Passed tickets/ticket47676_test.py::test_ticket47676_skip_oc_at 4.62
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47676_test.py:120 ######################### ADD ###################### INFO  lib389:ticket47676_test.py:123 Bind as cn=Directory Manager and add the entry with specific oc INFO  lib389:ticket47676_test.py:140 Try to add cn=test_entry, dc=example,dc=com (should be successful) INFO  lib389:ticket47676_test.py:147 Try to retrieve cn=test_entry, dc=example,dc=com from Master2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 546a28fa-9d4c-4dff-ac61-8039d62ad28f / got description=6fe02494-d667-4000-b091-06384a3f6c6b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389:ticket47676_test.py:152 Update cn=test_entry, dc=example,dc=com on M2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b69460dc-71c7-4f59-bf0a-44ffb643c215 / got description=546a28fa-9d4c-4dff-ac61-8039d62ad28f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed tickets/ticket47676_test.py::test_ticket47676_reject_action 13.27
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47676_test.py:163 ######################### REJECT ACTION ###################### INFO  lib389:ticket47676_test.py:177 Add OC2ticket47676 on M1 INFO  lib389:ticket47676_test.py:182 Check OC2ticket47676 is in M1 INFO  lib389:ticket47676_test.py:193 Update cn=test_entry, dc=example,dc=com on M1 INFO  lib389:ticket47676_test.py:198 Check updated cn=test_entry, dc=example,dc=com on M2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e3efa250-dd5b-4e3e-81a8-49ff8e55f0ec / got description=b69460dc-71c7-4f59-bf0a-44ffb643c215) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389:ticket47676_test.py:205 Check OC2ticket47676 is not in M2 INFO  lib389:ticket47676_test.py:215 ######################### NO MORE REJECT ACTION ###################### INFO  lib389:ticket47676_test.py:226 Update cn=test_entry, dc=example,dc=com on M1 INFO  lib389:ticket47676_test.py:231 Check updated cn=test_entry, dc=example,dc=com on M2 INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c6337234-a3b6-469b-b638-52555125d600 / got description=e3efa250-dd5b-4e3e-81a8-49ff8e55f0ec) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389:ticket47676_test.py:237 Check OC2ticket47676 is in M2
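Note: the reject-action check above boils down to asking each master whether a given objectclass made it into its schema. A small helper for that check, sketched with python-ldap and placeholder credentials:

import ldap

def schema_has_objectclass(uri, name,
                           binddn='cn=Directory Manager',
                           password='password'):
    # Scan the objectClasses values of cn=schema for NAME 'name'.
    conn = ldap.initialize(uri)
    conn.simple_bind_s(binddn, password)
    entry = conn.search_s('cn=schema', ldap.SCOPE_BASE,
                          '(objectClass=*)', ['objectClasses'])[0][1]
    conn.unbind_s()
    needle = ("'%s'" % name).encode()
    return any(needle in oc for oc in entry['objectClasses'])

# Expected state after the reject action: present on M1, absent on M2, e.g.
# schema_has_objectclass('ldap://localhost:39001', 'OC2ticket47676')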
Passed tickets/ticket47714_test.py::test_ticket47714_init 0.19
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47714_test.py:38 ############################################### INFO  lib389:ticket47714_test.py:39 ####### INFO  lib389:ticket47714_test.py:40 ####### Testing Ticket 47714 - [RFE] Update lastLoginTime also in Account Policy plugin if account lockout is based on passwordExpirationTime. INFO  lib389:ticket47714_test.py:41 ####### INFO  lib389:ticket47714_test.py:42 ############################################### INFO  lib389.utils:ticket47714_test.py:55 ######################### Adding Account Policy entry: cn=Account Inactivation Policy,dc=example,dc=com ###################### INFO  lib389.utils:ticket47714_test.py:60 ######################### Adding Test User entry: uid=ticket47714user,dc=example,dc=com ######################
Passed tickets/ticket47714_test.py::test_ticket47714_run_0 10.48
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47714_test.py:38 ############################################### INFO  lib389:ticket47714_test.py:39 ####### INFO  lib389:ticket47714_test.py:40 ####### Account Policy - No new attr alwaysRecordLoginAttr in config INFO  lib389:ticket47714_test.py:41 ####### INFO  lib389:ticket47714_test.py:42 ############################################### INFO  lib389.utils:ticket47714_test.py:96 ######################### Bind as uid=ticket47714user,dc=example,dc=com ###################### INFO  lib389.utils:ticket47714_test.py:109 ######################### Bind as uid=ticket47714user,dc=example,dc=com again ###################### INFO  lib389.utils:ticket47714_test.py:122 First lastLoginTime: b'20201031034708Z', Second lastLoginTime: b'20201031034710Z' INFO  lib389.utils:ticket47714_test.py:133 ######################### cn=config,cn=Account Policy Plugin,cn=plugins,cn=config ###################### INFO  lib389.utils:ticket47714_test.py:134 accountInactivityLimit: b'1' INFO  lib389.utils:ticket47714_test.py:135 ######################### cn=config,cn=Account Policy Plugin,cn=plugins,cn=config DONE ###################### INFO  lib389.utils:ticket47714_test.py:137 ######################### Bind as uid=ticket47714user,dc=example,dc=com again to fail ###################### INFO  lib389.utils:ticket47714_test.py:141 CONSTRAINT VIOLATION Constraint violation INFO  lib389.utils:ticket47714_test.py:142 uid=ticket47714user,dc=example,dc=com was successfully inactivated.
Passed tickets/ticket47714_test.py::test_ticket47714_run_1 6.12
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47714_test.py:38 ############################################### INFO  lib389:ticket47714_test.py:39 ####### INFO  lib389:ticket47714_test.py:40 ####### Account Policy - With new attr alwaysRecordLoginAttr in config INFO  lib389:ticket47714_test.py:41 ####### INFO  lib389:ticket47714_test.py:42 ############################################### INFO  lib389.utils:ticket47714_test.py:179 ######################### Bind as uid=ticket47714user,dc=example,dc=com ###################### INFO  lib389.utils:ticket47714_test.py:191 ######################### Bind as uid=ticket47714user,dc=example,dc=com again ###################### INFO  lib389.utils:ticket47714_test.py:203 First lastLoginTime: b'20201031034718Z', Second lastLoginTime: b'20201031034719Z' INFO  lib389:ticket47714_test.py:206 ticket47714 was successfully verified.
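Note: both runs drive the Account Policy plugin configuration entry named in the log. A hedged sketch of that configuration; the attribute names follow the plugin's documented config schema, the values are illustrative, and run_1 is the variant that sets the new alwaysRecordLoginAttr:

import ldap

CONFIG_DN = 'cn=config,cn=Account Policy Plugin,cn=plugins,cn=config'

conn = ldap.initialize('ldap://localhost:38901')  # placeholder
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s(CONFIG_DN, [
    (ldap.MOD_REPLACE, 'alwaysRecordLogin', [b'yes']),
    (ldap.MOD_REPLACE, 'stateAttrName', [b'lastLoginTime']),
    (ldap.MOD_REPLACE, 'altStateAttrName', [b'passwordExpirationTime']),
    (ldap.MOD_REPLACE, 'specAttrName', [b'acctPolicySubentry']),
    (ldap.MOD_REPLACE, 'limitAttrName', [b'accountInactivityLimit']),
    # run_1 only: record the login time in this attribute on every bind.
    (ldap.MOD_REPLACE, 'alwaysRecordLoginAttr', [b'lastLoginTime']),
])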
Passed tickets/ticket47721_test.py::test_ticket47721_init 1.31
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect fce1e698-2e9e-4c83-a451-e22ae844d472 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 568c1549-45b1-4dd4-b804-058b4056ec0c / got description=fce1e698-2e9e-4c83-a451-e22ae844d472) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47721_test.py:98 Add cn=bind_entry, dc=example,dc=com
Passed tickets/ticket47721_test.py::test_ticket47721_0 2.52
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f72f5458-a4c6-42a7-b4a4-47648da9812f / got description=568c1549-45b1-4dd4-b804-058b4056ec0c) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect f72f5458-a4c6-42a7-b4a4-47648da9812f / got description=568c1549-45b1-4dd4-b804-058b4056ec0c) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
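Note: the Retry/SUCCESS pairs above come from lib389's replication probe, which writes a fresh UUID into a group's description on one side and polls for it on the other. Reproducing the check, assuming m1 and m2 are connected DirSrv instances:

from lib389.replica import ReplicationManager

repl = ReplicationManager('dc=example,dc=com')
# Writes a UUID on m1 and polls m2 at one-second intervals until the
# value shows up or the timeout expires (raising an Exception on failure).
repl.wait_for_replication(m1, m2, timeout=20)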
Passed tickets/ticket47721_test.py::test_ticket47721_1 3.93
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47721_test:ticket47721_test.py:127 Running test 1... INFO  lib389:ticket47721_test.py:132 Add (M2) b"( ATticket47721-oid NAME 'ATticket47721' DESC 'test AT ticket 47721' SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 X-ORIGIN ( 'Test 47721' 'user defined' ) )" INFO  lib389:ticket47721_test.py:136 Chg (M2) b"( 2.16.840.1.113730.3.1.569 NAME 'cosPriority' DESC 'Netscape defined attribute type' SYNTAX 1.3.6.1.4.1.1466.115.121.1.27 X-ORIGIN 'Netscape Directory Server' )" INFO  lib389:ticket47721_test.py:140 Add (M2) b"( OCticket47721-oid NAME 'OCticket47721' DESC 'An group of related automount objects' SUP top STRUCTURAL MUST ou X-ORIGIN 'draft-howard-rfc2307bis' )" INFO  lib389:ticket47721_test.py:144 Chg (M2) b"( 5.3.6.1.1.1.2.0 NAME 'trustAccount' DESC 'Sets trust accounts information' SUP top AUXILIARY MUST trustModel MAY ( accessTo $ ou ) X-ORIGIN 'nss_ldap/pam_ldap' )" INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect c696c610-227e-453f-84ff-e597e0dbc7ff / got description=f72f5458-a4c6-42a7-b4a4-47648da9812f) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:158 Master 1 schemaCSN: b'5f9cde6e000000000000' DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:159 Master 2 schemaCSN: b'5f9cde6e000000000000'
Passed tickets/ticket47721_test.py::test_ticket47721_2 3.32
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47721_test:ticket47721_test.py:163 Running test 2... INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 2f202e2a-3f4d-4100-b6e5-14740f5f578a / got description=c696c610-227e-453f-84ff-e597e0dbc7ff) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:176 Master 1 schemaCSN: b'5f9cde6e000000000000' DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:177 Master 2 schemaCSN: b'5f9cde6e000000000000'
Passed tickets/ticket47721_test.py::test_ticket47721_3 13.79
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47721_test:ticket47721_test.py:195 Running test 3... INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:ticket47721_test.py:203 Update schema (M2) b"( ATtest3-oid NAME 'ATtest3' DESC 'test AT ticket 47721' SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 X-ORIGIN ( 'Test 47721' 'user defined' ) )" INFO  lib389:ticket47721_test.py:208 Update schema (M2) b"( OCtest3-oid NAME 'OCtest3' DESC 'An group of related automount objects' SUP top STRUCTURAL MUST ou X-ORIGIN 'draft-howard-rfc2307bis' )" INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5c701ccc-84e7-4095-842d-4046cedc63da / got description=2f202e2a-3f4d-4100-b6e5-14740f5f578a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:223 Master 1 schemaCSN: b'5f9cde6e000000000000' DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:224 Master 2 schemaCSN: b'5f9cde7b000000000000'
Passed tickets/ticket47721_test.py::test_ticket47721_4 5.99
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47721_test:ticket47721_test.py:245 Running test 4... INFO  lib389:ticket47721_test.py:248 Update schema (M1) b"( ATtest4-oid NAME 'ATtest4' DESC 'test AT ticket 47721' SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 X-ORIGIN ( 'Test 47721' 'user defined' ) )" INFO  lib389:ticket47721_test.py:252 Update schema (M1) b"( OCtest4-oid NAME 'OCtest4' DESC 'An group of related automount objects' SUP top STRUCTURAL MUST ou X-ORIGIN 'draft-howard-rfc2307bis' )" INFO  lib389:ticket47721_test.py:255 trigger replication M1->M2: to update the schema INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4fb81fd2-caf6-48e1-9f82-ea1ab39c04c1 / got description=5c701ccc-84e7-4095-842d-4046cedc63da) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4fb81fd2-caf6-48e1-9f82-ea1ab39c04c1 / got description=5c701ccc-84e7-4095-842d-4046cedc63da) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389:ticket47721_test.py:264 trigger replication M1->M2: to push the schema INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect fdf9c0c6-3729-4463-95e0-b61dbf8136c3 / got description=4fb81fd2-caf6-48e1-9f82-ea1ab39c04c1) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:276 Master 1 schemaCSN: b'5f9cde84000000000000' DEBUG  tests.tickets.ticket47721_test:ticket47721_test.py:277 Master 2 schemaCSN: b'5f9cde84000000000000'
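Note: tests 3 and 4 pause a replication agreement, change the schema on one side, and then let replication (and the schema push) catch up. A hedged sketch of that pattern using lib389's legacy agreement helpers; m1 and m2 again stand for connected DirSrv instances, and the attribute definition is illustrative:

agmts = m1.agreement.list(suffix='dc=example,dc=com')
m1.agreement.pause(agmts[0].dn)   # produces the "Pausing replication" line

# While the agreement is paused, schema edits land on one master only.
m2.schema.add_schema(
    'attributetypes',
    b"( ATtest3-oid NAME 'ATtest3' DESC 'test AT ticket 47721' "
    b"SYNTAX 1.3.6.1.4.1.1466.115.121.1.15 )")

m1.agreement.resume(agmts[0].dn)  # replication resumes, schema converges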
Passed tickets/ticket47787_test.py::test_ticket47787_init 3.99
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 5f397e86-9f38-4065-a117-d500904ccaaf / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect a353f2a6-0195-4069-a0a3-0630f41d519a / got description=5f397e86-9f38-4065-a117-d500904ccaaf) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47787_test.py:294 ######################### INITIALIZATION ###################### INFO  lib389:ticket47787_test.py:297 Add cn=bind_entry,dc=example,dc=com INFO  lib389:ticket47787_test.py:305 Add cn=staged user,dc=example,dc=com INFO  lib389:ticket47787_test.py:312 Add cn=accounts,dc=example,dc=com
Passed tickets/ticket47787_test.py::test_ticket47787_2 17.32
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47787_test.py:69 ############################################### INFO  lib389:ticket47787_test.py:70 ####### INFO  lib389:ticket47787_test.py:71 ####### test_ticket47787_2 INFO  lib389:ticket47787_test.py:72 ####### INFO  lib389:ticket47787_test.py:73 ############################################### INFO  lib389:ticket47787_test.py:59 Bind as cn=Directory Manager INFO  lib389:ticket47787_test.py:59 Bind as cn=Directory Manager INFO  lib389:ticket47787_test.py:159 ######################### Pause RA M1<->M2 ###################### INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:ticket47787_test.py:209 ######################### MOD cn=new_account18,cn=staged user,dc=example,dc=com (M2) ###################### INFO  lib389:ticket47787_test.py:200 ######################### DELETE new_account1 (M1) ###################### INFO  lib389:ticket47787_test.py:209 ######################### MOD cn=new_account18,cn=staged user,dc=example,dc=com (M2) ###################### INFO  lib389:ticket47787_test.py:209 ######################### MOD cn=new_account19,cn=staged user,dc=example,dc=com (M2) ###################### INFO  lib389:ticket47787_test.py:209 ######################### MOD cn=new_account1,cn=staged user,dc=example,dc=com (M2) ###################### INFO  lib389:ticket47787_test.py:209 ######################### MOD cn=new_account19,cn=staged user,dc=example,dc=com (M2) ###################### INFO  lib389:ticket47787_test.py:170 ######################### resume RA M1<->M2 ###################### INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:ticket47787_test.py:388 ######################### Check DEL replicated on M2 ###################### INFO  lib389:ticket47787_test.py:79 ######################### Tombstone on M1 ###################### INFO  lib389:ticket47787_test.py:92 ######################### Tombstone on M2 ###################### INFO  lib389:ticket47787_test.py:96 ######################### Description ###################### DEL M1 - MOD M2 INFO  lib389:ticket47787_test.py:97 M1 only INFO  lib389:ticket47787_test.py:108 M2 only INFO  lib389:ticket47787_test.py:119 M1 differs M2 INFO  lib389:ticket47787_test.py:409 ######################### Check MOD replicated on M1 ######################
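Note: the tombstone checks above are plain subtree searches for objectClass=nsTombstone, which normal searches never return unless the filter asks for it explicitly. A sketch of how the deleted entry can be inspected on either master (connection details are placeholders):

import ldap

conn = ldap.initialize('ldap://localhost:39001')  # placeholder
conn.simple_bind_s('cn=Directory Manager', 'password')
tombstones = conn.search_s(
    'dc=example,dc=com', ldap.SCOPE_SUBTREE,
    '(&(objectClass=nsTombstone)(cn=new_account1))',
    ['cn', 'nscpEntryDN'])
for dn, attrs in tombstones:
    print(dn, attrs)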
Passed tickets/ticket47808_test.py::test_ticket47808_run 3.93
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47808_test.py:36 Bind as cn=Directory Manager INFO  lib389:ticket47808_test.py:39 ######################### SETUP ATTR UNIQ PLUGIN ###################### INFO  lib389:ticket47808_test.py:46 ######################### ADD USER 1 ###################### INFO  lib389:ticket47808_test.py:55 Try to add dn: cn=test_entry 1, dc=example,dc=com cn: test_entry 1 objectclass: top objectclass: person sn: test_entry INFO  lib389:ticket47808_test.py:58 ######################### Restart Server ###################### INFO  lib389:ticket47808_test.py:62 ######################### ADD USER 2 ###################### INFO  lib389:ticket47808_test.py:71 Try to add dn: cn=test_entry 2, dc=example,dc=com cn: test_entry 2 objectclass: top objectclass: person sn: test_entry WARNING  lib389:ticket47808_test.py:75 Adding cn=test_entry 2, dc=example,dc=com failed INFO  lib389:ticket47808_test.py:78 ######################### IS SERVER UP? ###################### INFO  lib389:ticket47808_test.py:81 Yes, it's up. INFO  lib389:ticket47808_test.py:83 ######################### CHECK USER 2 NOT ADDED ###################### INFO  lib389:ticket47808_test.py:84 Try to search cn=test_entry 2, dc=example,dc=com INFO  lib389:ticket47808_test.py:88 Found none INFO  lib389:ticket47808_test.py:90 ######################### DELETE USER 1 ###################### INFO  lib389:ticket47808_test.py:92 Try to delete cn=test_entry 1, dc=example,dc=com
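Note: ticket47808 exercises the Attribute Uniqueness plugin in its old argument-based configuration, where arg0 names the attribute and arg1 the subtree. A hedged sketch of that setup (plugin DN and values are illustrative, and the plugin only picks the change up after a restart):

import ldap

PLUGIN_DN = 'cn=attribute uniqueness,cn=plugins,cn=config'

conn = ldap.initialize('ldap://localhost:38901')  # placeholder
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s(PLUGIN_DN, [
    (ldap.MOD_REPLACE, 'nsslapd-pluginEnabled', [b'on']),
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg0', [b'cn']),                 # attribute
    (ldap.MOD_REPLACE, 'nsslapd-pluginarg1', [b'dc=example,dc=com']),  # subtree
])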
Passed tickets/ticket47823_test.py::test_ticket47823_init 12.48
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47823_test.py::test_ticket47823_one_container_add 18.97
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:140 Uniqueness not enforced: create the entries INFO  lib389:ticket47823_test.py:155 Uniqueness enforced: checks second entry is rejected INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:140 Uniqueness not enforced: create the entries INFO  lib389:ticket47823_test.py:155 Uniqueness enforced: checks second entry is rejected
Passed tickets/ticket47823_test.py::test_ticket47823_one_container_mod 9.83
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MOD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:193 Uniqueness enforced: checks MOD ADD entry is rejected INFO  lib389:ticket47823_test.py:210 Uniqueness enforced: checks MOD REPLACE entry is rejected INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MOD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:193 Uniqueness enforced: checks MOD ADD entry is rejected INFO  lib389:ticket47823_test.py:210 Uniqueness enforced: checks MOD REPLACE entry is rejected
Passed tickets/ticket47823_test.py::test_ticket47823_one_container_modrdn 9.21
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MODRDN) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:237 Uniqueness enforced: checks MODRDN entry is rejected INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MODRDN) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:237 Uniqueness enforced: checks MODRDN entry is rejected
Passed tickets/ticket47823_test.py::test_ticket47823_multi_containers_add 9.54
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (ADD) INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_multi_containers_mod 9.81
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MOD) on separated container INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:525 Uniqueness not enforced: if same 'cn' modified (add/replace) on separated containers INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MOD) on separated container INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:531 Uniqueness not enforced: if same 'cn' modified (add/replace) on separated containers
Passed tickets/ticket47823_test.py::test_ticket47823_multi_containers_modrdn 9.25
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With former config (args), check attribute uniqueness with 'cn' (MODRDN) on separated containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ############################################### INFO  lib389:ticket47823_test.py:545 Uniqueness not enforced: checks MODRDN entry is accepted on separated containers INFO  lib389:ticket47823_test.py:548 Uniqueness not enforced: checks MODRDN entry is accepted on separated containers
Passed tickets/ticket47823_test.py::test_ticket47823_across_multi_containers_add 4.73
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (ADD) across several containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_across_multi_containers_mod 5.16
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MOD) across several containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_across_multi_containers_modrdn 4.89
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### With new config (args), check attribute uniqueness with 'cn' (MODRDN) across several containers INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_1 10.93
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg0 is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_2 9.41
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg1 is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_3 10.81
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg0 is missing but new config attrname exists INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_4 9.49
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (old): arg1 is missing but new config exists INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_5 9.45
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (new): uniqueness-attribute-name is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_6 9.83
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (new): uniqueness-subtrees is missing INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
Passed tickets/ticket47823_test.py::test_ticket47823_invalid_config_7 10.08
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47823_test.py:58 ############################################### INFO  lib389:ticket47823_test.py:59 ####### INFO  lib389:ticket47823_test.py:60 ####### Invalid config (new): uniqueness-subtrees are invalid INFO  lib389:ticket47823_test.py:61 ####### INFO  lib389:ticket47823_test.py:62 ###############################################
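Note: the valid and invalid ticket47823 cases contrast the old arg-style configuration with the named attributes the log quotes (uniqueness-attribute-name, uniqueness-subtrees). A hedged sketch of the new style; dropping either required attribute reproduces the invalid-config cases above:

import ldap

PLUGIN_DN = 'cn=attribute uniqueness,cn=plugins,cn=config'

conn = ldap.initialize('ldap://localhost:38901')  # placeholder
conn.simple_bind_s('cn=Directory Manager', 'password')
conn.modify_s(PLUGIN_DN, [
    (ldap.MOD_REPLACE, 'uniqueness-attribute-name', [b'cn']),
    (ldap.MOD_REPLACE, 'uniqueness-subtrees',
     [b'ou=People,dc=example,dc=com', b'ou=Groups,dc=example,dc=com']),
    # 'on' enforces uniqueness across all listed subtrees at once.
    (ldap.MOD_REPLACE, 'uniqueness-across-all-subtrees', [b'on']),
])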
Passed tickets/ticket47828_test.py::test_ticket47828_init 4.92
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47828_test.py::test_ticket47828_run_0 0.14
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_1 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_2 0.27
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add a staged entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_3 0.27
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### NO exclude scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_4 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude the provisioning container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_5 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_6 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_7 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_8 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_9 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_10 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_11 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude (in addition) the dummy container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_12 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_13 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_14 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_15 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_16 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add an dummy entry and check its ALLOCATED_ATTR not is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_17 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning/Dummy excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_18 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude PROVISIONING and a wrong container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_19 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_20 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_21 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_22 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_23 0.04
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_24 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Provisioning+wrong container excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_25 0.01
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Exclude a wrong container INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_26 0.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_27 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add an active entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_28 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is not set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_29 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add a staged entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_30 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is set INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47828_test.py::test_ticket47828_run_31 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47828_test.py:42 ############################################### INFO  lib389:ticket47828_test.py:43 ####### INFO  lib389:ticket47828_test.py:44 ####### Wrong container excluded scope: Add an dummy entry and check its ALLOCATED_ATTR is unchanged (!= magic) INFO  lib389:ticket47828_test.py:45 ####### INFO  lib389:ticket47828_test.py:46 ###############################################
Passed tickets/ticket47829_test.py::test_ticket47829_init 4.58
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_1 2.17
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an active user to an active group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_2 2.04
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Active user to a Stage group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_3 2.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Active user to a out of scope group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=out group,cn=out,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_1 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Stage user to a Active group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_2 2.07
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Stage user to a Stage group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_3 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an Stage user to a out of scope group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=out group,cn=out,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_out_user_1 2.08
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an out of scope user to an active group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=out guy,cn=out,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_out_user_2 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an out of scope user to a Stage group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com: member ->b'cn=out guy,cn=out,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=stage group,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_out_user_3 2.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### MOD: add an out of scope user to an out of scope group INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=out group,cn=out,dc=example,dc=com: member ->b'cn=out guy,cn=out,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=out guy,cn=out,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=out group,cn=out,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_modrdn_active_user_1 2.11
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active user to a Active group. Then move Active user to Active INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=xactive guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=xactive guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=xactive guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_modrdn_stage_user_1 1.12
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active user to a Active group. Then move Active user to Stage INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_active_user_modrdn_out_user_1 1.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active user to a Active group. Then move Active user to out of scope INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_modrdn_1 1.07
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Stage user to a Active group. Then move Stage user to Active INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_modrdn_active_user_1 1.06
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Stage user to a Active group. Then move Stage user to Active INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_mod_stage_user_modrdn_stage_user_1 0.00
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Stage user to a Active group. Then move Stage user to Stage INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### Return because it requires a fix for 47833 INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ###############################################
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_1 2.06
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add active user to G1 INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:172 delete entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_2 1.12
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add active user to G1. Then move active user to stage INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_3 1.12
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add active user to G1. Then move active user to out of the scope INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=active guy,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=active guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ###################### INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=active guy ######################
Passed tickets/ticket47829_test.py::test_ticket47829_indirect_active_group_4 1.06
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47829_test.py:62 ############################################### INFO  lib389:ticket47829_test.py:63 ####### INFO  lib389:ticket47829_test.py:64 ####### add an Active group (G1) to an active group (G0). Then add stage user to G1. Then move user to active. Then move it back INFO  lib389:ticket47829_test.py:65 ####### INFO  lib389:ticket47829_test.py:66 ############################################### INFO  lib389:ticket47829_test.py:172 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:173 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:96 !!!!!!! cn=stage guy,cn=accounts,cn=in,dc=example,dc=com: memberof->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com' INFO  lib389:ticket47829_test.py:132 ######################### MODRDN cn=stage guy ###################### INFO  lib389:ticket47829_test.py:116 !!!!!!! cn=indirect active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=active group,cn=accounts,cn=in,dc=example,dc=com'
Passed tickets/ticket47833_test.py::test_ticket47829_init 4.68
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47833_test.py::test_ticket47829_mod_stage_user_modrdn_stage_user_1 1.29
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47833_test.py:58 ############################################### INFO  lib389:ticket47833_test.py:59 ####### INFO  lib389:ticket47833_test.py:60 ####### add an Stage user to a Active group. Then move Stage user to Stage INFO  lib389:ticket47833_test.py:61 ####### INFO  lib389:ticket47833_test.py:62 ############################################### INFO  lib389:ticket47833_test.py:145 add entry cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com INFO  lib389:ticket47833_test.py:146 to group cn=active group,cn=accounts,cn=in,dc=example,dc=com INFO  lib389:ticket47833_test.py:112 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com' INFO  lib389:ticket47833_test.py:128 ######################### MODRDN cn=xstage guy ###################### INFO  lib389:ticket47833_test.py:112 !!!!!!! cn=active group,cn=accounts,cn=in,dc=example,dc=com: member ->b'cn=stage guy,cn=staged users,cn=provisioning,cn=in,dc=example,dc=com'
Passed tickets/ticket47869MMR_test.py::test_ticket47869_init 13.42
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 1045866f-a8e5-447c-b58d-e34894d59306 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3320a58d-9ae3-44a1-902b-8fc66069bc0e / got description=1045866f-a8e5-447c-b58d-e34894d59306) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47869MMR_test.py:51 Add cn=bind_entry, dc=example,dc=com INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect aa4f9472-c721-4be1-9167-bcb5a77bd25a / got description=3320a58d-9ae3-44a1-902b-8fc66069bc0e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 8b865dfa-6f2d-49be-98ad-93e1d637ba44 / got description=aa4f9472-c721-4be1-9167-bcb5a77bd25a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 74ae8a6b-45a0-47a2-86b3-e422c23517ce / got description=8b865dfa-6f2d-49be-98ad-93e1d637ba44) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 048dc520-340d-42c4-b878-9d62380eef56 / got description=74ae8a6b-45a0-47a2-86b3-e422c23517ce) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect b9cffde5-b30d-401d-9cbd-6060c2371895 / got description=048dc520-340d-42c4-b878-9d62380eef56) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e25961c3-55fd-4b62-a185-6d0eb42703bc / got description=b9cffde5-b30d-401d-9cbd-6060c2371895) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 7289cbd2-4a9e-4737-b7da-0e432fe6639b / got description=e25961c3-55fd-4b62-a185-6d0eb42703bc) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from 
ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 92e627f0-f464-4914-82fc-38be3c249a5e / got description=7289cbd2-4a9e-4737-b7da-0e432fe6639b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 78082beb-54f7-4900-bdc8-2a9e8420493a / got description=92e627f0-f464-4914-82fc-38be3c249a5e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect d97dcfe4-8727-45fc-88eb-027209e246c2 / got description=78082beb-54f7-4900-bdc8-2a9e8420493a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e72ffa78-c900-4ee4-8479-259ed29cd44c / got description=d97dcfe4-8727-45fc-88eb-027209e246c2) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed tickets/ticket47869MMR_test.py::test_ticket47869_check 0.14
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47869MMR_test.py:93 ######################### CHECK nscpentrywsi ###################### INFO  lib389:ticket47869MMR_test.py:95 ##### Master1: Bind as cn=Directory Manager ##### INFO  lib389:ticket47869MMR_test.py:98 Master1: Calling search_ext... INFO  lib389:ticket47869MMR_test.py:102 27 results INFO  lib389:ticket47869MMR_test.py:104 Results: INFO  lib389:ticket47869MMR_test.py:106 dn: dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: ou=groups,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: ou=people,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: ou=services,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: uid=demo_user,ou=people,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=demo_group,ou=groups,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=group_admin,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=group_modify,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=user_admin,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=user_modify,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=user_passwd_reset,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=user_private_read,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=replication_managers,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702,ou=services,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=bind_entry,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry0,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry1,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry2,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry3,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry4,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry5,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry6,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry7,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry8,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:106 dn: cn=test_entry9,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:110 Master1: count of nscpentrywsi: 27 INFO  lib389:ticket47869MMR_test.py:112 ##### Master2: Bind as cn=Directory Manager ##### INFO  lib389:ticket47869MMR_test.py:115 Master2: Calling search_ext... 
INFO  lib389:ticket47869MMR_test.py:119 27 results INFO  lib389:ticket47869MMR_test.py:121 Results: INFO  lib389:ticket47869MMR_test.py:123 dn: dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: ou=groups,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: ou=people,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: ou=services,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=replication_managers,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=demo_group,ou=groups,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: uid=demo_user,ou=people,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=group_admin,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=group_modify,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=user_admin,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=user_modify,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=user_passwd_reset,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=user_private_read,ou=permissions,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63701,ou=services,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:63702,ou=services,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=bind_entry,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry0,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry1,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry2,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry3,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry4,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry5,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry6,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry7,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry8,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:123 dn: cn=test_entry9,dc=example,dc=com INFO  lib389:ticket47869MMR_test.py:127 Master2: count of nscpentrywsi: 27 INFO  lib389:ticket47869MMR_test.py:130 ##### Master1: Bind as cn=bind_entry, dc=example,dc=com ##### INFO  lib389:ticket47869MMR_test.py:133 Master1: Calling search_ext... INFO  lib389:ticket47869MMR_test.py:137 27 results INFO  lib389:ticket47869MMR_test.py:143 Master1: count of nscpentrywsi: 0 INFO  lib389:ticket47869MMR_test.py:146 ##### Master2: Bind as cn=bind_entry, dc=example,dc=com ##### INFO  lib389:ticket47869MMR_test.py:149 Master2: Calling search_ext... INFO  lib389:ticket47869MMR_test.py:153 27 results INFO  lib389:ticket47869MMR_test.py:159 Master2: count of nscpentrywsi: 0 INFO  lib389:ticket47869MMR_test.py:162 ##### Master1: Bind as anonymous ##### INFO  lib389:ticket47869MMR_test.py:165 Master1: Calling search_ext... INFO  lib389:ticket47869MMR_test.py:169 27 results INFO  lib389:ticket47869MMR_test.py:175 Master1: count of nscpentrywsi: 0 INFO  lib389:ticket47869MMR_test.py:178 ##### Master2: Bind as anonymous ##### INFO  lib389:ticket47869MMR_test.py:181 Master2: Calling search_ext... 
INFO  lib389:ticket47869MMR_test.py:185 27 results INFO  lib389:ticket47869MMR_test.py:191 Master2: count of nscpentrywsi: 0 INFO  lib389:ticket47869MMR_test.py:193 ##### ticket47869 was successfully verified. #####
Passed tickets/ticket47871_test.py::test_ticket47871_init 3.45
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 4d533189-786c-4054-9c5d-6242f0502e96 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47871_test.py:53 test_ticket47871_init topology_m1c1 <lib389.topologies.TopologyMain object at 0x7f61c301c7f0>
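The topology_m1c1 fixture shown in the setup log can be reproduced with lib389's topology helper. A sketch, assuming the create_topology API from this lib389 release (ports and instance names follow the log above):

from lib389.topologies import create_topology
from lib389._constants import ReplicaRole

# One master plus one consumer, wired with a replication agreement from
# the master to the consumer (the "Joining consumer ..." step in the log).
topo = create_topology({ReplicaRole.MASTER: 1, ReplicaRole.CONSUMER: 1})
master = topo.ms["master1"]
consumer = topo.cs["consumer1"]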
Passed tickets/ticket47871_test.py::test_ticket47871_1 1.27
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47871_test.py:71 test_ticket47871_init: 10 entries ADDed other_entry[0..9] INFO  lib389:ticket47871_test.py:78 Added entries are INFO  lib389:ticket47871_test.py:80 changenumber=1,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=2,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=3,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=4,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=5,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=6,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=7,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=8,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=9,cn=changelog INFO  lib389:ticket47871_test.py:80 changenumber=10,cn=changelog
Passed tickets/ticket47871_test.py::test_ticket47871_2 12.03
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47871_test.py:93 Try no 1: 10 entries remain INFO  lib389:ticket47871_test.py:95 changenumber=1,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=2,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=3,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=4,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=5,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=6,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=7,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=8,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=9,cn=changelog INFO  lib389:ticket47871_test.py:95 changenumber=10,cn=changelog INFO  lib389:ticket47871_test.py:93 Try no 2: 1 entry remains INFO  lib389:ticket47871_test.py:95 changenumber=10,cn=changelog
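The two passes above show retro changelog trimming: 10 changenumber entries shrink to 1 once the trim interval fires. The entries are ordinary LDAP entries under cn=changelog, so they can be listed with a subtree search; a small sketch (URI and credentials are placeholders):

import ldap

conn = ldap.initialize('ldap://localhost:39001')
conn.simple_bind_s('cn=Directory Manager', 'password')

# Each recorded update appears as changenumber=<n>,cn=changelog;
# trimming removes the oldest entries first.
entries = conn.search_s('cn=changelog', ldap.SCOPE_ONELEVEL,
                        '(changenumber>=1)', ['changenumber'])
for dn, _attrs in entries:
    print(dn)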
Passed tickets/ticket47900_test.py::test_ticket47900 0.85
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47900_test.py:52 Creating Password Administrator entry cn=passwd_admin,dc=example,dc=com... INFO  lib389:ticket47900_test.py:62 Configuring password policy... INFO  lib389:ticket47900_test.py:74 Add aci to allow password admin to add/update entries... INFO  lib389:ticket47900_test.py:87 Bind as the Password Administrator (before activating)... INFO  lib389:ticket47900_test.py:101 Attempt to add entries with invalid passwords, these adds should fail... INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (2_Short)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password (2_Short) result (Constraint violation) INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (No_Number)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password (No_Number) result (Constraint violation) INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (N0Special)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password (N0Special) result (Constraint violation) INFO  lib389:ticket47900_test.py:105 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==)... INFO  lib389:ticket47900_test.py:111 Add failed as expected: password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==) result (Constraint violation) INFO  lib389:ticket47900_test.py:123 Activate the Password Administrator... INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (2_Short)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (No_Number)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password (N0Special)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:139 Create a regular user entry cn=Joe Schmo,dc=example,dc=com with password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==)... INFO  lib389:ticket47900_test.py:142 Successfully added entry (cn=Joe Schmo,dc=example,dc=com) INFO  lib389:ticket47900_test.py:155 Deactivate Password Administrator and try invalid password updates... INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password (2_Short) result (Constraint violation) INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password (No_Number) result (Constraint violation) INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password (N0Special) result (Constraint violation) INFO  lib389:ticket47900_test.py:177 Password update failed as expected: password ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==) result (Constraint violation) INFO  lib389:ticket47900_test.py:188 Activate Password Administrator and try updates again... INFO  lib389:ticket47900_test.py:205 Password update succeeded (2_Short) INFO  lib389:ticket47900_test.py:205 Password update succeeded (No_Number) INFO  lib389:ticket47900_test.py:205 Password update succeeded (N0Special) INFO  lib389:ticket47900_test.py:205 Password update succeeded ({SSHA}bBy8UdtPZwu8uZna9QOYG3Pr41RpIRVDl8wddw==)
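The sequence above is the password-administrator contract: while the bind DN is not an active password administrator, adds and updates that break the syntax policy fail with CONSTRAINT VIOLATION; once activated, the same operations succeed. A hedged sketch of the rejected add (entry contents and credentials are illustrative):

import ldap
from ldap import modlist

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=passwd_admin,dc=example,dc=com', 'admin_password')

entry = {
    'objectclass': [b'top', b'person'],
    'sn': [b'Schmo'],
    'cn': [b'Joe Schmo'],
    'userpassword': [b'2_Short'],  # too short for the configured policy
}
try:
    conn.add_s('cn=Joe Schmo,dc=example,dc=com', modlist.addModlist(entry))
except ldap.CONSTRAINT_VIOLATION:
    # Expected while the bind DN is not an active password administrator.
    print('Add rejected by the password policy, as expected')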
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_start_end_positive 5.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.utils:ticket47910_test.py:36 Disable access log buffering INFO  lib389.utils:ticket47910_test.py:39 Do an ldapsearch operation INFO  lib389.utils:ticket47910_test.py:42 sleep for some time so that the access log file gets generated
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:79 Running test_ticket47910 - Execute logconv.pl -S -E with random values INFO  lib389.utils:ticket47910_test.py:81 taking current time with offset of 2 mins and formatting it to feed -S INFO  lib389.utils:ticket47910_test.py:85 taking current time with offset of 2 mins and formatting it to feed -E INFO  lib389.utils:ticket47910_test.py:89 Executing logconv.pl with -S and -E INFO  lib389.utils:ticket47910_test.py:61 Executing logconv.pl with -S current time and -E end time INFO  lib389.utils:ticket47910_test.py:63 /usr/bin/logconv.pl -S [30/Oct/2020:23:54:28] -E [30/Oct/2020:23:58:28] /var/log/dirsrv/slapd-standalone1/access INFO  lib389.utils:ticket47910_test.py:66 standard output
Access Log Analyzer 8.2
Command: logconv.pl /var/log/dirsrv/slapd-standalone1/access
Processing 1 Access Log(s)...
[001] /var/log/dirsrv/slapd-standalone1/access size (bytes): 9327
Total Log Lines Analysed: 77
----------- Access Log Output ------------
Start of Logs: 30/Oct/2020:23:54:28
End of Logs: 30/Oct/2020:23:56:27.519292339
Processed Log Time: 0 Hours, 1 Minutes, 59.519289344 Seconds
Restarts: 2
Peak Concurrent Connections: 1
Total Operations: 35
Total Results: 33
Overall Performance: 94.3%
Total Connections: 3 (0.03/sec) (1.51/min)
 - LDAP Connections: 1 (0.01/sec) (0.50/min)
 - LDAPI Connections: 2 (0.02/sec) (1.00/min)
 - LDAPS Connections: 0 (0.00/sec) (0.00/min)
 - StartTLS Extended Ops: 0 (0.00/sec) (0.00/min)
Searches: 8 (0.07/sec) (4.02/min)
Modifications: 4 (0.03/sec) (2.01/min)
Adds: 18 (0.15/sec) (9.04/min)
Deletes: 0 (0.00/sec) (0.00/min)
Mod RDNs: 0 (0.00/sec) (0.00/min)
Compares: 0 (0.00/sec) (0.00/min)
Binds: 5 (0.04/sec) (2.51/min)
Average wtime (wait time): 0.000409625
Average optime (op time): 0.015052586
Average etime (elapsed time): 0.015452119
Proxied Auth Operations: 0
Persistent Searches: 0
Internal Operations: 0
Entry Operations: 0
Extended Operations: 0
Abandoned Requests: 0
Smart Referrals Received: 0
VLV Operations: 0
VLV Unindexed Searches: 0
VLV Unindexed Components: 0
SORT Operations: 0
Entire Search Base Queries: 1
Paged Searches: 0
Unindexed Searches: 0
Unindexed Components: 1
Invalid Attribute Filters: 0
FDs Taken: 3
FDs Returned: 2
Highest FD Taken: 65
Broken Pipes: 0
Connections Reset By Peer: 0
Resource Unavailable: 0
Max BER Size Exceeded: 0
Binds: 5
Unbinds: 1
--------------------------------
- LDAP v2 Binds: 0
- LDAP v3 Binds: 3
- AUTOBINDs(LDAPI): 2
- SSL Client Binds: 0
- Failed SSL Client Binds: 0
- SASL Binds: 2
 - EXTERNAL: 2
- Directory Manager Binds: 1
- Anonymous Binds: 0
Cleaning up temp files...
Done.
INFO  lib389.utils:ticket47910_test.py:67 standard errors
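The -S/-E values are just access-log-format timestamps built from the current time. A sketch of constructing them and invoking logconv.pl (the two-minute offsets mirror the log messages above; this is not the test's exact code):

import subprocess
from datetime import datetime, timedelta

# logconv.pl takes boundaries in access-log timestamp format.
FMT = '[%d/%b/%Y:%H:%M:%S]'
start = (datetime.now() - timedelta(minutes=2)).strftime(FMT)
end = (datetime.now() + timedelta(minutes=2)).strftime(FMT)

result = subprocess.run(
    ['/usr/bin/logconv.pl', '-S', start, '-E', end,
     '/var/log/dirsrv/slapd-standalone1/access'],
    capture_output=True, text=True)
print(result.stdout)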
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_start_end_negative 0.17
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:105 Running test_ticket47910 - Execute logconv.pl -S -E with starttime>endtime INFO  lib389.utils:ticket47910_test.py:107 taking current time with offset of 2 mins and formatting it to feed -S INFO  lib389.utils:ticket47910_test.py:111 taking current time with offset of 2 mins and formatting it to feed -E INFO  lib389.utils:ticket47910_test.py:115 Executing logconv.pl with -S and -E INFO  lib389.utils:ticket47910_test.py:61 Executing logconv.pl with -S current time and -E end time INFO  lib389.utils:ticket47910_test.py:63 /usr/bin/logconv.pl -S [30/Oct/2020:23:58:34] -E [30/Oct/2020:23:54:34] /var/log/dirsrv/slapd-standalone1/access INFO  lib389.utils:ticket47910_test.py:66 standard output
Access Log Analyzer 8.2
Command: logconv.pl /var/log/dirsrv/slapd-standalone1/access
Start time ([30/Oct/2020:23:58:34]) is greater than end time ([30/Oct/2020:23:54:34])!
Cleaning up temp files...
Done.
INFO  lib389.utils:ticket47910_test.py:67 standard errors
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_start_end_invalid 0.18
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:128 Running test_ticket47910 - Execute logconv.pl -S -E with invalid timestamp INFO  lib389.utils:ticket47910_test.py:129 Set start time and end time to invalid values INFO  lib389.utils:ticket47910_test.py:133 Executing logconv.pl with -S and -E INFO  lib389.utils:ticket47910_test.py:61 Executing logconv.pl with -S current time and -E end time INFO  lib389.utils:ticket47910_test.py:63 /usr/bin/logconv.pl -S invalid -E invalid /var/log/dirsrv/slapd-standalone1/access INFO  lib389.utils:ticket47910_test.py:66 standard output
Access Log Analyzer 8.2
Command: logconv.pl /var/log/dirsrv/slapd-standalone1/access
The date string (invalid) is invalid, exiting...
Cleaning up temp files...
Done.
INFO  lib389.utils:ticket47910_test.py:67 standard errors
Passed tickets/ticket47910_test.py::test_ticket47910_logconv_noaccesslogs 0.18
-------------------------------Captured log call--------------------------------
INFO  lib389.utils:ticket47910_test.py:147 Running test_ticket47910 - Execute logconv.pl without access logs INFO  lib389.utils:ticket47910_test.py:149 taking current time with offset of 2 mins and formatting it to feed -S INFO  lib389.utils:ticket47910_test.py:152 Executing logconv.pl with -S current time INFO  lib389.utils:ticket47910_test.py:154 /usr/bin/logconv.pl -S [30/Oct/2020:23:54:34] INFO  lib389.utils:ticket47910_test.py:157 standard output
There are no access logs specified, or the tool options have not been used correctly!
Cleaning up temp files...
Done.
INFO  lib389.utils:ticket47910_test.py:158 standard errors
Passed tickets/ticket47920_test.py::test_ticket47920_init 0.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47920_test.py::test_ticket47920_mod_readentry_ctrl 0.01
------------------------------Captured stdout call------------------------------
['final description']
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket47920_test.py:65 ############################################### INFO  lib389:ticket47920_test.py:66 ####### INFO  lib389:ticket47920_test.py:67 ####### MOD: with a readentry control INFO  lib389:ticket47920_test.py:68 ####### INFO  lib389:ticket47920_test.py:69 ############################################### INFO  lib389:ticket47920_test.py:106 Check the initial value of the entry
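The captured stdout (['final description']) is the point of this case: with a post-read entry control attached, the modify response itself carries the entry's new value, so no follow-up search is needed. A python-ldap sketch of the same pattern (target DN and value are illustrative):

import ldap
from ldap.controls.readentry import PostReadControl

conn = ldap.initialize('ldap://localhost:38901')
conn.simple_bind_s('cn=Directory Manager', 'password')

# Ask the server to return 'description' as it looks right after the MOD,
# inside the modify response itself.
ctrl = PostReadControl(criticality=True, attrList=['description'])
msgid = conn.modify_ext('dc=example,dc=com',
                        [(ldap.MOD_REPLACE, 'description', b'final description')],
                        serverctrls=[ctrl])
_rtype, _rdata, _rmsgid, resp_ctrls = conn.result3(msgid)
for c in resp_ctrls:
    if isinstance(c, PostReadControl):
        print(c.entry['description'])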
Passed tickets/ticket47921_test.py::test_ticket47921 0.10
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47921_test:ticket47921_test.py:81 Test complete
Passed tickets/ticket47927_test.py::test_ticket47927_init 4.71
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47927_test.py::test_ticket47927_one 0.07
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:100 test_ticket47927_one: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:111 test_ticket47927_one: Failed (expected) to set the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com: Constraint violation
Passed tickets/ticket47927_test.py::test_ticket47927_two 4.42
No log output captured.
Passed tickets/ticket47927_test.py::test_ticket47927_three 0.05
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:151 test_ticket47927_three: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:158 test_ticket47927_three: succeeded in setting the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com
Passed tickets/ticket47927_test.py::test_ticket47927_four 0.26
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:176 test_ticket47927_four: succeeded in setting the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:198 test_ticket47927_four: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation
Passed tickets/ticket47927_test.py::test_ticket47927_five 4.13
No log output captured.
Passed tickets/ticket47927_test.py::test_ticket47927_six 0.27
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:240 test_ticket47927_six: Failed (expected) to set the telephonenumber for cn=test_2,cn=enforced_container,dc=example,dc=com: Constraint violation CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:247 test_ticket47927_six: succeeded in setting the telephonenumber for cn=test_3,cn=excluded_container,dc=example,dc=com CRITICAL tests.tickets.ticket47927_test:ticket47927_test.py:256 test_ticket47927_six: succeeded in setting the telephonenumber for cn=test_4,cn=excluded_bis_container,dc=example,dc=com
Passed tickets/ticket47931_test.py::test_ticket47931 62.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
Exception in thread Thread-65:
Traceback (most recent call last):
  File "/usr/lib64/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/export/tests/tickets/ticket47931_test.py", line 36, in run
    conn.set_option(ldap.OPT_TIMEOUT, self.timeout)
  File "/usr/local/lib/python3.8/site-packages/lib389/__init__.py", line 180, in inner
    return f(*args, **kwargs)
  File "/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py", line 937, in set_option
    return self._ldap_call(self._l.set_option,option,invalue)
  File "/usr/local/lib64/python3.8/site-packages/ldap/ldapobject.py", line 362, in __getattr__
    raise AttributeError('%s has no attribute %s' % (
AttributeError: DirSrv has no attribute '_l'
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=dc=deadlock INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=deadlock,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=deadlock,cn=ldbm database,cn=plugins,cn=config cn: deadlock nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/deadlock nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=deadlock objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=deadlock",cn=mapping tree,cn=config cn: dc=deadlock nsslapd-backend: deadlock nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=dc\3Ddeadlock,cn=mapping tree,cn=config cn: dc=deadlock nsslapd-backend: deadlock nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  tests.tickets.ticket47931_test:ticket47931_test.py:142 Adding members to the group... INFO  tests.tickets.ticket47931_test:ticket47931_test.py:158 Finished adding members to the group. INFO  tests.tickets.ticket47931_test:ticket47931_test.py:164 Test complete
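Before the stress run, the log shows a second backend being created for dc=deadlock, together with its mapping tree entry. A sketch of the same step through lib389's Backends API ('standalone' stands for an existing, started DirSrv instance):

from lib389.backend import Backends

backends = Backends(standalone)
# Creating the backend also creates the cn="dc=deadlock",cn=mapping tree
# entry seen in the log above.
deadlock_be = backends.create(properties={
    'cn': 'deadlock',                 # backend name under cn=ldbm database
    'nsslapd-suffix': 'dc=deadlock',  # suffix served by this backend
})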
Passed tickets/ticket47953_test.py::test_ticket47953 5.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:498 Import task import_10302020_235859 for file /var/lib/dirsrv/slapd-standalone1/ldif/ticket47953.ldif completed successfully
Passed tickets/ticket47963_test.py::test_ticket47963 7.98
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47963_test:ticket47963_test.py:145 Test complete
Passed tickets/ticket47970_test.py::test_ticket47970 0.04
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47976_test.py::test_ticket47976_init 4.71
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47976_test.py::test_ticket47976_1 3.62
No log output captured.
Passed tickets/ticket47976_test.py::test_ticket47976_2 4.12
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47976_test:ticket47976_test.py:99 Test complete INFO  tests.tickets.ticket47976_test:ticket47976_test.py:104 Export LDIF file... INFO  lib389:tasks.py:567 Export task export_10312020_000000 for file /var/lib/dirsrv/slapd-standalone1/ldif/export.ldif completed successfully INFO  tests.tickets.ticket47976_test:ticket47976_test.py:115 Import LDIF file... INFO  lib389:tasks.py:498 Import task import_10312020_000002 for file /var/lib/dirsrv/slapd-standalone1/ldif/export.ldif completed successfully
Passed tickets/ticket47976_test.py::test_ticket47976_3 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket47976_test:ticket47976_test.py:131 Testing if the delete will hang or not INFO  tests.tickets.ticket47976_test:ticket47976_test.py:150 user0 was correctly deleted INFO  tests.tickets.ticket47976_test:ticket47976_test.py:150 user1 was correctly deleted
Passed tickets/ticket47980_test.py::test_ticket47980 2.48
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket47981_test.py::test_ticket47981 3.62
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=o=netscaperoot INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=netscaperoot,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=netscaperoot,cn=ldbm database,cn=plugins,cn=config cn: netscaperoot nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/netscaperoot nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: o=netscaperoot objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="o=netscaperoot",cn=mapping tree,cn=config cn: o=netscaperoot nsslapd-backend: netscaperoot nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=o\3Dnetscaperoot,cn=mapping tree,cn=config cn: o=netscaperoot nsslapd-backend: netscaperoot nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
Passed tickets/ticket48005_test.py::test_ticket48005_memberof 13.77
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:86 Ticket 48005 memberof test... INFO  lib389:tasks.py:877 fixupMemberOf task fixupmemberof_10312020_000138 for basedn dc=example,dc=com completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:110 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:119 Ticket 48005 memberof test complete
Passed tickets/ticket48005_test.py::test_ticket48005_automember 20.88
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:138 Ticket 48005 automember test... INFO  tests.tickets.ticket48005_test:ticket48005_test.py:143 Adding automember config INFO  lib389:tasks.py:986 Automember Rebuild Membership task (task-10312020_000151) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:176 No core files are found INFO  lib389:tasks.py:1039 Automember Export Updates task (task-10312020_000155) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:198 No core files are found INFO  lib389:tasks.py:1087 Automember Map Updates task (task-10312020_000159) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:222 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:231 Ticket 48005 automember test complete
Passed tickets/ticket48005_test.py::test_ticket48005_syntaxvalidate 3.80
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:241 Ticket 48005 syntax validate test... INFO  lib389:tasks.py:1255 Syntax Validate task (task-10312020_000208) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:261 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:265 Ticket 48005 syntax validate test complete
Passed tickets/ticket48005_test.py::test_ticket48005_usn 13.51
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:277 Ticket 48005 usn test... INFO  tests.tickets.ticket48005_test:ticket48005_test.py:285 No user entries. INFO  lib389:tasks.py:1304 USN tombstone cleanup task (task-10312020_000216) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:316 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:324 Ticket 48005 usn test complete
Passed tickets/ticket48005_test.py::test_ticket48005_schemareload 3.71
------------------------------Captured stderr call------------------------------
ls: cannot access '/var/log/dirsrv/slapd-standalone1/core*': No such file or directory
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48005_test:ticket48005_test.py:334 Ticket 48005 schema reload test... INFO  lib389:tasks.py:1169 Schema Reload task (task-10312020_000225) completed successfully INFO  tests.tickets.ticket48005_test:ticket48005_test.py:354 No core files are found INFO  tests.tickets.ticket48005_test:ticket48005_test.py:358 Ticket 48005 schema reload test complete
Passed tickets/ticket48026_test.py::test_ticket48026 5.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48026_test:ticket48026_test.py:114 Test complete
Passed tickets/ticket48109_test.py::test_ticket48109 29.07
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48109_test:ticket48109_test.py:32 Test case 0 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:77 match: conn=1 op=3 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:85 l1: [31/Oct/2020:00:03:17.532234676 -0400] conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=0.000419889 optime=0.000557270 etime=0.000971614 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:88 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:93 Entry uid=a* found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:100 Test case 0 - OK - substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:119 Test case 1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:163 match: conn=1 op=3 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:171 l1: [31/Oct/2020:00:03:17.532234676 -0400] conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=0.000419889 optime=0.000557270 etime=0.000971614 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:174 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:179 Entry uid=*b found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:186 Test case 1 - OK - substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:208 Test case 2 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:259 match: conn=1 op=3 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:267 l1: [31/Oct/2020:00:03:17.532234676 -0400] conn=1 op=3 RESULT err=0 tag=101 nentries=1 wtime=0.000419889 optime=0.000557270 etime=0.000971614 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:270 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:275 Entry uid=c* found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:282 Test case 2-1 - OK - correct substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:294 match: conn=1 op=4 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:302 l1: [31/Oct/2020:00:03:36.955003362 -0400] conn=1 op=4 RESULT err=0 tag=101 nentries=1 wtime=0.000175374 optime=0.000353559 etime=0.000526388 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:305 match: nentries=1 INFO  tests.tickets.ticket48109_test:ticket48109_test.py:310 Entry uid=*2 found. INFO  tests.tickets.ticket48109_test:ticket48109_test.py:317 Test case 2-2 - OK - correct substr index used INFO  tests.tickets.ticket48109_test:ticket48109_test.py:331 Testcase PASSED
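Each case greps the access log for the RESULT line of a known operation and verifies err=0 and nentries=1, which proves the substring index answered the filter. A small parsing sketch (the regex is an assumption shaped by the l1: lines above):

import re

line = ('[31/Oct/2020:00:03:17.532234676 -0400] conn=1 op=3 RESULT err=0 '
        'tag=101 nentries=1 wtime=0.000419889 optime=0.000557270 etime=0.000971614')

m = re.search(r'conn=1 op=3 RESULT err=(\d+) .*nentries=(\d+)', line)
if m:
    err, nentries = int(m.group(1)), int(m.group(2))
    # The substr-index cases above expect exactly one entry and no error.
    assert err == 0 and nentries == 1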
Passed tickets/ticket48170_test.py::test_ticket48170 0.01
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48170_test:ticket48170_test.py:29 Index update correctly rejected INFO  tests.tickets.ticket48170_test:ticket48170_test.py:36 Test complete
Passed tickets/ticket48194_test.py::test_init 7.45
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Testing Ticket 48194 - harden the list of ciphers available by default INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:57 ######################### enable SSL in the directory server with all ciphers ######################
Passed tickets/ticket48194_test.py::test_run_0 6.18
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 1 - Check the ciphers availability for "+all"; allowWeakCipher: on INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:131 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake successfully INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect to handshake successfully INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Passed tickets/ticket48194_test.py::test_run_3 5.96
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 4 - Check the ciphers availability for "-all" INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:199 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
Passed tickets/ticket48194_test.py::test_run_9 7.44
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 10 - Check no nsSSL3Ciphers (default setting) with no errorlog-level & allowWeakCipher on INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:316 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect to handshake successfully INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect to handshake successfully INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256\n'
Passed tickets/ticket48194_test.py::test_run_11 5.48
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48194_test.py:40 ############################################### INFO  lib389:ticket48194_test.py:41 ####### Test Case 12 - Check nsSSL3Ciphers: +fortezza, which is not supported INFO  lib389:ticket48194_test.py:42 ############################################### INFO  lib389.utils:ticket48194_test.py:337 ######################### Restarting the server ###################### INFO  lib389.utils:ticket48194_test.py:86 Testing DES-CBC3-SHA -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher DES-CBC3-SHA INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n' INFO  lib389.utils:ticket48194_test.py:86 Testing AES256-SHA256 -- expect the handshake to fail INFO  lib389.utils:ticket48194_test.py:92 Running cmdline: /usr/bin/openssl s_client -connect localhost:63601 -cipher AES256-SHA256 INFO  lib389.utils:ticket48194_test.py:105 Found: b'New, (NONE), Cipher is (NONE)\n'
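Every cipher case shells out to openssl s_client and inspects the negotiation summary: a named cipher means the handshake succeeded, while 'New, (NONE), Cipher is (NONE)' means the server refused it. A sketch of that probe (host, port, and check string come from the log above):

import subprocess

def handshake_ok(cipher, host='localhost', port=63601):
    # s_client completes the handshake and exits once stdin is closed.
    proc = subprocess.run(
        ['/usr/bin/openssl', 's_client', '-connect', f'{host}:{port}',
         '-cipher', cipher],
        input='', capture_output=True, text=True, timeout=30)
    # A refused cipher is reported as "New, (NONE), Cipher is (NONE)".
    return 'Cipher is (NONE)' not in proc.stdout

print(handshake_ok('AES256-SHA256'))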
Passed tickets/ticket48212_test.py::test_ticket48212 15.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
/bin/sh: /usr/sbin/dbverify: No such file or directory
/bin/sh: /usr/sbin/dbverify: No such file or directory
/bin/sh: /usr/sbin/dbverify: No such file or directory
/bin/sh: /usr/sbin/dbverify: No such file or directory
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48212_test.py:70 Bind as cn=Directory Manager INFO  lib389:ticket48212_test.py:83 ######################### Import Test data (/var/lib/dirsrv/slapd-standalone1/ldif/example1k_posix.ldif) ###################### INFO  lib389:tasks.py:498 Import task import_10312020_000538 for file /var/lib/dirsrv/slapd-standalone1/ldif/example1k_posix.ldif completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed INFO  lib389:ticket48212_test.py:92 ######################### Add index by uidnumber ###################### INFO  lib389:ticket48212_test.py:101 ######################### reindexing... ###################### INFO  lib389:ticket48212_test.py:47 +++++ reindex uidnumber +++++ INFO  lib389:tasks.py:798 Index task index_attrs_10312020_000540 completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed INFO  lib389:ticket48212_test.py:106 ######################### Add nsMatchingRule ###################### INFO  lib389:ticket48212_test.py:112 ######################### reindexing... ###################### INFO  lib389:ticket48212_test.py:47 +++++ reindex uidnumber +++++ INFO  lib389:tasks.py:798 Index task index_attrs_10312020_000545 completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed INFO  lib389:ticket48212_test.py:117 ######################### Delete nsMatchingRule ###################### INFO  lib389:ticket48212_test.py:47 +++++ reindex uidnumber +++++ INFO  lib389:tasks.py:798 Index task index_attrs_10312020_000549 completed successfully INFO  lib389:ticket48212_test.py:19 +++++ dbverify +++++ INFO  lib389:ticket48212_test.py:23 Running /usr/sbin/dbverify -Z standalone1 -V INFO  lib389:ticket48212_test.py:43 dbverify passed
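The cycle above is: import an LDIF, define a uidnumber index, reindex, and run dbverify after each step. The 'Index task index_attrs_*' lines come from lib389's task helper; a sketch, assuming the Tasks.reindex API ('standalone' is a placeholder for a live DirSrv instance):

from lib389.tasks import Tasks
from lib389._constants import TASK_WAIT

tasks = Tasks(standalone)
# Rebuild the uidnumber index and block until the task entry reports done.
tasks.reindex(suffix='dc=example,dc=com', attrname='uidNumber',
              args={TASK_WAIT: True})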
Passed tickets/ticket48214_test.py::test_ticket48214_run 0.33
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48214_test.py:83 Bind as cn=Directory Manager INFO  lib389:ticket48214_test.py:86 ######################### Out of Box ###################### INFO  lib389:ticket48214_test.py:44 +++++ Check Max Ber Size +++++ INFO  lib389:ticket48214_test.py:18 +++++ Get maxbersize from dse.ldif +++++ INFO  lib389:ticket48214_test.py:21 Run CMD: egrep nsslapd-maxbersize /etc/dirsrv/slapd-standalone1/dse.ldif INFO  lib389:ticket48214_test.py:28 Empty: INFO  lib389:ticket48214_test.py:50 No nsslapd-maxbersize found in dse.ldif INFO  lib389:ticket48214_test.py:63 ldapsearch returned nsslapd-maxbersize: b'2097152' INFO  lib389:ticket48214_test.py:72 Checking 2097152 vs 2097152 INFO  lib389:ticket48214_test.py:89 ######################### Add nsslapd-maxbersize: 0 ###################### INFO  lib389:ticket48214_test.py:44 +++++ Check Max Ber Size +++++ INFO  lib389:ticket48214_test.py:18 +++++ Get maxbersize from dse.ldif +++++ INFO  lib389:ticket48214_test.py:21 Run CMD: egrep nsslapd-maxbersize /etc/dirsrv/slapd-standalone1/dse.ldif INFO  lib389:ticket48214_test.py:35 Right format - nsslapd-maxbersize: 0 INFO  lib389:ticket48214_test.py:52 nsslapd-maxbersize: 0 INFO  lib389:ticket48214_test.py:63 ldapsearch returned nsslapd-maxbersize: b'2097152' INFO  lib389:ticket48214_test.py:72 Checking 2097152 vs 2097152 INFO  lib389:ticket48214_test.py:93 ######################### Add nsslapd-maxbersize: 10000 ###################### INFO  lib389:ticket48214_test.py:44 +++++ Check Max Ber Size +++++ INFO  lib389:ticket48214_test.py:18 +++++ Get maxbersize from dse.ldif +++++ INFO  lib389:ticket48214_test.py:21 Run CMD: egrep nsslapd-maxbersize /etc/dirsrv/slapd-standalone1/dse.ldif INFO  lib389:ticket48214_test.py:35 Right format - nsslapd-maxbersize: 10000 INFO  lib389:ticket48214_test.py:55 nsslapd-maxbersize: 10000 INFO  lib389:ticket48214_test.py:63 ldapsearch returned nsslapd-maxbersize: b'10000' INFO  lib389:ticket48214_test.py:98 ticket48214 was successfully verified.
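The check compares the value stored in dse.ldif with what a live search returns, treating a missing nsslapd-maxbersize (and an explicit 0) as the 2097152 default. A sketch of the file-side half (the path comes from the log; the parsing is deliberately simplified):

DEFAULT_MAXBERSIZE = 2097152

def maxbersize_from_dse(path='/etc/dirsrv/slapd-standalone1/dse.ldif'):
    # dse.ldif is plain LDIF; grep-style scanning is enough for one attribute.
    with open(path) as f:
        for line in f:
            if line.startswith('nsslapd-maxbersize:'):
                value = int(line.split(':', 1)[1].strip())
                # The test treats an explicit 0 like the absent default.
                return value or DEFAULT_MAXBERSIZE
    return DEFAULT_MAXBERSIZE

print(maxbersize_from_dse())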
Passed tickets/ticket48233_test.py::test_ticket48233 5.82
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48233_test:ticket48233_test.py:54 Test complete
Passed tickets/ticket48252_test.py::test_ticket48252_setup 0.43
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket48252_test.py::test_ticket48252_run_0 12.37
------------------------------Captured stderr call------------------------------
[31/Oct/2020:00:07:12.419523982 -0400] - INFO - slapd_exemode_db2index - Backend Instance: userRoot
[31/Oct/2020:00:07:12.427715357 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[31/Oct/2020:00:07:12.436526902 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[31/Oct/2020:00:07:12.440588917 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7476371456, process usage 23064576
[31/Oct/2020:00:07:12.448511023 -0400] - INFO - check_and_set_import_cache - Import allocates 2920457KB import cache.
[31/Oct/2020:00:07:12.726305063 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: cn
[31/Oct/2020:00:07:12.730313554 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem
[31/Oct/2020:00:07:12.742678030 -0400] - ERR - bdb_force_checkpoint - Checkpoint FAILED, error Invalid argument (22)
[31/Oct/2020:00:07:12.754915846 -0400] - INFO - bdb_db2index - userRoot: Finished indexing.
[31/Oct/2020:00:07:12.786560124 -0400] - INFO - bdb_pre_close - All database threads now stopped
-------------------------------Captured log call--------------------------------
INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db'] INFO  lib389:ticket48252_test.py:63 Did not find key test_user0 in dbscan output INFO  lib389:__init__.py:3014 Running script: ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db'] INFO  lib389:ticket48252_test.py:63 Did not find key test_user0 in dbscan output
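dbscan dumps the raw index database, so the assertion is simply that the removed user's key no longer shows up in cn.db. A sketch of that check (binary and database paths are taken from the log above):

import subprocess

out = subprocess.run(
    ['/usr/bin/dbscan', '-f', '/var/lib/dirsrv/slapd-standalone1/db/userRoot/cn.db'],
    capture_output=True, text=True).stdout
# Index keys are printed one per line; the deleted entry's key must be gone.
assert 'test_user0' not in out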
Passed tickets/ticket48252_test.py::test_ticket48252_run_1 4.44
------------------------------Captured stderr call------------------------------
[31/Oct/2020:00:07:20.908288879 -0400] - INFO - slapd_exemode_db2index - Backend Instance: userRoot
[31/Oct/2020:00:07:20.921374278 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[31/Oct/2020:00:07:20.930654717 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[31/Oct/2020:00:07:20.936837096 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7476297728, process usage 23162880
[31/Oct/2020:00:07:20.940809203 -0400] - INFO - check_and_set_import_cache - Import allocates 2920428KB import cache.
[31/Oct/2020:00:07:21.122721101 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: objectclass
[31/Oct/2020:00:07:21.131661096 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem
[31/Oct/2020:00:07:21.135274745 -0400] - ERR - bdb_force_checkpoint - Checkpoint FAILED, error Invalid argument (22)
[31/Oct/2020:00:07:21.147719406 -0400] - INFO - bdb_db2index - userRoot: Finished indexing.
[31/Oct/2020:00:07:21.174148089 -0400] - INFO - bdb_pre_close - All database threads now stopped
Passed tickets/ticket48265_test.py::test_ticket48265_test 0.65
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48265_test:ticket48265_test.py:34 Adding 20 test entries... INFO  tests.tickets.ticket48265_test:ticket48265_test.py:51 Search with Ticket 47521 type complex filter INFO  tests.tickets.ticket48265_test:ticket48265_test.py:60 Search with Ticket 48265 type complex filter INFO  tests.tickets.ticket48265_test:ticket48265_test.py:69 Test 48265 complete
Passed tickets/ticket48266_test.py::test_ticket48266_fractional 21.41
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4da1ca1d-33bb-450d-aab6-531c18a7c34b / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect e8e0af3a-1927-4a3a-8d15-291975433139 / got description=4da1ca1d-33bb-450d-aab6-531c18a7c34b) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4a7c1bc6-6161-4296-842b-a68c3b7974c5 / got description=e8e0af3a-1927-4a3a-8d15-291975433139) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4a7c1bc6-6161-4296-842b-a68c3b7974c5 / got description=e8e0af3a-1927-4a3a-8d15-291975433139) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4a7c1bc6-6161-4296-842b-a68c3b7974c5 / got description=e8e0af3a-1927-4a3a-8d15-291975433139) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4a7c1bc6-6161-4296-842b-a68c3b7974c5 / got description=e8e0af3a-1927-4a3a-8d15-291975433139) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4a7c1bc6-6161-4296-842b-a68c3b7974c5 / got description=e8e0af3a-1927-4a3a-8d15-291975433139) INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4a7c1bc6-6161-4296-842b-a68c3b7974c5 / got description=e8e0af3a-1927-4a3a-8d15-291975433139) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
Passed tickets/ticket48266_test.py::test_ticket48266_check_repl_desc 1.08
No log output captured.
Passed tickets/ticket48270_test.py::test_ticket48270_init 0.36
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48270_test:ticket48270_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48270_test.py::test_ticket48270_homeDirectory_indexed_cis 2.04
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48270_test:ticket48270_test.py:39 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match INFO  tests.tickets.ticket48270_test:ticket48270_test.py:57 successfully checked that the filter with exact mr succeeds while a filter with lowercase eq fails INFO  lib389:tasks.py:798 Index task index_attrs_10312020_000846 completed successfully INFO  tests.tickets.ticket48270_test:ticket48270_test.py:63 Check indexing succeeded with a specified matching rule
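For reference, the matching-rule index configured above can be reproduced with plain python-ldap. A minimal sketch, assuming the default userRoot backend and illustrative connection details (the test itself drives this through lib389):

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

# Equality index on homeDirectory with both IA5 matching rules, as in
# "index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match" above.
index_dn = "cn=homeDirectory,cn=index,cn=userRoot,cn=ldbm database,cn=plugins,cn=config"
conn.add_s(index_dn, [
    ("objectClass", [b"top", b"nsIndex"]),
    ("cn", [b"homeDirectory"]),
    ("nsSystemIndex", [b"false"]),
    ("nsIndexType", [b"eq"]),
    ("nsMatchingRule", [b"caseIgnoreIA5Match", b"caseExactIA5Match"]),
])
# A reindex task (the "Index task index_attrs_..." line above) must then
# run before searches can use the new matching-rule indexes.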
Passed tickets/ticket48270_test.py::test_ticket48270_homeDirectory_mixed_value 0.01
No log output captured.
Passed tickets/ticket48270_test.py::test_ticket48270_extensible_search 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48270_test:ticket48270_test.py:91 Default: can retrieve an entry filter syntax with exact stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:93 Default: can retrieve an entry filter caseExactIA5Match with exact stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:97 Default: can not retrieve an entry filter syntax match with lowered stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:103 Default: can not retrieve an entry filter caseExactIA5Match with lowered stored value INFO  tests.tickets.ticket48270_test:ticket48270_test.py:110 Default: can retrieve an entry filter caseIgnoreIA5Match with lowered stored value
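The five checks above exercise RFC 4515 extensible-match filters against the homeDirectory value. A minimal sketch of the filter shapes involved, with an illustrative stored value of /home/xyz:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

# Plain equality uses the attribute's default (case-exact IA5) matching:
f_syntax = "(homeDirectory=/home/xyz)"
# An extensible match pins an explicit matching rule: attr:rule:=value
f_exact = "(homeDirectory:caseExactIA5Match:=/home/xyz)"
f_ignore = "(homeDirectory:caseIgnoreIA5Match:=/HOME/XYZ)"

# f_syntax and f_exact only match the exact-case value; f_ignore also
# matches the case-folded value, as the log above records.
for f in (f_syntax, f_exact, f_ignore):
    print(f, len(conn.search_s("dc=example,dc=com", ldap.SCOPE_SUBTREE, f)))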
Passed tickets/ticket48272_test.py::test_ticket48272 10.15
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48272_test:ticket48272_test.py:129 Test PASSED
Passed tickets/ticket48294_test.py::test_48294_init 0.38
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Testing Ticket 48294 - Linked Attributes plug-in - won't update links after MODRDN operation INFO  lib389:ticket48294_test.py:33 ###############################################
Passed tickets/ticket48294_test.py::test_48294_run_0 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Case 0 - Rename employee1 and adjust the link type value by replace INFO  lib389:ticket48294_test.py:33 ############################################### INFO  lib389:ticket48294_test.py:59 ######################### MODRDN uid=employee2 ######################
Passed tickets/ticket48294_test.py::test_48294_run_1 0.05
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Case 1 - Rename employee2 and adjust the link type value by delete and add INFO  lib389:ticket48294_test.py:33 ############################################### INFO  lib389:ticket48294_test.py:59 ######################### MODRDN uid=employee3 ######################
Passed tickets/ticket48294_test.py::test_48294_run_2 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48294_test.py:31 ############################################### INFO  lib389:ticket48294_test.py:32 ####### Case 2 - Rename manager1 to manager2 and make sure the managed attribute value is updated INFO  lib389:ticket48294_test.py:33 ############################################### INFO  lib389:ticket48294_test.py:59 ######################### MODRDN uid=manager2 ######################
Passed tickets/ticket48295_test.py::test_48295_init 0.09
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48295_test.py:30 ############################################### INFO  lib389:ticket48295_test.py:31 ####### Testing Ticket 48295 - Entry cache is not rolled back -- Linked Attributes plug-in - wrong behaviour when adding valid and broken links INFO  lib389:ticket48295_test.py:32 ###############################################
Passed tickets/ticket48295_test.py::test_48295_run 0.25
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48295_test.py:30 ############################################### INFO  lib389:ticket48295_test.py:31 ####### Add 2 linktypes to manager1 - one exists, another does not to make sure the managed entry does not have managed type. INFO  lib389:ticket48295_test.py:32 ###############################################
Passed tickets/ticket48312_test.py::test_ticket48312 0.37
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48312_test:ticket48312_test.py:117 Test complete
Passed tickets/ticket48354_test.py::test_ticket48354 0.26
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48354_test:ticket48354_test.py:50 Test PASSED
Passed tickets/ticket48362_test.py::test_ticket48362 94.10
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 150a6d64-1b65-4c46-8bb7-e7591bedb96e / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 279a5b95-e5aa-456f-bbd2-e46bb2ff95a7 / got description=150a6d64-1b65-4c46-8bb7-e7591bedb96e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48362_test:ticket48362_test.py:28 Add dna plugin config entry...ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 INFO  tests.tickets.ticket48362_test:ticket48362_test.py:48 Enable the DNA plugin... INFO  tests.tickets.ticket48362_test:ticket48362_test.py:55 Restarting the server... INFO  tests.tickets.ticket48362_test:ticket48362_test.py:28 Add dna plugin config entry...ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  tests.tickets.ticket48362_test:ticket48362_test.py:48 Enable the DNA plugin... INFO  tests.tickets.ticket48362_test:ticket48362_test.py:55 Restarting the server... INFO  tests.tickets.ticket48362_test:ticket48362_test.py:83 ======================== Update dnaPortNum=39001 ============================ INFO  tests.tickets.ticket48362_test:ticket48362_test.py:90 ======================== Update done INFO  tests.tickets.ticket48362_test:ticket48362_test.py:83 ======================== Update dnaPortNum=39002 ============================ INFO  tests.tickets.ticket48362_test:ticket48362_test.py:90 ======================== Update done INFO  tests.tickets.ticket48362_test:ticket48362_test.py:132 ======================== BEFORE RESTART ============================ INFO  tests.tickets.ticket48362_test:ticket48362_test.py:135 ======================== BEFORE RESTART ============================ INFO  tests.tickets.ticket48362_test:ticket48362_test.py:141 ======================== BEFORE RESTART ============================ INFO  tests.tickets.ticket48362_test:ticket48362_test.py:150 =================== AFTER RESTART ================================= INFO  tests.tickets.ticket48362_test:ticket48362_test.py:153 =================== AFTER RESTART ================================= INFO  tests.tickets.ticket48362_test:ticket48362_test.py:159 =================== AFTER RESTART ================================= INFO  tests.tickets.ticket48362_test:ticket48362_test.py:162 Test complete
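The "Add dna plugin config entry" steps above create a Distributed Numeric Assignment configuration on each master. A minimal sketch of such an entry over python-ldap; all values here are illustrative, not the ticket's actual ranges:

import ldap

conn = ldap.initialize("ldap://localhost:39001")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

dna_dn = "cn=dna config,cn=Distributed Numeric Assignment Plugin,cn=plugins,cn=config"
conn.add_s(dna_dn, [
    ("objectClass", [b"top", b"extensibleObject"]),
    ("cn", [b"dna config"]),
    ("dnaType", [b"uidNumber"]),                      # attribute to auto-assign
    ("dnaFilter", [b"(objectclass=posixAccount)"]),
    ("dnaScope", [b"dc=example,dc=com"]),
    ("dnaNextValue", [b"1000"]),
    ("dnaMaxValue", [b"1100"]),
    ("dnaSharedCfgDN", [b"ou=ranges,dc=example,dc=com"]),
])
# The dnaPortNum updates logged above live on the per-replica shared
# range entries below dnaSharedCfgDN, which is how replicas advertise
# where spare ranges can be fetched from.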
Passed tickets/ticket48366_test.py::test_ticket48366_init 0.54
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48366_test.py:44 Add subtree: ou=green,dc=example,dc=com INFO  lib389:ticket48366_test.py:48 Add subtree: ou=red,dc=example,dc=com INFO  lib389:ticket48366_test.py:54 Add cn=test,ou=people,dc=example,dc=com INFO  lib389:ticket48366_test.py:60 Add cn=proxy,ou=people,dc=example,dc=com INFO  lib389.utils:ticket48366_test.py:90 Adding %d test entries...
Passed tickets/ticket48366_test.py::test_ticket48366_search_user 0.10
No log output captured.
Passed tickets/ticket48366_test.py::test_ticket48366_search_dm 0.04
No log output captured.
Passed tickets/ticket48370_test.py::test_ticket48370 0.41
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48370_test:ticket48370_test.py:187 Test PASSED
Passed tickets/ticket48383_test.py::test_ticket48383 54.10
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
ldiffile: /var/lib/dirsrv/slapd-standalone1/ldif/standalone1.ldif
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket48383_test:ticket48383_test.py:62 Failed to change nsslapd-cachememsize No such object INFO  tests.tickets.ticket48383_test:ticket48383_test.py:88 Test complete
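The CRITICAL line above comes from the test tolerating a failed nsslapd-cachememsize update. The update itself is a plain modify of the backend entry; a minimal sketch, assuming the default userRoot backend:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

be_dn = "cn=userRoot,cn=ldbm database,cn=plugins,cn=config"
try:
    conn.modify_s(be_dn, [(ldap.MOD_REPLACE, "nsslapd-cachememsize", [b"512000"])])
except ldap.NO_SUCH_OBJECT:
    # Mirrors the behaviour captured above: log the failure and move on.
    print("Failed to change nsslapd-cachememsize No such object")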
Passed tickets/ticket48497_test.py::test_ticket48497_init 0.67
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48497_test.py::test_ticket48497_homeDirectory_mixed_value 0.01
No log output captured.
Passed tickets/ticket48497_test.py::test_ticket48497_extensible_search 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:49 Default: can retrieve an entry filter syntax with exact stored value INFO  tests.tickets.ticket48497_test:ticket48497_test.py:51 Default: can retrieve an entry filter caseExactIA5Match with exact stored value INFO  tests.tickets.ticket48497_test:ticket48497_test.py:55 Default: can not retrieve an entry filter syntax match with lowered stored value INFO  tests.tickets.ticket48497_test:ticket48497_test.py:61 Default: can not retrieve an entry filter caseExactIA5Match with lowered stored value INFO  tests.tickets.ticket48497_test:ticket48497_test.py:68 Default: can retrieve an entry filter caseIgnoreIA5Match with lowered stored value
Passed tickets/ticket48497_test.py::test_ticket48497_homeDirectory_index_cfg 0.02
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48497_test:ticket48497_test.py:73 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match
Passed tickets/ticket48497_test.py::test_ticket48497_homeDirectory_index_run 2.03
-------------------------------Captured log call--------------------------------
INFO  lib389:tasks.py:798 Index task index_attrs_10312020_001726 completed successfully INFO  tests.tickets.ticket48497_test:ticket48497_test.py:93 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48665_test.py::test_ticket48665 0.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48665_test:ticket48665_test.py:40 8 entries are returned from the server. CRITICAL tests.tickets.ticket48665_test:ticket48665_test.py:47 Failed to change nsslapd-cachememsize No such object INFO  tests.tickets.ticket48665_test:ticket48665_test.py:52 8 entries are returned from the server. INFO  tests.tickets.ticket48665_test:ticket48665_test.py:63 8 entries are returned from the server. INFO  tests.tickets.ticket48665_test:ticket48665_test.py:65 Test complete
Passed tickets/ticket48745_test.py::test_ticket48745_init 0.66
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48745_test.py::test_ticket48745_homeDirectory_indexed_cis 2.04
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:39 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match INFO  tests.tickets.ticket48745_test:ticket48745_test.py:57 successfully checked that the filter with exact mr succeeds while a filter with lowercase eq fails INFO  lib389:tasks.py:798 Index task index_attrs_10312020_001812 completed successfully INFO  tests.tickets.ticket48745_test:ticket48745_test.py:63 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48745_test.py::test_ticket48745_homeDirectory_mixed_value 0.01
No log output captured.
Passed tickets/ticket48745_test.py::test_ticket48745_extensible_search_after_index 0.01
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48745_test:ticket48745_test.py:91 Default: can retrieve an entry filter syntax with exact stored value INFO  tests.tickets.ticket48745_test:ticket48745_test.py:99 Default: can retrieve an entry filter caseExactIA5Match with exact stored value INFO  tests.tickets.ticket48745_test:ticket48745_test.py:106 Default: can not retrieve an entry filter syntax match with lowered stored value INFO  tests.tickets.ticket48745_test:ticket48745_test.py:112 Default: can not retrieve an entry filter caseExactIA5Match with lowered stored value INFO  tests.tickets.ticket48745_test:ticket48745_test.py:119 Default: can retrieve an entry filter caseIgnoreIA5Match with lowered stored value
Passed tickets/ticket48746_test.py::test_ticket48746_init 0.69
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:26 Initialization: add dummy entries for the tests
Passed tickets/ticket48746_test.py::test_ticket48746_homeDirectory_indexed_cis 2.05
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:39 index homeDirectory in caseIgnoreIA5Match and caseExactIA5Match INFO  tests.tickets.ticket48746_test:ticket48746_test.py:57 successfully checked that the filter with exact mr succeeds while a filter with lowercase eq fails INFO  lib389:tasks.py:798 Index task index_attrs_10312020_001827 completed successfully INFO  tests.tickets.ticket48746_test:ticket48746_test.py:63 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48746_test.py::test_ticket48746_homeDirectory_mixed_value 0.05
No log output captured.
Passed tickets/ticket48746_test.py::test_ticket48746_extensible_search_after_index 0.00
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:99 Default: can retrieve an entry filter caseExactIA5Match with exact stored value
Passed tickets/ticket48746_test.py::test_ticket48746_homeDirectory_indexed_ces 2.03
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48746_test:ticket48746_test.py:104 index homeDirectory in caseExactIA5Match, this would trigger the crash INFO  tests.tickets.ticket48746_test:ticket48746_test.py:121 successfully checked that the filter with exact mr succeeds while a filter with lowercase eq fails INFO  lib389:tasks.py:798 Index task index_attrs_10312020_001830 completed successfully INFO  tests.tickets.ticket48746_test:ticket48746_test.py:127 Check indexing succeeded with a specified matching rule
Passed tickets/ticket48759_test.py::test_ticket48759 21.06
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48759_test.py:66 !!!!!!! uid=member2,dc=example,dc=com: memberof->b'cn=group,dc=example,dc=com' INFO  lib389:ticket48759_test.py:66 !!!!!!! uid=member2,dc=example,dc=com: memberof->b'cn=group,dc=example,dc=com' INFO  lib389:ticket48759_test.py:66 !!!!!!! uid=member2,dc=example,dc=com: memberof->b'cn=group,dc=example,dc=com'
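The repeated "!!!!!!!" lines show the test reading back the memberOf attribute that the MemberOf plugin maintains on uid=member2 across the ticket's operations. A minimal sketch of that readback, with illustrative connection details:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

dn, attrs = conn.search_s("uid=member2,dc=example,dc=com", ldap.SCOPE_BASE,
                          "(objectclass=*)", ["memberof"])[0]
# Attribute key case follows the server's schema (typically "memberOf");
# the value should be b"cn=group,dc=example,dc=com" as logged above.
print(attrs)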
Passed tickets/ticket48799_test.py::test_ticket48799 15.35
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect c8f8f3ba-1998-4d62-ac23-be26bd9e66d3 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48799_test:ticket48799_test.py:80 Test complete
Passed tickets/ticket48844_test.py::test_ticket48844_init 0.95
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:backend.py:80 List backend with suffix=dc=bitwise,dc=com INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=TestBitw,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=TestBitw,cn=ldbm database,cn=plugins,cn=config cn: TestBitw nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/TestBitw nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=bitwise,dc=com objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=bitwise,dc=com",cn=mapping tree,cn=config cn: dc=bitwise,dc=com nsslapd-backend: TestBitw nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=dc\3Dbitwise\2Cdc\3Dcom,cn=mapping tree,cn=config cn: dc=bitwise,dc=com nsslapd-backend: TestBitw nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree
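The backend and mapping-tree entries dumped above are produced by lib389's backend API. A minimal sketch of the equivalent call, written as a helper that takes an already-connected DirSrv object (the instance setup itself is omitted):

from lib389.backend import Backends

def create_test_backend(instance):
    # instance is assumed to be a connected lib389 DirSrv object.
    bes = Backends(instance)
    return bes.create(properties={
        "cn": "TestBitw",
        "nsslapd-suffix": "dc=bitwise,dc=com",
    })
    # lib389 creates both the nsBackendInstance entry under
    # cn=ldbm database,cn=plugins,cn=config and the matching
    # cn="dc=bitwise,dc=com",cn=mapping tree,cn=config entry,
    # exactly the pair shown in the captured log.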
Passed tickets/ticket48844_test.py::test_ticket48844_bitwise_on 3.61
No log output captured.
Passed tickets/ticket48844_test.py::test_ticket48844_bitwise_off 4.72
No log output captured.
Passed tickets/ticket48891_test.py::test_ticket48891_setup 2.34
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48891_test.py:43 Bind as cn=Directory Manager INFO  lib389:ticket48891_test.py:52 ######################### SETUP SUFFIX o=ticket48891.org ###################### INFO  lib389:backend.py:80 List backend with suffix=dc=ticket48891.org INFO  lib389:backend.py:290 Creating a local backend INFO  lib389:backend.py:76 List backend cn=ticket48891,cn=ldbm database,cn=plugins,cn=config INFO  lib389:__init__.py:1713 Found entry dn: cn=ticket48891,cn=ldbm database,cn=plugins,cn=config cn: ticket48891 nsslapd-cachememsize: 512000 nsslapd-cachesize: -1 nsslapd-directory: /var/lib/dirsrv/slapd-standalone1/db/ticket48891 nsslapd-dncachememsize: 16777216 nsslapd-readonly: off nsslapd-require-index: off nsslapd-require-internalop-index: off nsslapd-suffix: dc=ticket48891.org objectClass: top objectClass: extensibleObject objectClass: nsBackendInstance INFO  lib389:mappingTree.py:154 Entry dn: cn="dc=ticket48891.org",cn=mapping tree,cn=config cn: dc=ticket48891.org nsslapd-backend: ticket48891 nsslapd-state: backend objectclass: top objectclass: extensibleObject objectclass: nsMappingTree INFO  lib389:__init__.py:1713 Found entry dn: cn=dc\3Dticket48891.org,cn=mapping tree,cn=config cn: dc=ticket48891.org nsslapd-backend: ticket48891 nsslapd-state: backend objectClass: top objectClass: extensibleObject objectClass: nsMappingTree INFO  lib389:ticket48891_test.py:61 ######################### Generate Test data ###################### INFO  lib389:ticket48891_test.py:77 ######################### SEARCH ALL ###################### INFO  lib389:ticket48891_test.py:78 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci INFO  lib389:ticket48891_test.py:82 Returned 10 entries. INFO  lib389:ticket48891_test.py:86 10 person entries are successfully created under dc=ticket48891.org.
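The "READ/SEARCH SELFDN aci" mentioned above is added as an aci value on the new suffix entry. The ticket's exact ACI is not captured in the log; an illustrative ACI of the same read/search shape would be:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

# Illustrative only -- ticket 48891 uses a SELFDN-style bind rule,
# whose exact text is not in the captured output.
aci = (b'(targetattr="*")(version 3.0; acl "read-search"; '
       b'allow (read, search, compare) userdn="ldap:///self";)')
conn.modify_s("dc=ticket48891.org", [(ldap.MOD_ADD, "aci", [aci])])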
Passed tickets/ticket48893_test.py::test_ticket48893 0.00
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48893_test:ticket48893_test.py:46 Test PASSED
Passed tickets/ticket48906_test.py::test_ticket48906_setup 0.16
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:63 Bind as cn=Directory Manager INFO  lib389:ticket48906_test.py:83 ######################### SEARCH ALL ###################### INFO  lib389:ticket48906_test.py:84 Bind as cn=Directory Manager and add the READ/SEARCH SELFDN aci INFO  lib389:ticket48906_test.py:88 Returned 10 entries. INFO  lib389:ticket48906_test.py:92 10 person entries are successfully created under dc=example,dc=com.
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_default 0.02
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:149 ################################### INFO  lib389:ticket48906_test.py:150 ### INFO  lib389:ticket48906_test.py:151 ### Check that before any change config/monitor INFO  lib389:ticket48906_test.py:152 ### contains the default value INFO  lib389:ticket48906_test.py:153 ### INFO  lib389:ticket48906_test.py:154 ###################################
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_ldap_update 3.53
------------------------------Captured stdout call------------------------------
line locks:10000 expected_value 10000 value 10000
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:160 ################################### INFO  lib389:ticket48906_test.py:161 ### INFO  lib389:ticket48906_test.py:162 ### Check that after ldap update INFO  lib389:ticket48906_test.py:163 ### - monitor contains DEFAULT INFO  lib389:ticket48906_test.py:164 ### - configured contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:165 ### - After stop dse.ldif contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:166 ### - After stop guardian contains DEFAULT INFO  lib389:ticket48906_test.py:167 ### In fact guardian should differ from config to recreate the env INFO  lib389:ticket48906_test.py:168 ### Check that after restart (DBenv recreated) INFO  lib389:ticket48906_test.py:169 ### - monitor contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:170 ### - configured contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:171 ### - dse.ldif contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:172 ### INFO  lib389:ticket48906_test.py:173 ###################################
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_edit_update 6.45
------------------------------Captured stdout call------------------------------
line locks:20000 expected_value 20000 value 20000 line locks:40000 expected_value 40000 value 40000
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:191 ################################### INFO  lib389:ticket48906_test.py:192 ### INFO  lib389:ticket48906_test.py:193 ### Check that after stop INFO  lib389:ticket48906_test.py:194 ### - dse.ldif contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:195 ### - guardian contains DBLOCK_LDAP_UPDATE INFO  lib389:ticket48906_test.py:196 ### Check that edit dse+restart INFO  lib389:ticket48906_test.py:197 ### - monitor contains DBLOCK_EDIT_UPDATE INFO  lib389:ticket48906_test.py:198 ### - configured contains DBLOCK_EDIT_UPDATE INFO  lib389:ticket48906_test.py:199 ### Check that after stop INFO  lib389:ticket48906_test.py:200 ### - dse.ldif contains DBLOCK_EDIT_UPDATE INFO  lib389:ticket48906_test.py:201 ### - guardian contains DBLOCK_EDIT_UPDATE INFO  lib389:ticket48906_test.py:202 ### INFO  lib389:ticket48906_test.py:203 ###################################
Passed tickets/ticket48906_test.py::test_ticket48906_dblock_robust 5.53
------------------------------Captured stdout call------------------------------
line locks:40000 expected_value 40000 value 40000
-------------------------------Captured log call--------------------------------
INFO  lib389:ticket48906_test.py:245 ################################### INFO  lib389:ticket48906_test.py:246 ### INFO  lib389:ticket48906_test.py:247 ### Check that the following values are rejected INFO  lib389:ticket48906_test.py:248 ### - negative value INFO  lib389:ticket48906_test.py:249 ### - insufficient value INFO  lib389:ticket48906_test.py:250 ### - invalid value INFO  lib389:ticket48906_test.py:251 ### Check that minimum value is accepted INFO  lib389:ticket48906_test.py:252 ### INFO  lib389:ticket48906_test.py:253 ###################################
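The dblock tests above compare the configured lock count with what the database monitor reports, before and after the DB environment is recreated. A minimal sketch of reading both over LDAP; the monitor attribute name is an assumption based on the BDB backend's monitor entry:

import ldap

conn = ldap.initialize("ldap://localhost:38901")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

cfg_dn = "cn=config,cn=ldbm database,cn=plugins,cn=config"
mon_dn = "cn=database,cn=monitor,cn=ldbm database,cn=plugins,cn=config"

configured = conn.search_s(cfg_dn, ldap.SCOPE_BASE, attrlist=["nsslapd-db-locks"])[0][1]
monitored = conn.search_s(mon_dn, ldap.SCOPE_BASE,
                          attrlist=["nsslapd-db-configured-locks"])[0][1]  # assumed name
# A changed nsslapd-db-locks only shows up in the monitor after the DB
# environment is recreated (i.e. after a restart), which is what the
# tests above assert via dse.ldif and the guardian file.
print(configured, monitored)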
Passed tickets/ticket48944_test.py::test_ticket48944 107.25
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39202, 'ldap-secureport': 63902, 'server-id': 'consumer2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect a2d90bb6-79b9-4b5e-912f-0dfca5eab18a / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 21f01e4f-94a1-477f-924c-87385431d5f8 / got description=a2d90bb6-79b9-4b5e-912f-0dfca5eab18a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... 
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:169 Joining consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 05ffe075-9364-4edc-889e-05e90be5b7d7 / got description=21f01e4f-94a1-477f-924c-87385431d5f8) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 INFO  lib389.topologies:topologies.py:169 Joining consumer consumer2 from master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is NOT working (expect a2a97215-e28e-433a-84b6-fed0d9596c91 / got description=05ffe075-9364-4edc-889e-05e90be5b7d7) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 is working INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 already exists INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer2 from master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 already exists INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer1 from master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created INFO  lib389.topologies:topologies.py:174 Ensuring consumer consumer2 from master2 ... 
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39202 was created
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket48944_test:ticket48944_test.py:108 Ticket 48944 - On a read only replica invalid state info can accumulate INFO  tests.tickets.ticket48944_test:ticket48944_test.py:52 Enable account policy plugin and configure required attributes INFO  tests.tickets.ticket48944_test:ticket48944_test.py:54 Configure Account policy plugin on master1 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:52 Enable account policy plugin and configure required attributes INFO  tests.tickets.ticket48944_test:ticket48944_test.py:54 Configure Account policy plugin on master2 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:52 Enable account policy plugin and configure required attributes INFO  tests.tickets.ticket48944_test:ticket48944_test.py:69 Configure Account policy plugin on consumer1 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:52 Enable account policy plugin and configure required attributes INFO  tests.tickets.ticket48944_test:ticket48944_test.py:69 Configure Account policy plugin on consumer2 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:115 Sleep for 10secs for the server to come up INFO  tests.tickets.ticket48944_test:ticket48944_test.py:117 Add few entries to server and check if entries are replicated INFO  tests.tickets.ticket48944_test:ticket48944_test.py:132 Checking if entries are synced across masters and consumers INFO  tests.tickets.ticket48944_test:ticket48944_test.py:152 Start master2 to sync lastLoginTime attribute from master1 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:155 Stop master1 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:157 Bind as user1 to master2 and check if lastLoginTime attribute is greater than master1 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:161 Start all servers except master1 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:167 Check if consumers are updated with lastLoginTime attribute value from master2 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:174 Check if lastLoginTime update in consumers not synced to master2 INFO  tests.tickets.ticket48944_test:ticket48944_test.py:185 Start master1 and check if its updating its older lastLoginTime attribute to consumers INFO  tests.tickets.ticket48944_test:ticket48944_test.py:194 Check if lastLoginTime update from master2 is synced to all masters and consumers INFO  tests.tickets.ticket48944_test:ticket48944_test.py:202 Checking consumer error logs for replica invalid state info
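The Account Policy plugin configuration walked through above follows the plugin's standard attribute set. A minimal sketch of the config entry the masters would carry (all values illustrative):

import ldap

conn = ldap.initialize("ldap://localhost:39001")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

cfg_dn = "cn=config,cn=Account Policy Plugin,cn=plugins,cn=config"
conn.add_s(cfg_dn, [
    ("objectClass", [b"top", b"extensibleObject"]),
    ("cn", [b"config"]),
    ("alwaysrecordlogin", [b"yes"]),            # stamp lastLoginTime on each bind
    ("stateattrname", [b"lastLoginTime"]),
    ("altstateattrname", [b"createTimestamp"]),
    ("specattrname", [b"acctPolicySubentry"]),
    ("limitattrname", [b"accountInactivityLimit"]),
])
# Ticket 48944's point: on read-only consumers these lastLoginTime
# stamps must stay local and must not accumulate invalid state info.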
Passed tickets/ticket49008_test.py::test_ticket49008 51.40
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 52dc7f57-0681-4b43-979d-0aef0fc5fa3a / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect e4e96d62-9776-4eb0-ac59-27db276ef4c0 / got description=52dc7f57-0681-4b43-979d-0aef0fc5fa3a) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect f8df5ddf-0298-496c-903b-0eecdb24795e / got description=e4e96d62-9776-4eb0-ac59-27db276ef4c0) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect d54de5ce-fe18-4405-8937-1a69b10593e5 / got description=f8df5ddf-0298-496c-903b-0eecdb24795e) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49008_test:ticket49008_test.py:115 ruv before fail: b'{replica 2 ldap://localhost.localdomain:39002} 5f9ce807000000020000 5f9ce831000000020000' INFO  tests.tickets.ticket49008_test:ticket49008_test.py:116 ruv after fail: b'{replica 2 ldap://localhost.localdomain:39002} 5f9ce807000000020000 5f9ce831000000020000'
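The "ruv before fail / ruv after fail" values are read from the replica's RUV tombstone entry. A minimal sketch of that read, assuming the dc=example,dc=com suffix and illustrative connection details:

import ldap

conn = ldap.initialize("ldap://localhost:39001")
conn.simple_bind_s("cn=Directory Manager", "password")  # hypothetical credentials

# The RUV is stored on a special tombstone entry under the suffix.
filt = "(&(nsuniqueid=ffffffff-ffffffff-ffffffff-ffffffff)(objectclass=nstombstone))"
dn, attrs = conn.search_s("dc=example,dc=com", ldap.SCOPE_SUBTREE, filt, ["nsds50ruv"])[0]
for element in attrs.get("nsds50ruv", []):   # key case may vary by server
    print(element)  # e.g. b'{replica 2 ldap://...:39002} <min csn> <max csn>'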
Passed tickets/ticket49020_test.py::test_ticket49020 34.50
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master1 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master2 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.SetupDs:setup.py:658 Starting installation... INFO  lib389.SetupDs:setup.py:686 Completed installation for master3 INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created. INFO  lib389.topologies:topologies.py:142 Creating replication topology. INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ... INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 27de8877-fe95-4bb0-9095-94e415a4c780 / got description=None) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 8aa66d20-0d7f-4269-9e57-cbe03d259953 / got description=27de8877-fe95-4bb0-9095-94e415a4c780) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ... 
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect df22611b-1513-4286-b8b9-bcb000c4fe24 / got description=82fb15cb-c3cf-4cbc-9504-60d6a9f1b483) INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ... INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ... INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1169 Starting total init cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
Passed tickets/ticket49076_test.py::test_ticket49076 9.70
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49095_test.py::test_ticket49095 0.34
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49095_test:ticket49095_test.py:79 Test Passed
Passed tickets/ticket49104_test.py::test_ticket49104 0.34
------------------------------Captured stderr call------------------------------
valgrind: /usr/bin/dbscan-bin: No such file or directory
grep: /tmp/val49104.out: No such file or directory
[... the valgrind/grep pair above repeats, identically, 10 times in total ...]
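The ten identical valgrind/grep pairs come from a leak-check loop that shells out once per attempt; a minimal sketch of that shape (the loop structure, valgrind options, and grep pattern here are illustrative assumptions, not the test's actual code):

import subprocess

def dbscan_leak_check(runs=10, outfile='/tmp/val49104.out'):
    # Each iteration re-runs the binary under valgrind and greps the
    # capture. On this CI host /usr/bin/dbscan-bin is absent, which is
    # exactly the "No such file or directory" noise above.
    for _ in range(runs):
        subprocess.run('valgrind /usr/bin/dbscan-bin > %s 2>&1' % outfile,
                       shell=True)
        # grep exits 0 only if the pattern was found in the capture.
        found = subprocess.run(['grep', 'definitely lost', outfile],
                               capture_output=True)
        if found.returncode == 0:
            return True
    return False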
Passed tickets/ticket49122_test.py::test_ticket49122 14.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: nsrole=cn=empty,dc=example,dc=com
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (nsrole=cn=empty,dc=example,dc=com)
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(nsrole=cn=empty,dc=example,dc=com))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (!(nsrole=cn=empty,dc=example,dc=com))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(objectclass=person)(sn=app*))(userpassword=*))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(objectclass=person)(nsrole=cn=empty,dc=example,dc=com))(userpassword=*))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(nsrole=cn=empty,dc=example,dc=com)(sn=app*))(userpassword=*))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(objectclass=person)(sn=app*))(nsrole=cn=empty,dc=example,dc=com))
INFO  tests.tickets.ticket49122_test:ticket49122_test.py:57 Testing filter: (&(|(&(cn=*)(objectclass=person)(nsrole=cn=empty,dc=example,dc=com)))(uid=*))
INFO  lib389:ticket49122_test.py:86 Test Passed
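Each "Testing filter:" record above corresponds to one subtree search; the point of the ticket is that nsrole can sit anywhere inside a compound filter without breaking filter evaluation. A sketch of that sweep (filter strings copied from the log, the loop itself is an assumption):

import ldap
from lib389._constants import DEFAULT_SUFFIX

NSROLE_FILTERS = [
    '(nsrole=cn=empty,dc=example,dc=com)',
    '(!(nsrole=cn=empty,dc=example,dc=com))',
    '(&(|(objectclass=person)(nsrole=cn=empty,dc=example,dc=com))(userpassword=*))',
    '(&(|(&(cn=*)(objectclass=person)(nsrole=cn=empty,dc=example,dc=com)))(uid=*))',
]

def sweep_nsrole_filters(inst):
    # Each filter must be evaluated cleanly even though the role is empty;
    # what matters is that the search completes, not which entries return.
    for filt in NSROLE_FILTERS:
        inst.search_s(DEFAULT_SUFFIX, ldap.SCOPE_SUBTREE, filt)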
Passed tickets/ticket49180_test.py::test_ticket49180 50.61
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master3
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master4
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39004, 'ldap-secureport': 63704, 'server-id': 'master4', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect 4e943dc7-5c01-46d3-815f-c49639f2fa92 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 73a38aa9-1da6-4a23-9611-b34840fb9fe3 / got description=4e943dc7-5c01-46d3-815f-c49639f2fa92)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 3148a58f-f80a-407b-a2fa-8d8c6fa0d07f / got description=73a38aa9-1da6-4a23-9611-b34840fb9fe3)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 3cfd569e-f1ea-4dbc-aefd-c7c82c29dd57 / got description=3148a58f-f80a-407b-a2fa-8d8c6fa0d07f)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  lib389.topologies:topologies.py:156 Joining master master4 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect 6b7e9eb3-66a2-4968-b912-fbaca6fac6c7 / got description=3cfd569e-f1ea-4dbc-aefd-c7c82c29dd57)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 0da62e19-efb7-4bdb-9a73-d3b3e870b388 / got description=6b7e9eb3-66a2-4968-b912-fbaca6fac6c7)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master4 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master4 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master4 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master4 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:78 Running test_ticket49180...
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:80 Check that replication works properly on all masters
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:95 test_clean: disable master 4...
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:30 test_clean: remove all the agreements to master 4...
INFO  lib389:agreement.py:1095 Agreement (cn=004,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed
INFO  lib389:agreement.py:1095 Agreement (cn=004,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed
INFO  lib389:agreement.py:1095 Agreement (cn=004,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config) was successfully removed
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:46 Restoring master 4...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is NOT working (expect c48624b9-7d67-4873-8ad4-fea5d784b635 / got description=0da62e19-efb7-4bdb-9a73-d3b3e870b388)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 33dce9b1-68cc-4067-b91a-ac74281637d8 / got description=c48624b9-7d67-4873-8ad4-fea5d784b635)
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 33dce9b1-68cc-4067-b91a-ac74281637d8 / got description=c48624b9-7d67-4873-8ad4-fea5d784b635)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39004 already exists
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:59 Replication is working m1 -> m2.
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:59 Replication is working m1 -> m3.
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:59 Replication is working m1 -> m4.
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:67 Replication is working m4 -> m1.
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:73 Master 4 has been successfully restored.
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:106 Errors found on m1: 0
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:111 Errors found on m2: 0
INFO  tests.tickets.ticket49180_test:ticket49180_test.py:116 Errors found on m3: 0
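The call log above is the disable/restore cycle for master 4: every agreement pointing at it is removed, it is bootstrapped back in, and replication is re-verified in both directions. A rough sketch built from the lib389 calls whose messages appear in the log (join_master, test_replication); the agreement lookup via Replicas/get_agreements is an assumption:

from lib389.replica import Replicas, ReplicationManager
from lib389._constants import DEFAULT_SUFFIX

def disable_and_restore_m4(topo):
    repl = ReplicationManager(DEFAULT_SUFFIX)
    m1, m4 = topo.ms['master1'], topo.ms['master4']
    # "remove all the agreements to master 4": on each remaining master,
    # delete any agreement whose target port is master4's.
    for inst in (topo.ms['master1'], topo.ms['master2'], topo.ms['master3']):
        replica = Replicas(inst).get(DEFAULT_SUFFIX)
        for agmt in replica.get_agreements().list():
            if agmt.get_attr_val_utf8('nsDS5ReplicaPort') == str(m4.port):
                agmt.delete()
    # "Restoring master 4": bootstrap it back in, then re-run the same
    # description-propagation probe the setup logs show.
    repl.join_master(m1, m4)
    repl.test_replication(m1, m4)
    repl.test_replication(m4, m1)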
Passed tickets/ticket49184_test.py::test_ticket49184 5.29
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49184_test:ticket49184_test.py:89 create users and group...
INFO  tests.tickets.ticket49184_test:ticket49184_test.py:39 Adding members to the group...
INFO  tests.tickets.ticket49184_test:ticket49184_test.py:39 Adding members to the group...
Passed tickets/ticket49227_test.py::test_ticket49227 26.71
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49249_test.py::test_ticket49249 0.14
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49273_test.py::test_49273_corrupt_dbversion 3.68
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
Passed tickets/ticket49290_test.py::test_49290_range_unindexed_notes 4.63
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
------------------------------Captured stderr call------------------------------
[31/Oct/2020:00:39:14.579970204 -0400] - INFO - ldbm_instance_config_cachememsize_set - force a minimal value 512000
[31/Oct/2020:00:39:14.593815919 -0400] - INFO - bdb_instance_start - Import is running with nsslapd-db-private-import-mem on; No other process is allowed to access the database
[31/Oct/2020:00:39:14.604909105 -0400] - INFO - check_and_set_import_cache - pagesize: 4096, available bytes 7466344448, process usage 22937600
[31/Oct/2020:00:39:14.612908686 -0400] - INFO - check_and_set_import_cache - Import allocates 2916540KB import cache.
[31/Oct/2020:00:39:14.953178522 -0400] - INFO - bdb_db2index - userRoot: Indexing attribute: modifytimestamp
[31/Oct/2020:00:39:14.968270843 -0400] - ERR - libdb - BDB1566 txn_checkpoint interface requires an environment configured for the transaction subsystem
[31/Oct/2020:00:39:14.972312919 -0400] - ERR - bdb_force_checkpoint - Checkpoint FAILED, error Invalid argument (22)
[31/Oct/2020:00:39:14.982550924 -0400] - INFO - bdb_db2index - userRoot: Finished indexing.
[31/Oct/2020:00:39:15.012307564 -0400] - INFO - bdb_pre_close - All database threads now stopped
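This stderr is an offline reindex: with the server stopped, db2index rebuilds the modifytimestamp index, and the BDB1566/checkpoint errors are benign because no transaction environment is open in that mode. Schematically (the helper name and signature are paraphrased from lib389, not verified):

def reindex_modifytimestamp(inst):
    # Offline reindex: stop the server, rebuild one index, start again.
    inst.stop()
    # db2index here stands in for lib389's offline reindex wrapper; treat
    # the exact name and arguments as an assumption.
    inst.db2index(bename='userRoot', attrs=['modifytimestamp'])
    inst.start()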
Passed tickets/ticket49386_test.py::test_ticket49386 32.92
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_0,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_1,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_2,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_3,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_4,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_5,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_6,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_7,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_8,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49386_test:ticket49386_test.py:28 Adding user (cn=user_9,ou=people,dc=example,dc=com):
INFO  lib389:ticket49386_test.py:65 !!!!!!! cn=user_1,ou=people,dc=example,dc=com: memberof->b'cn=group_1,ou=groups,dc=example,dc=com'
INFO  lib389:ticket49386_test.py:66 !!!!!!! b'cn=group_1,ou=groups,dc=example,dc=com'
INFO  lib389:ticket49386_test.py:67 !!!!!!! cn=group_1,ou=groups,dc=example,dc=com
INFO  tests.tickets.ticket49386_test:ticket49386_test.py:130 memberof log found: [31/Oct/2020:00:40:30.460929687 -0400] - DEBUG - memberof-plugin - memberof_postop_modrdn: Skip modrdn operation because src/dst identical cn=group_1,ou=groups,dc=example,dc=com
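The final record is the assertion of this test: after a modrdn whose source and destination DNs are identical, the memberof plugin must log that it skipped the operation. A sketch of that check (ds_error_log.match is assumed here to be lib389's errors-log matcher):

def assert_modrdn_skipped(inst):
    # Restart to flush buffered DEBUG output, then grep the errors log for
    # the skip message quoted in the last record above.
    inst.restart()
    assert inst.ds_error_log.match(
        '.*Skip modrdn operation because src/dst identical.*')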
Passed tickets/ticket49441_test.py::test_ticket49441 8.13
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:35 Position ldif files, and add indexes...
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:52 Import LDIF with large indexed binary attributes...
ERROR  lib389:tasks.py:495 Error: import task import_10312020_004116 for file /var/lib/dirsrv/slapd-standalone1/ldifbinary.ldif exited with -23
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:61 Verify server is still running...
INFO  tests.tickets.ticket49441_test:ticket49441_test.py:68 Test PASSED
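The sequence here is deliberate: the import task is allowed to fail on the oversized binary index values (it exited -23 above); the pass criterion is only that ns-slapd survives the attempt. Sketch (task API names follow lib389 as understood; treat the signatures as assumptions):

from lib389.tasks import Tasks
from lib389._constants import DEFAULT_SUFFIX, TASK_WAIT

def import_large_binary_ldif(inst, ldif_path):
    # Online import of the binary-heavy LDIF; a non-zero task exit is
    # tolerated here.
    Tasks(inst).importLDIF(suffix=DEFAULT_SUFFIX, input_file=ldif_path,
                           args={TASK_WAIT: True})
    # The real assertion: the server must still be up afterwards.
    assert inst.status()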
Passed tickets/ticket49460_test.py::test_ticket_49460 15.32
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master3
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect e10c1f9d-6293-4908-8874-35d89e471127 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 48ad9f64-4ce7-424b-b592-06ccdc2db5bb / got description=e10c1f9d-6293-4908-8874-35d89e471127)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect 848fa626-0fa9-4e29-b90e-d503ca93e122 / got description=48ad9f64-4ce7-424b-b592-06ccdc2db5bb)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 765bdf43-e6ec-46c6-872d-85c6ed94a436 / got description=848fa626-0fa9-4e29-b90e-d503ca93e122)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49460_test:ticket49460_test.py:26 Adding user (cn=user11,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49460_test:ticket49460_test.py:26 Adding user (cn=user21,ou=people,dc=example,dc=com):
CRITICAL tests.tickets.ticket49460_test:ticket49460_test.py:26 Adding user (cn=user31,ou=people,dc=example,dc=com):
Passed tickets/ticket49471_test.py::test_ticket49471 2.05
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49471_test:ticket49471_test.py:28 Adding user (cn=user_1,ou=people,dc=example,dc=com):
Passed tickets/ticket49540_test.py::test_ticket49540 21.44
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:28 index homeDirectory
INFO  lib389:tasks.py:798 Index task index_attrs_10312020_004916 completed successfully
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 0 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:71 cn=index_attrs_10312020_004916,cn=index,cn=tasks,cn=config ---> NO STATUS
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:83 =========> Great it was expected in the middle of index
[... the three check_task_status records above repeat identically for loops 1 through 6 ...]
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:62 check_task_status =========> 7 th loop
INFO  tests.tickets.ticket49540_test:ticket49540_test.py:68 cn=index_attrs_10312020_004916,cn=index,cn=tasks,cn=config ---> b'userRoot: Finished indexing.'
[... the same two records repeat for loops 8 and 9 ...]
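The loop in this log polls the task entry once per second: while the index build is in flight the entry has no nsTaskStatus ("NO STATUS" above), and on completion the status reads "userRoot: Finished indexing.". A sketch of that poll (the Entry.getValue accessor follows lib389's legacy API; treat it as an assumption):

import time
import ldap

def check_task_status(inst, task_dn, loops=10):
    # task_dn is e.g. the cn=index_attrs_...,cn=index,cn=tasks,cn=config
    # entry named in the log.
    finished = False
    for i in range(loops):
        entry = inst.search_s(task_dn, ldap.SCOPE_BASE)[0]
        status = entry.getValue('nsTaskStatus')
        if status and b'Finished indexing' in status:
            finished = True
        time.sleep(1)
    return finished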
Passed tickets/ticket49623_2_test.py::test_modrdn_loop 0.16
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket49623_2_test:ticket49623_2_test.py:64 Check the log messages for cenotaph error
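The single record above stands for a scan of the errors log after the modrdn loop; a one-line sketch of the assertion (same assumed errors-log matcher as earlier):

def check_cenotaph_errors(master):
    # No cenotaph failures may appear in the errors log after the
    # modrdn loop.
    assert not master.ds_error_log.match('.*cenotaph.*')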
Passed tickets/ticket49658_test.py::test_ticket49658_init 6.36
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master2
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39002, 'ldap-secureport': 63702, 'server-id': 'master2', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master3
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39003, 'ldap-secureport': 63703, 'server-id': 'master3', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:142 Creating replication topology.
INFO  lib389.topologies:topologies.py:156 Joining master master2 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is NOT working (expect dfa39a2b-b3b3-4219-80bd-8de33d0a9e08 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect 2c3231b0-c6d0-48c8-b218-db7da3b16063 / got description=dfa39a2b-b3b3-4219-80bd-8de33d0a9e08)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002
INFO  lib389.topologies:topologies.py:156 Joining master master3 to master1 ...
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is NOT working (expect f9571f4c-67ac-46b3-9c29-7f754499b52a / got description=2c3231b0-c6d0-48c8-b218-db7da3b16063)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 is working
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is NOT working (expect a03ce1d6-3a83-4083-9bc5-605270dea8c3 / got description=f9571f4c-67ac-46b3-9c29-7f754499b52a)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 is working
INFO  lib389.replica:replica.py:2153 SUCCESS: joined master from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master2 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master1 to master3 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master2 to master3 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 was created
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master1 ...
INFO  lib389.replica:replica.py:2338 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 already exists
INFO  lib389.topologies:topologies.py:164 Ensuring master master3 to master2 ...
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39003 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39002 was created
-------------------------------Captured log call--------------------------------
CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:54 Adding user (employeeNumber=0,ou=distinguished,ou=people,dc=example,dc=com):
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:151 Adding employeeNumber=0,ou=distinguished,ou=people,dc=example,dc=com on M3
[... the CRITICAL (py:54) / INFO (py:151) pair above repeats identically for employeeNumber=1 through employeeNumber=19 ...]
CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_0,ou=regular,ou=people,dc=example,dc=com):
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_0,ou=regular,ou=people,dc=example,dc=com on M3
[... the CRITICAL (py:39) / INFO (py:159) pair above repeats identically for uid=user_1 through uid=user_60 ...]
CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_61,ou=regular,ou=people,dc=example,dc=com):
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_61,ou=regular,ou=people,dc=example,dc=com
on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_62,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_62,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_63,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_63,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_64,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_64,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_65,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_65,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_66,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_66,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_67,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_67,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_68,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_68,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_69,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_69,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_70,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_70,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_71,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_71,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_72,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_72,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_73,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_73,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_74,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_74,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_75,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding 
uid=user_75,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_76,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_76,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_77,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_77,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_78,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_78,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_79,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_79,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_80,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_80,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_81,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_81,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_82,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_82,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_83,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_83,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_84,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_84,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_85,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_85,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_86,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_86,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_87,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_87,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_88,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_88,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_89,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 
Adding uid=user_89,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_90,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_90,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_91,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_91,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_92,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_92,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_93,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_93,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_94,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_94,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_95,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_95,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_96,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_96,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_97,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_97,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_98,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_98,ou=regular,ou=people,dc=example,dc=com on M3 CRITICAL tests.tickets.ticket49658_test:ticket49658_test.py:39 Adding user (uid=user_99,ou=regular,ou=people,dc=example,dc=com): INFO  tests.tickets.ticket49658_test:ticket49658_test.py:159 Adding uid=user_99,ou=regular,ou=people,dc=example,dc=com on M3
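Each pair of records in the setup log above is one entry being provisioned: the test logs the DN it is about to create (at CRITICAL level, from ticket49658_test.py:39) and then confirms the add against M3 (at INFO level, from line 159), for user_6 through user_99 under ou=regular,ou=people. A minimal sketch of that provisioning loop, assuming plain python-ldap against an already-bound connection; the helper name, objectClass set and attribute values are illustrative assumptions, since only the DNs appear in the log:

import ldap
import ldap.modlist

SUFFIX = "dc=example,dc=com"
CONTAINER = "ou=regular,ou=people," + SUFFIX

def add_users(conn, count=100):
    """Create uid=user_0 .. uid=user_{count-1} under ou=regular,ou=people on one master."""
    for i in range(count):
        dn = "uid=user_%d,%s" % (i, CONTAINER)
        attrs = ldap.modlist.addModlist({
            'objectClass': [b'top', b'person', b'organizationalPerson', b'inetOrgPerson'],
            'uid': [('user_%d' % i).encode()],
            'cn': [('user_%d' % i).encode()],
            'sn': [('user_%d' % i).encode()],
        })
        conn.add_s(dn, attrs)  # synchronous ADD; raises ldap.LDAPError on failure

Replication then has to fan these ADDs out from M3 to the other masters before the individual test cases below start.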
Passed tickets/ticket49658_test.py::test_ticket49658_0 33.60
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:289 Search M1 employeeNumber=b'0' (vs. b'0') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:295 Search M2 employeeNumber=b'0' (vs. b'0') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:303 Search M3 employeeNumber=b'0' (vs. b'0')
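Every passing case below has the same captured-log shape: four replication agreements are paused, the entry under test is modified while replication is down, the agreements are resumed, and the same employeeNumber value is then read back from M1, M2 and M3. A minimal sketch of that cycle, assuming the legacy lib389 agreement brooker (inst.agreement.pause()/resume() is what emits the agreement.py:1194/1220 messages above); the helper name, the entry picked, the MOD issued and the settle time are illustrative assumptions, not the ticket test's actual code:

import ldap
import logging
import time

log = logging.getLogger(__name__)

SUFFIX = "dc=example,dc=com"
USER_DN = "uid=user_0,ou=regular,ou=people," + SUFFIX

def pause_modify_resume(m1, m2, m3, expected=b'0'):
    """MOD employeeNumber while agreements are paused, then verify all masters converge."""
    # Pause every agreement on M1 and M2 so the MOD is applied locally only.
    for inst in (m1, m2):
        for agmt in inst.agreement.list(suffix=SUFFIX):
            inst.agreement.pause(agmt.dn)   # logs "Pausing replication <agmt dn>"

    m1.modify_s(USER_DN, [(ldap.MOD_REPLACE, 'employeeNumber', expected)])

    # Resume the agreements and give the replication sessions time to converge.
    for inst in (m1, m2):
        for agmt in inst.agreement.list(suffix=SUFFIX):
            inst.agreement.resume(agmt.dn)  # logs "Resuming replication <agmt dn>"
    time.sleep(30)

    # Read the attribute back from each master, as the "Search Mx" lines do.
    for label, inst in (("M1", m1), ("M2", m2), ("M3", m3)):
        entry = inst.search_s(USER_DN, ldap.SCOPE_BASE,
                              '(objectclass=*)', ['employeeNumber'])[0]
        value = entry[1]['employeeNumber'][0]  # python-ldap returns bytes, e.g. b'0'
        log.info("Search %s employeeNumber=%s (vs. %s)", label, value, expected)
        assert value == expected

The expected value differs per case (b'0', b'1', and so on, then b'6.2', b'11.1' and similar in the later cases); each test only asserts that all three masters end up holding the same converged employeeNumber.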
Passed tickets/ticket49658_test.py::test_ticket49658_1 33.63
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:395 Search M1 employeeNumber=b'1' (vs. b'1') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:401 Search M2 employeeNumber=b'1' (vs. b'1') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:409 Search M3 employeeNumber=b'1' (vs. b'1')
Passed tickets/ticket49658_test.py::test_ticket49658_2 33.61
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:501 Search M1 employeeNumber=b'2' (vs. b'2') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:507 Search M2 employeeNumber=b'2' (vs. b'2') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:515 Search M3 employeeNumber=b'2' (vs. b'2')
Passed tickets/ticket49658_test.py::test_ticket49658_3 33.64
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:607 Search M1 employeeNumber=b'3' (vs. b'3') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:613 Search M2 employeeNumber=b'3' (vs. b'3') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:621 Search M3 employeeNumber=b'3' (vs. b'3')
Passed tickets/ticket49658_test.py::test_ticket49658_4 33.63
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:713 Search M1 employeeNumber=b'4' (vs. b'4') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:719 Search M2 employeeNumber=b'4' (vs. b'4') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:727 Search M3 employeeNumber=b'4' (vs. b'4')
Passed tickets/ticket49658_test.py::test_ticket49658_5 33.60
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:819 Search M1 employeeNumber=b'5' (vs. b'5') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:825 Search M2 employeeNumber=b'5' (vs. b'5') INFO  tests.tickets.ticket49658_test:ticket49658_test.py:833 Search M3 employeeNumber=b'5' (vs. b'5')
Passed tickets/ticket49658_test.py::test_ticket49658_6 33.62
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:934 Search M1 employeeNumber=b'6.2' (vs. 6.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:940 Search M2 employeeNumber=b'6.2' (vs. 6.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:948 Search M3 employeeNumber=b'6.2' (vs. 6.2)
Passed tickets/ticket49658_test.py::test_ticket49658_7 33.49
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1042 Search M1 employeeNumber=b'7.2' (vs. 7.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1048 Search M2 employeeNumber=b'7.2' (vs. 7.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1056 Search M3 employeeNumber=b'7.2' (vs. 7.2)
Passed tickets/ticket49658_test.py::test_ticket49658_8 33.62
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1150 Search M1 employeeNumber=b'8.2' (vs. 8.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1156 Search M2 employeeNumber=b'8.2' (vs. 8.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1164 Search M3 employeeNumber=b'8.2' (vs. 8.2)
Passed tickets/ticket49658_test.py::test_ticket49658_9 33.50
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1259 Search M1 employeeNumber=b'9.2' (vs. 9.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1265 Search M2 employeeNumber=b'9.2' (vs. 9.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1273 Search M3 employeeNumber=b'9.2' (vs. 9.2)
Passed tickets/ticket49658_test.py::test_ticket49658_10 33.51
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1369 Search M1 employeeNumber=b'10.2' (vs. 10.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1375 Search M2 employeeNumber=b'10.2' (vs. 10.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1383 Search M3 employeeNumber=b'10.2' (vs. 10.2)
Passed tickets/ticket49658_test.py::test_ticket49658_11 33.63
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1479 Search M1 employeeNumber=b'11.1' (vs. 11.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1485 Search M2 employeeNumber=b'11.1' (vs. 11.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1493 Search M3 employeeNumber=b'11.1' (vs. 11.1)
Passed tickets/ticket49658_test.py::test_ticket49658_12 33.62
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1594 Search M1 employeeNumber=b'12.1' (vs. 12.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1600 Search M2 employeeNumber=b'12.1' (vs. 12.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1608 Search M3 employeeNumber=b'12.1' (vs. 12.1)
Passed tickets/ticket49658_test.py::test_ticket49658_13 33.42
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1709 Search M1 employeeNumber=b'13.1' (vs. 13.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1715 Search M2 employeeNumber=b'13.1' (vs. 13.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1723 Search M3 employeeNumber=b'13.1' (vs. 13.1)
Passed tickets/ticket49658_test.py::test_ticket49658_14 33.63
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1825 Search M1 employeeNumber=b'14.1' (vs. 14.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1831 Search M2 employeeNumber=b'14.1' (vs. 14.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1839 Search M3 employeeNumber=b'14.1' (vs. 14.1)
Passed tickets/ticket49658_test.py::test_ticket49658_15 33.61
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1940 Search M1 employeeNumber=b'15.1' (vs. 15.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1946 Search M2 employeeNumber=b'15.1' (vs. 15.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:1954 Search M3 employeeNumber=b'15.1' (vs. 15.1)
Passed tickets/ticket49658_test.py::test_ticket49658_16 37.68
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2076 Search M1 employeeNumber=b'1.1' (vs. 1.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2082 Search M2 employeeNumber=b'1.1' (vs. 1.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2090 Search M3 employeeNumber=b'1.1' (vs. 1.1)
Passed tickets/ticket49658_test.py::test_ticket49658_17 37.66
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2183 Search M1 employeeNumber=b'2.2' (vs. 2.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2189 Search M2 employeeNumber=b'2.2' (vs. 2.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2197 Search M3 employeeNumber=b'2.2' (vs. 2.2)
Passed tickets/ticket49658_test.py::test_ticket49658_18 37.67
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2289 Search M1 employeeNumber=b'3.2' (vs. 3.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2295 Search M2 employeeNumber=b'3.2' (vs. 3.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2303 Search M3 employeeNumber=b'3.2' (vs. 3.2)
Passed tickets/ticket49658_test.py::test_ticket49658_19 38.67
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2405 Search M1 employeeNumber=b'4.1' (vs. 4.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2411 Search M2 employeeNumber=b'4.1' (vs. 4.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2419 Search M3 employeeNumber=b'4.1' (vs. 4.1)
Passed tickets/ticket49658_test.py::test_ticket49658_20 38.71
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2520 Search M1 employeeNumber=b'5.1' (vs. 5.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2526 Search M2 employeeNumber=b'5.1' (vs. 5.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2534 Search M3 employeeNumber=b'5.1' (vs. 5.1)
Passed tickets/ticket49658_test.py::test_ticket49658_21 38.57
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2635 Search M1 employeeNumber=b'6.1' (vs. 6.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2641 Search M2 employeeNumber=b'6.1' (vs. 6.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2649 Search M3 employeeNumber=b'6.1' (vs. 6.1)
Passed tickets/ticket49658_test.py::test_ticket49658_22 38.57
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2750 Search M1 employeeNumber=b'7.1' (vs. 7.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2756 Search M2 employeeNumber=b'7.1' (vs. 7.1) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2764 Search M3 employeeNumber=b'7.1' (vs. 7.1)
Passed tickets/ticket49658_test.py::test_ticket49658_23 39.69
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2873 Search M1 employeeNumber=b'8.2' (vs. 8.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2879 Search M2 employeeNumber=b'8.2' (vs. 8.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2887 Search M3 employeeNumber=b'8.2' (vs. 8.2)
Passed tickets/ticket49658_test.py::test_ticket49658_24 39.75
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:2996 Search M1 employeeNumber=b'9.2' (vs. 9.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3002 Search M2 employeeNumber=b'9.2' (vs. 9.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3010 Search M3 employeeNumber=b'9.2' (vs. 9.2)
Passed tickets/ticket49658_test.py::test_ticket49658_25 39.62
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3119 Search M1 employeeNumber=b'10.2' (vs. 10.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3125 Search M2 employeeNumber=b'10.2' (vs. 10.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3133 Search M3 employeeNumber=b'10.2' (vs. 10.2)
Passed tickets/ticket49658_test.py::test_ticket49658_26 39.79
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3242 Search M1 employeeNumber=b'11.2' (vs. 11.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3248 Search M2 employeeNumber=b'11.2' (vs. 11.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3256 Search M3 employeeNumber=b'11.2' (vs. 11.2)
Passed tickets/ticket49658_test.py::test_ticket49658_27 39.61
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3365 Search M1 employeeNumber=b'12.2' (vs. 12.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3371 Search M2 employeeNumber=b'12.2' (vs. 12.2) INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3379 Search M3 employeeNumber=b'12.2' (vs. 12.2)
Passed tickets/ticket49658_test.py::test_ticket49658_28 39.76
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3488 Search M1 employeeNumber=b'13.2' (vs. 13.2)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3494 Search M2 employeeNumber=b'13.2' (vs. 13.2)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3502 Search M3 employeeNumber=b'13.2' (vs. 13.2)
Passed tickets/ticket49658_test.py::test_ticket49658_29 39.60
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3612 Search M1 employeeNumber=b'14.2' (vs. 14.2)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3618 Search M2 employeeNumber=b'14.2' (vs. 14.2)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3626 Search M3 employeeNumber=b'14.2' (vs. 14.2)
Passed tickets/ticket49658_test.py::test_ticket49658_30 39.60
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3735 Search M1 employeeNumber=b'15.2' (vs. 15.2)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3741 Search M2 employeeNumber=b'15.2' (vs. 15.2)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3749 Search M3 employeeNumber=b'15.2' (vs. 15.2)
Passed tickets/ticket49658_test.py::test_ticket49658_31 40.75
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3866 Search M1 employeeNumber=b'16.1' (vs. 16.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3872 Search M2 employeeNumber=b'16.1' (vs. 16.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3880 Search M3 employeeNumber=b'16.1' (vs. 16.1)
Passed tickets/ticket49658_test.py::test_ticket49658_32 40.71
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:3998 Search M1 employeeNumber=b'17.1' (vs. 17.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4004 Search M2 employeeNumber=b'17.1' (vs. 17.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4012 Search M3 employeeNumber=b'17.1' (vs. 17.1)
Passed tickets/ticket49658_test.py::test_ticket49658_33 40.73
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4122 Search M1 employeeNumber=b'18.1' (vs. 18.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4128 Search M2 employeeNumber=b'18.1' (vs. 18.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4136 Search M3 employeeNumber=b'18.1' (vs. 18.1)
Passed tickets/ticket49658_test.py::test_ticket49658_34 40.95
-------------------------------Captured log call--------------------------------
INFO  lib389:agreement.py:1194 Pausing replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1194 Pausing replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=002,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=001,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  lib389:agreement.py:1220 Resuming replication cn=003,cn=replica,cn=dc\3Dexample\2Cdc\3Dcom,cn=mapping tree,cn=config
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4246 Search M1 employeeNumber=b'19.1' (vs. 19.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4252 Search M2 employeeNumber=b'19.1' (vs. 19.1)
INFO  tests.tickets.ticket49658_test:ticket49658_test.py:4260 Search M3 employeeNumber=b'19.1' (vs. 19.1)
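All of the ticket49658 cases above follow the same cycle that the captured logs record: pause every replication agreement, apply the employeeNumber modification on one supplier, resume the agreements, then verify the value on each master. A minimal sketch of that pattern, assuming the stock lib389 three-master fixture (topology_m3); the function names update_while_paused and check_employeenumber are illustrative, not the actual test code:

    import time
    from lib389._constants import DEFAULT_SUFFIX
    from lib389.idm.user import UserAccounts

    def check_employeenumber(inst, uid, expected):
        # Mirrors the report's "Search Mx employeeNumber=b'...' (vs. ...)" lines
        user = UserAccounts(inst, DEFAULT_SUFFIX).get(uid)
        assert user.get_attr_val_utf8('employeeNumber') == expected

    def update_while_paused(topology_m3, uid, value):
        # Pause every agreement (the agreement.py:1194 "Pausing" lines),
        # so the MOD lands on one supplier while replication is quiesced
        for inst in topology_m3.ms.values():
            for agmt in inst.agreement.list(suffix=DEFAULT_SUFFIX):
                inst.agreement.pause(agmt.dn)
        m1 = topology_m3.ms["master1"]
        UserAccounts(m1, DEFAULT_SUFFIX).get(uid).replace('employeeNumber', value)
        # Resume (the agreement.py:1220 "Resuming" lines) and let it converge
        for inst in topology_m3.ms.values():
            for agmt in inst.agreement.list(suffix=DEFAULT_SUFFIX):
                inst.agreement.resume(agmt.dn)
        time.sleep(5)  # crude convergence wait; the real tests poll each master
        for inst in topology_m3.ms.values():
            check_employeenumber(inst, uid, value)

The per-test ~40 s durations above are dominated by these pause/resume and convergence waits rather than by the searches themselves.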
Passed tickets/ticket49788_test.py::test_ticket49781 0.19
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.
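These setup lines are emitted by the stock lib389 pytest fixtures, not by the test body. A hedged sketch of how a ticket test obtains the standalone1 instance those lines describe (the test body here is illustrative only):

    from lib389.topologies import topology_st  # noqa: F401  (pytest fixture)

    def test_example(topology_st):
        inst = topology_st.standalone
        # The fixture has already run SetupDs and started the server,
        # which is what the "Completed installation" line records
        assert inst.status()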
Passed tickets/ticket50078_test.py::test_ticket50078 4.18
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for master1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39001, 'ldap-secureport': 63701, 'server-id': 'master1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for hub1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39101, 'ldap-secureport': 63801, 'server-id': 'hub1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for consumer1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 39201, 'ldap-secureport': 63901, 'server-id': 'consumer1', 'suffix': 'dc=example,dc=com'} was created.
INFO  lib389.topologies:topologies.py:524 Creating replication topology.
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 was created
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is NOT working (expect 2d92714c-bc7c-4512-b582-b0a0979c5904 / got description=None)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 is working
INFO  lib389.replica:replica.py:2211 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101
INFO  lib389.replica:replica.py:2084 SUCCESS: bootstrap to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 completed
INFO  lib389.replica:replica.py:2365 SUCCESS: Agreement from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 was created
INFO  lib389.replica:replica.py:2268 SUCCESS: joined consumer from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39101 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201
INFO  lib389.replica:replica.py:2498 Retry: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is NOT working (expect 3424515a-74b1-4420-b546-6e1cf272a598 / got description=2d92714c-bc7c-4512-b582-b0a0979c5904)
INFO  lib389.replica:replica.py:2496 SUCCESS: Replication from ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39001 to ldap://ci-vm-10-0-139-146.hosted.upshift.rdu2.redhat.com:39201 is working
-------------------------------Captured log call--------------------------------
INFO  tests.tickets.ticket50078_test:ticket50078_test.py:31 Replication is working.
INFO  tests.tickets.ticket50078_test:ticket50078_test.py:47 Rename the test entry test_user1...
INFO  tests.tickets.ticket50078_test:ticket50078_test.py:52 Replication is working.
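The ticket50078 flow these lines record is: verify replication from the master down through the hub to the consumer, MODRDN the test entry on the master, then verify replication again so the rename is known to have flowed down. A minimal sketch, assuming lib389's master/hub/consumer fixture (topology_m1h1c1); the test and entry names are illustrative:

    from lib389._constants import DEFAULT_SUFFIX
    from lib389.idm.user import UserAccounts
    from lib389.replica import ReplicationManager
    from lib389.topologies import topology_m1h1c1  # noqa: F401  (pytest fixture)

    def test_rename_replicates(topology_m1h1c1):
        m1 = topology_m1h1c1.ms["master1"]
        c1 = topology_m1h1c1.cs["consumer1"]
        repl = ReplicationManager(DEFAULT_SUFFIX)
        repl.wait_for_replication(m1, c1)      # "Replication is working."

        user = UserAccounts(m1, DEFAULT_SUFFIX).create_test_user(uid=1)
        user.rename('uid=test_user_new')       # MODRDN on the master
        repl.wait_for_replication(m1, c1)      # rename must have replicated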
Passed tickets/ticket50234_test.py::test_ticket50234 0.66
-------------------------------Captured log setup-------------------------------
INFO  lib389.SetupDs:setup.py:658 Starting installation...
INFO  lib389.SetupDs:setup.py:686 Completed installation for standalone1
INFO  lib389.topologies:topologies.py:109 Instance with parameters {'ldap-port': 38901, 'ldap-secureport': 63601, 'server-id': 'standalone1', 'suffix': 'dc=example,dc=com'} was created.