* tester/run: s/$@/$*/.
($DTF_RESULT_SUCCES): Rename to $DTF_RESULT_SUCCESS.
(info): s/SUCCES/SUCCESS/.
Do not try to write output into the YML file when we are only
listing. That could cause 'Permission denied' errors even though the
permission is not actually needed.
* tester/run (__result_yml_print): New local function.
(die, run_test): Call __result_yml_print instead of touching the
yml file by hand.
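A minimal sketch of how such a guard might look. The function name comes from the entry above; the `$opt_list` flag and the `results.yml` file name are assumptions for illustration:

```shell
#!/bin/sh
# Hypothetical sketch: write into the YML file only when actually
# running tests, not when merely listing them ($opt_list set).  This
# avoids 'Permission denied' errors on read-only result directories.
opt_list=false
DTF_RESULTDIR=${DTF_RESULTDIR:-$(mktemp -d)}

__result_yml_print()
{
    # In listing mode, skip any filesystem writes.
    $opt_list && return 0
    echo "$*" >> "$DTF_RESULTDIR/results.yml"
}

__result_yml_print "test_a: PASS"
```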
* controller/bin/dtf-run-remote.in (tarball): First process
$opt_taskdir with readlink -f to get a real path (instead of
e.g. '.') and only then call basename.
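The point of the fix, sketched (directory names are illustrative): `basename .` yields `.`, which is useless as a tarball name, so the path must be resolved first.

```shell
# Sketch: normalize the task directory before deriving the tarball
# name, so that e.g. '.' resolves to the real directory name.
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/mytask"
cd "$tmpdir/mytask"

opt_taskdir=.                          # what the user may have passed
taskdir=$(readlink -f "$opt_taskdir")  # -> .../mytask
tarball=$(basename "$taskdir").tar.gz  # -> mytask.tar.gz
echo "$tarball"
```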
* lib_pgsql.sh (dtf_postgresql_phase_cleanup): Remove all logs.
(dtf_postgresql_checkphase): Fail if some logs are present.
* tasks/upgrade/locale/changed/runtest.sh: New file.
* tasks/initdb: Move to tasks/initdb/basic.
* tasks/initdb_old: Move to tasks/initdb/old-syntax.
* tasks/upgrade-basic: Move to tasks/upgrade/basic.
* tasks/upgrade-locale-utf8-syntax: Move to
tasks/upgrade/locale/utf8-syntax.
* postgresql-tests/tasks/upgrade-basic/runtest.sh: Sync the test
name with the directory name.
* postgresql-tests/tasks/upgrade-utf8-syntax/runtest.sh: Rename to
postgresql-tests/tasks/upgrade-locale-utf8-syntax/runtest.sh.
* controller/README: Rewordings and typo fixes.
* README: Make it a symlink to controller/README.
* controller/Makefile.am: Distribute the README and configuration
file template.
* controller/README: Reworked version of README following the
current API.
* controller/doc/dtf-controller/OSID.sh.template: New
configuration template.
* tester/dtf-prepare-testsuite: Also prepare the basic taskdir
structure.
Copied from 'postgresql-setup' package. Also do some 'make dist'
fixes.
* controller/Makefile.am: Use $TEST_GEN_FILES_LIST. Also create
the share/ directory during build.
* controller/configure.ac: Initialize testsuite.
* controller/tests/Makefile.am: Bureaucracy for testsuite.
* controller/tests/atlocal.in: Likewise.
* controller/tests/testsuite.at: Add two tests copied from
postgresql-setup project.
* Makefile.am: Make sure all sources and data files are
distributed.
* tester/run (die): Do not throw the 'permission denied' message if
die was called too early (when the result dir is not yet
created).
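A sketch of such a guard, under assumed names ($DTF_RESULTDIR and the status file are hypothetical): die() only touches the result directory when it already exists.

```shell
# Hypothetical sketch: die() may fire before the result directory is
# created, so guard any write into it instead of failing noisily.
die()
{
    echo "fatal: $*" >&2
    if [ -n "$DTF_RESULTDIR" ] && [ -d "$DTF_RESULTDIR" ]; then
        echo "result: ERROR" >> "$DTF_RESULTDIR/status"
    fi
    exit 1
}
```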
* bin/dtf-get-machine.in: Exit if an option was parsed but not
handled explicitly by the case statement.
* bin/dtf-run-remote.in: Likewise.
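The pattern being described, sketched with a hypothetical option name: every option the parser accepts must also appear in the case statement, and anything else is treated as a programming error instead of being silently ignored.

```shell
# Sketch of the safety net: an option that reaches the default branch
# was accepted by the option parser but never handled -> hard error.
parse_opts()
{
    while [ $# -gt 0 ]; do
        case $1 in
            --name) opt_name=$2; shift 2 ;;
            --) shift; break ;;
            *) echo "unhandled option '$1'" >&2; return 1 ;;
        esac
    done
}
```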
New script 'dtf-return-machine' returns the VM to OpenStack based
on its public IP. In the future, this may be abstracted to any VM
provider (or VM pool or whatever), but that also requires some
IP <=> VM mapping shared between dtf-get-machine and
dtf-return-machine.
* controller/.gitignore: Ignore new scripts.
* controller/Makefile.am: Build new scripts.
* controller/bin/dtf-return-machine.in: New script for VM
deletion.
* controller/libexec/dtf-nova.in: New wrapper around the 'nova'
command, printing only the data output, with fields separated by
tabs.
* controller/share/dtf-controller/ansible/playbooks/fedora.yml:
Finally call dtf-return-machine after a successful test run.
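A rough sketch of what such a wrapper might do; the real dtf-nova surely differs in detail. The idea is to strip the ASCII-art table that `nova` prints (borders and header) and emit tab-separated data fields only, which is much easier to consume from scripts:

```shell
# Hypothetical wrapper: turn nova's '| a | b |' table into plain
# tab-separated records, dropping borders and the header row.
nova_tabular()
{
    nova "$@" | awk -F'|' '
        /^\+/ { next }                  # +----+ border lines
        !header { header = 1; next }    # first real row is the header
        {
            line = ""
            for (i = 2; i < NF; i++) {
                gsub(/^ +| +$/, "", $i) # trim cell padding
                line = line (i > 2 ? "\t" : "") $i
            }
            print line
        }'
}
```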
* controller/bin/dtf-controller.in: Print '\n' after error msg.
While polling the sshd server on the remote host, do not use
PasswordAuthentication even if the server allows it. That
causes problems when the new VM has already started but cloud-init
has not yet been able to set the authorized_keys file.
* controller/libexec/dtf-wait-for-ssh: Add the
PasswordAuthentication=no ssh option.
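A sketch of such a polling loop (the retry counts, user and other ssh options are assumptions, not the real dtf-wait-for-ssh): forcing PasswordAuthentication=no means the probe can never hang on a password prompt, or "succeed" against a VM whose authorized_keys is not installed yet.

```shell
# Hypothetical sketch of polling a freshly booted VM over ssh.
wait_for_ssh()
{
    host=$1; tries=${2:-60}
    while [ "$tries" -gt 0 ]; do
        ssh -o PasswordAuthentication=no \
            -o StrictHostKeyChecking=no \
            -o ConnectTimeout=5 \
            "root@$host" true && return 0
        tries=$((tries - 1))
        sleep 5
    done
    return 1
}
```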
* controller/libexec/dtf-wait-for-ssh: Syntax lint.
* controller/bin/dtf-get-machine.in: Do not try to check for IP
address if 'nova boot' command failed.
* controller/controller: Removed.
* controller/bin/dtf-controller.in (child_task): Do not exit if
dtf-run-remote failed. This allows us to commit at least the log files.
* controller/libexec/dtf-commit-results.in: Do not try to extract
the dtf.tar.gz archive if it does not exist (dtf-run-remote failure).
* share/dtf-controller/results-stats-templates/html.tmpl: Fields
of the result table are now hyperlinked to the particular results.
* controller/bin/dtf-controller.in: Really generate html instead
of xml.
* controller/bin/dtf-controller.in (subcommand): Generate
stdout and stderr files separately.
(child_task): Generate '*.err' and '*.out' logs for subcommands.
Call dtf-run-remote with --distro/--distro-version options. Finally,
call dtf-result-stats and save its output to results.html.
(main): Simple debugging info and comment adjusting.
* controller/libexec/dtf-commit-results.in: Take three arguments
now.
* controller/libexec/dtf-result-stats.in: Read the
'tester/run' output more robustly.
* controller/share/dtf-controller/ansible/playbooks/fedora.yml:
Run 'run --force' instead of 'run' on the remote host.
* controller/share/dtf-controller/results-stats-templates/html.tmpl:
React to exit_status 2.
* tester/run: Define the 0, 1 and 2 exit statuses. Also handle the
SKIP test-case exit status.
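A sketch of one plausible mapping; the actual values and names below (other than the 0/1/2 convention the entry states) are assumptions:

```shell
# Hypothetical status convention: 0 = success, 1 = test failure,
# 2 = a dedicated SKIP status reported by a test case.
DTF_RESULT_SUCCESS=0
DTF_RESULT_FAILURE=1
DTF_RESULT_SKIP=2     # assumed value, for illustration only

classify()
{
    case $1 in
        "$DTF_RESULT_SUCCESS") echo PASS ;;
        "$DTF_RESULT_SKIP")    echo SKIP ;;
        *)                     echo FAIL ;;
    esac
}
```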
.. rather than by the ansible nova_compute module directly. This
allows me to implement more variability in VM handling.
* controller/bin/dtf-get-machine.in: Add a --quiet option so that
only the allocated IP is printed. Also add the
DTF_GET_MACHINE_FAKE_IP variable, usable for faster debugging; when
set, dtf-get-machine prints its content to standard output without
allocating a new VM.
* controller/bin/dtf-run-remote.in: Add the -v (verbose) option to
the ansible-playbook call to get more verbose output.
* controller/share/dtf-controller/ansible/playbooks/fedora.yml:
Use dtf-get-machine. Also remove the creds file requirement.
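The debugging shortcut can be sketched like this (the function body is hypothetical; only the DTF_GET_MACHINE_FAKE_IP behavior comes from the entry):

```shell
# Sketch: when DTF_GET_MACHINE_FAKE_IP is set, print it and skip VM
# allocation entirely -- handy for fast iteration on the controller.
get_machine()
{
    if [ -n "$DTF_GET_MACHINE_FAKE_IP" ]; then
        echo "$DTF_GET_MACHINE_FAKE_IP"
        return 0
    fi
    # ... the real 'nova boot' based allocation would go here ...
    echo "allocation not implemented in this sketch" >&2
    return 1
}

DTF_GET_MACHINE_FAKE_IP=192.0.2.7
get_machine
```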
The controller is able to read a simple YAML configuration file
with a list of tasks to be performed in parallel (each task
actually runs the testsuite remotely, commits the results to the
DB, computes statistics and uploads the results).
* controller/bin/dtf-controller.in: New template for the binary.
* controller/libexec/dtf-commit-results.in: Copy the whole result
directory instead of the 'dtf' subdir only.
* controller/.gitignore: Ignore the new binary.
* controller/Makefile.am: Build dtf-commit-results.
* tester/run: Handle the "default" return value of 'run()' by
printing the '[ FAIL ]' string to stdout.
* postgresql-tests/config.sh: Use an office-accessible IP.
Define the test return values and their handling. Do not split the
testcase into a configuration and a running script; rather use one
file and wrap the script in a run() method. This is still very
easy to run without running the whole testsuite.
* tester/run (DTF_RESULT_*): Return values API.
(run): Rename to run_test. Make the function more readable; don't
generate xml results (not yet used anyway).
* postgresql-tests/config.sh: Do not source the per-testsuite
library directly, as the configuration script config.sh is sourced
even by the 'run' script itself for the --dist option (for that
action we actually do not need the per-testsuite libraries).
* postgresql-tests/tasks/initdb/runtest.sh: New API used.
* postgresql-tests/tasks/initdb_old/runtest.sh: Likewise.
* postgresql-tests/tasks/upgrade-basic/runtest.sh: Likewise.
* postgresql-tests/tasks/upgrade-utf8-syntax/runtest.sh: Likewise.
* postgresql-tests/tasks/upgrade-basic/config.sh: Remove.
* postgresql-tests/tasks/initdb/config.sh: Remove.
* postgresql-tests/tasks/initdb_old/config.sh: Remove.
* postgresql-tests/tasks/upgrade-utf8-syntax/config.sh: Remove.
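The single-file testcase shape might look roughly like this; the run()/run_test names and DTF_RESULT_* API come from the entry, while the exact harness contract and values are assumptions:

```shell
# Hypothetical single-file testcase: configuration and the test body
# live together, and the body is wrapped in a run() function that the
# harness (run_test) invokes.
DTF_RESULT_SUCCESS=0
DTF_RESULT_FAILURE=1

testname="initdb-basic"

run()
{
    # the actual test steps would go here
    true || return "$DTF_RESULT_FAILURE"
    return "$DTF_RESULT_SUCCESS"
}

# Still trivially runnable standalone, without the whole testsuite:
run; echo "exit status: $?"
```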
I had badly configured git, so I missed those before.
* controller/.gitignore: Add autoconf/automake related ignores.
* controller/bin/dtf-get-machine.in: Use $HOME/.dtf/.. rather than
$srcdir/config/...
The directory structure in this project is done so that you can
run directly from the source dir (after ./build).
* controller/build: Use --prefix="$(pwd)" instead of --with-git,
which was never implemented.
Move the 'results_stats' and 'commit_results' binaries into libexec
and adjust appropriately. Also, the html template is now in
$pkgdatadir.
* .gitignore: Add tags/ChangeLog generated files.
* README: Just some random notes. Needs to be rewritten anyway.
* controller/.gitignore: Add newly 'make'd files.
* controller/Makefile.am: Generate libexec/bin files.
* controller/commit_results: Move to controller/libexec as
dtf-commit-results.in.
* controller/configure.ac: Also substitute resulttemplatedir.
* controller/etc/dtf.conf.d/config.sh.template: The DTF_DATABASE
name was misleading - rather use DTF_DATABASE_DEFAULT.
* controller/libexec/dtf-commit-results.in: Moved from
controller/commit_results.
* controller/result_stats: Moved to
controller/libexec/dtf-result-stats.in.
* controller/libexec/dtf-result-stats.in: Moved from
controller/result_stats.
* controller/result_templates/html.tmpl: Moved to
controller/share/dtf-controller/results-stats-templates/html.tmpl.
Rename the config variable from DTF_OPENSTACK_ID to
DTF_OPENSTACK_DEFAULT_ID to better match the name with its
purpose.
* controller/bin/dtf-run-remote.in: Use DTF_OPENSTACK_DEFAULT_ID
instead of DTF_OPENSTACK_ID.
* controller/config/config.sh.template: Moved.
* controller/etc/dtf.conf.d/config.sh.template: Document renamed
variable on new place.
First part of converting controller to autoconf/automake solution.
* .gitignore: New gitignore; autotools ignores.
* Makefile.am: New file.
* get_machine: Renamed to template bin/dtf-get-machine.in.
* bin/dtf-get-machine.in: New template based on get_machine.
* run_remote: Renamed to template bin/dtf-run-remote.in.
* bin/dtf-run-remote.in: New binary template from run_remote.
* build: New bootstrap-like helper script (git-only).
* configure.ac: New file.
* etc/dtf.sh.in: Likewise.
* ansible_helpers/wait-for-ssh: Renamed to
libexec/dtf-wait-for-ssh.
* share/dtf-controller/parse_credsfile: Reworked script for
parsing OS credentials.
* parse_credsfile: Moved to share/dtf-controller.
* libexec/dtf-wait-for-ssh: Renamed from wait-for-ssh.
* ansible/*: Moved into share/dtf-controller/ansible/*.
* share/dtf-controller/ansible/vars/generated-vars.yml.in: New
template file exporting configure-time variables into playbooks.
* get_machine: New option --name and variable $opt_name.
.. to reuse remotely generated results and commit them to local
controller-database.
* commit_results: New script.
It turns out that our OpenStack has poor connectivity to Brno, so I
added a data mirror on one VM on OS1.
* postgresql-tests/lib_pgsql.sh (dtf_postgresql_data_mirror): New
PostgreSQL related API.
* postgresql-tests/config.sh ($dtf_dataurl): Removed PG API
variable.
($dtf_dataurls): Substitute for $dtf_dataurl; an array of possible
data URLs.
* postgresql-tests/tasks/upgrade-basic/runtest.sh: Reuse the new PG
API.
* postgresql-tests/tasks/upgrade-utf8-syntax/runtest.sh: Likewise.
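A mirror fallback of this kind can be sketched as follows; the URLs, the function name and the flat-string form of `$dtf_dataurls` are illustrative assumptions:

```shell
# Hypothetical sketch: try each data URL in order and download from
# the first mirror that works.
dtf_dataurls="http://primary.example.com/data http://mirror.example.com/data"

fetch_data()
{
    file=$1
    for url in $dtf_dataurls; do
        # -f: fail on HTTP errors, so a dead mirror is skipped
        curl -fsSL -o "$file.tmp" "$url/$file" || continue
        mv "$file.tmp" "$file"
        return 0
    done
    echo "no mirror provided $file" >&2
    return 1
}
```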
* controller/parse_credsfile: Detect $srcdir to be able to read
the correct secret file from any CWD.
Make sure that on the tester machine everything is put into
$DTF_RESULTDIR. Similarly, on the controller machine, everything
should be put into --workdir.
* controller/run_remote: Detect $srcdir.
(workdir_prereq): The $opt_workdir is a temporary directory by
default.
* tester/run (run): Task results now go into $DTF_RESULTDIR/tasks.
The main xml result goes into $DTF_RESULTDIR/dtf.xml.
Try to split into three separate components -> controller, tester,
and 'tasks' (postgresql-tasks in our case). The controller
component is the main part, which is able to run the tasks
remotely. Tester is more like a library for the 'tasks' component
(should be reusable at the raw git level).
* controller: Almost separated component.
* postgresql-tasks: Likewise.
* tester: Likewise.
.. as the default 180s often seems not to be sufficient.
* ansible/fedora.yml: Add wait_for=600 to nova_compute.
* tasks/initdb/runtests.sh: Skip this test if performed on f21-.
.. because data for x86_64 servers are already generated.
* lib_pgsql.sh (dtf_postgresql_upgrade_matrix): Testing upgrade
from f21 and f22 is now possible.
* config/config.sh.template: Variable description adjusted.
* config/os/EXAMPLE.sh: Likewise.
* runner/result_templates/html.tmpl: When particular task results
are not available, print NOT AVAILABLE instead of FAIL.
For that purpose you may use the '--extra-rpms-file FILE' option,
where the file contains a list of rpms to be installed. The RPMs
should be specified as URLs accessible from the testing machine.
Also, export the overall log file as dtf-run.overview.
* run_remote: Add the new option --extra-rpms-file.
* ansible/fedora.yml: Conditionally include the
additional-packages playbook. Generate the dtf-run.overview.
* ansible/include/additional-packages.yml: New playbook used to
install an explicit list of additional packages.
* ansible/include/download-results.yml: Fix to also download the
dtf-run.overview file.
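One way the RPM-URL list could be consumed on the testing machine, sketched; the one-URL-per-line file format and the function name are assumptions:

```shell
# Hypothetical sketch: install each RPM from the --extra-rpms-file
# list.  dnf accepts a package URL directly as an install argument.
install_extra_rpms()
{
    rpms_file=$1
    [ -f "$rpms_file" ] || return 0   # option not used -> nothing to do
    while read -r url; do
        [ -n "$url" ] || continue     # skip blank lines
        dnf install -y "$url" || return 1
    done < "$rpms_file"
}
```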
* lib.sh: Mostly double-quoting fixes and back-tick removal.
* lib_pgsql.sh: Likewise.
When the system's locale changed e.g. from en_US.utf8 to
en_US.UTF-8, older PostgreSQL versions were unable to upgrade the
data directory. From Fedora 20 on, we should be able to upgrade
without issues.
Related: #1007802
* lib_pgsql.sh (dtf_postgresql_cb_upgrade)
(dtf_postgresql_cb_upgrade_select): New callbacks for the
dtf_postgresql_upgrade_tour function.
(dtf_postgresql_upgrade_tour): Function that determines against
which data the installation is able to upgrade and performs all the
possible upgrade scenarios.
* tasks/upgrade-basic/runtest.sh: Switch to
dtf_postgresql_upgrade_tour usage.
* tasks/upgrade-utf8-syntax/config.sh: New testcase config.
* tasks/upgrade-utf8-syntax/runtest.sh: New testcase.
Based on a pre-generated tarball with PostgreSQL data - download
the tarball, unpack it and perform 'postgresql-setup upgrade'.
* lib_pgsql.sh (dtf_postgresql_unpack_remote_data): New function.
(dtf_postgresql_upgrade_matrix): New function. Detect which data
we should test against.
* run: Define the new global $dtf_dataurl.
* tasks/upgrade-basic/config.sh: New testcase config.
* tasks/upgrade-basic/runtest.sh: New testcase.
* parse_credsfile: Use export to correctly propagate the parsed
values.
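The point of the fix, sketched (the creds-file format and variable names are assumptions, modeled on the usual OpenStack OS_* environment): plain assignment makes the values visible only to the current shell, while export makes child processes such as nova or ansible see them too.

```shell
# Hypothetical sketch of parsing a KEY=VALUE creds file.  Note the
# redirection into the while loop (not a pipe), so the loop runs in
# the current shell and the exports survive it.
parse_credsfile()
{
    while IFS='=' read -r key value; do
        case $key in
            OS_USERNAME|OS_PASSWORD|OS_TENANT_NAME|OS_AUTH_URL)
                # 'export', not plain assignment, so that child
                # processes (nova, ansible, ...) inherit the values
                export "$key=$value" ;;
        esac
    done < "$1"
}
```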