| author | Luke Kaines <luke@puppetlabs.com> | 2011-06-07 15:58:26 -0700 |
|---|---|---|
| committer | Daniel Pittman <daniel@puppetlabs.com> | 2011-06-07 15:58:26 -0700 |
| commit | 48aafa3a5c553fec7b9a5193febd5591937d5680 | |
| tree | 8566b7ba9bf4ee47de760c154bf77c7de73e0813 /lib/puppet | |
| parent | 368b516d39ec36222421c09ac5c592e0a1ff7126 | |
(#6873) Add Static Compiler terminus to 2.7.0
This adds the static compiler terminus to the release. This wraps the default
compiler terminus, and post-processes the catalog to rewrite every file
reference using the 'puppet://' URI/protocol into filebucket references that
use the MD5 checksum of the file contents.
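Conceptually, each rewritten File resource trades its network source for content pinned by checksum. A minimal sketch of that transformation (the resource hash, path, and content here are hypothetical illustrations, not taken from the commit):

```ruby
require 'digest/md5'

# Hypothetical File resource as it might look before the rewrite,
# still pointing at the master via a puppet:// source URL.
resource = {
  type:   "File",
  title:  "/etc/motd",
  source: "puppet:///modules/motd/motd",
}

# The content the master would have served for that source.
file_content = "Welcome\n"

# The rewrite: drop the puppet:// source and pin the content by its
# MD5 checksum, turning the reference into a filebucket lookup.
resource.delete(:source)
resource[:content]  = "{md5}#{Digest::MD5.hexdigest(file_content)}"
resource[:checksum] = "md5"
```

After this, applying the catalog no longer needs the master's fileserver at all; the client resolves the checksum against its filebucket.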
This provides a genuinely static catalog, in terms of content: there are no
external dependencies whose changes could alter the result of applying the
catalog.
It also eliminates the describe calls from file checking, as all the metadata
is stored locally in the catalog. This can be a substantial performance
increase for nodes, especially those that manage large trees of recursive
files.
To use this, set the `catalog_terminus` to `static_compiler`; the resultant
catalog will then reference only static content. This does not, however, put
the required files into the filebucket on the client.
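On the master, that would be a one-line configuration change (the `[master]` section name is an assumption about where the setting lives; the setting name itself comes from this commit):

```ini
[master]
catalog_terminus = static_compiler
```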
There are some limitations of this code:
* Files are all read into memory rather than streamed. This will definitely
cause problems with large files, but the filebucket doesn't currently
handle streaming.
* We think the recursion behavior is equivalent, but can't really guarantee
it without a good bit of testing.
* You have to populate the client filebucket manually. We don't have any
support for doing this automatically, not even through variant access to
the catalog downloader.
* Behavior on the server is currently undefined if your puppet masters are
behind a load balancer and they're configured to do fileserving through
that load balancer. It should work, but it probably won't be that fast.
You can see https://github.com/lak/puppet-static-compiler for the original
prototype this was inherited from, which includes some example code for
scanning the downloaded catalog and fetching resources into the filebucket.
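The client-side half of that prototype can be sketched roughly as follows. The flattened catalog structure here is a simplified, hypothetical stand-in for the real catalog document, not the actual Puppet data layout; actually fetching each reference into the filebucket is left out:

```ruby
# Hypothetical sketch of the first step of client-side filebucket
# population: walk a downloaded catalog and collect the checksum
# references carried by rewritten File resources. Each reference
# would then be fetched into the local filebucket.
def checksum_references(resources)
  resources.select { |r|
    r[:type] == "File" && r[:content].to_s.start_with?("{md5}")
  }.map { |r| r[:content] }
end

catalog = [
  { type: "File",    title: "/etc/motd", content: "{md5}d41d8cd98f00b204e9800998ecf8427e" },
  { type: "Service", title: "sshd" },
]

refs = checksum_references(catalog)
# refs => ["{md5}d41d8cd98f00b204e9800998ecf8427e"]
```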
Reviewed-By: Daniel Pittman <daniel@puppetlabs.com>
Diffstat (limited to 'lib/puppet')
| -rw-r--r-- | lib/puppet/indirector/catalog/static_compiler.rb | 137 |
1 file changed, 137 insertions, 0 deletions
```diff
diff --git a/lib/puppet/indirector/catalog/static_compiler.rb b/lib/puppet/indirector/catalog/static_compiler.rb
new file mode 100644
index 000000000..1d92121ed
--- /dev/null
+++ b/lib/puppet/indirector/catalog/static_compiler.rb
@@ -0,0 +1,137 @@
+require 'puppet/node'
+require 'puppet/resource/catalog'
+require 'puppet/indirector/code'
+
+class Puppet::Resource::Catalog::StaticCompiler < Puppet::Indirector::Code
+  def compiler
+    @compiler ||= indirection.terminus(:compiler)
+  end
+
+  def find(request)
+    return nil unless catalog = compiler.find(request)
+
+    raise "Did not get catalog back" unless catalog.is_a?(model)
+
+    catalog.resources.find_all { |res| res.type == "File" }.each do |resource|
+      next unless source = resource[:source]
+      next unless source =~ /^puppet:/
+
+      file = resource.to_ral
+      if file.recurse?
+        add_children(request.key, catalog, resource, file)
+      else
+        find_and_replace_metadata(request.key, resource, file)
+      end
+    end
+
+    catalog
+  end
+
+  def find_and_replace_metadata(host, resource, file)
+    # We remove URL info from it, so it forces a local copy
+    # rather than routing through the network.
+    # Weird, but true.
+    newsource = file[:source][0].sub("puppet:///", "")
+    file[:source][0] = newsource
+
+    raise "Could not get metadata for #{resource[:source]}" unless metadata = file.parameter(:source).metadata
+
+    replace_metadata(host, resource, metadata)
+  end
+
+  def replace_metadata(host, resource, metadata)
+    [:mode, :owner, :group].each do |param|
+      resource[param] ||= metadata.send(param)
+    end
+
+    resource[:ensure] = metadata.ftype
+    if metadata.ftype == "file"
+      unless resource[:content]
+        resource[:content] = metadata.checksum
+        resource[:checksum] = metadata.checksum_type
+      end
+    end
+
+    store_content(resource) if resource[:ensure] == "file"
+    old_source = resource.delete(:source)
+    Puppet.info "Metadata for #{resource} in catalog for '#{host}' added from '#{old_source}'"
+  end
+
+  def add_children(host, catalog, resource, file)
+    file = resource.to_ral
+
+    children = get_child_resources(host, catalog, resource, file)
+
+    remove_existing_resources(children, catalog)
+
+    children.each do |name, res|
+      catalog.add_resource res
+      catalog.add_edge(resource, res)
+    end
+  end
+
+  def get_child_resources(host, catalog, resource, file)
+    sourceselect = file[:sourceselect]
+    children = {}
+
+    source = resource[:source]
+
+    # This is largely a copy of recurse_remote in File
+    total = file[:source].collect do |source|
+      next unless result = file.perform_recursion(source)
+      return if top = result.find { |r| r.relative_path == "." } and top.ftype != "directory"
+      result.each { |data| data.source = "#{source}/#{data.relative_path}" }
+      break result if result and ! result.empty? and sourceselect == :first
+      result
+    end.flatten
+
+    # This only happens if we have sourceselect == :all
+    unless sourceselect == :first
+      found = []
+      total.reject! do |data|
+        result = found.include?(data.relative_path)
+        found << data.relative_path unless found.include?(data.relative_path)
+        result
+      end
+    end
+
+    total.each do |meta|
+      # This is the top-level parent directory
+      if meta.relative_path == "."
+        replace_metadata(host, resource, meta)
+        next
+      end
+      children[meta.relative_path] ||= Puppet::Resource.new(:file, File.join(file[:path], meta.relative_path))
+
+      # I think this is safe since it's a URL, not an actual file
+      children[meta.relative_path][:source] = source + "/" + meta.relative_path
+      replace_metadata(host, children[meta.relative_path], meta)
+    end
+
+    children
+  end
+
+  def remove_existing_resources(children, catalog)
+    existing_names = catalog.resources.collect { |r| r.to_s }
+
+    both = (existing_names & children.keys).inject({}) { |hash, name| hash[name] = true; hash }
+
+    both.each { |name| children.delete(name) }
+  end
+
+  def store_content(resource)
+    @summer ||= Object.new
+    @summer.extend(Puppet::Util::Checksums)
+
+    type = @summer.sumtype(resource[:content])
+    sum = @summer.sumdata(resource[:content])
+
+    if Puppet::FileBucket::File.indirection.find("#{type}/#{sum}")
+      Puppet.info "Content for '#{resource[:source]}' already exists"
+    else
+      Puppet.info "Storing content for source '#{resource[:source]}'"
+      content = Puppet::FileServing::Content.find(resource[:source])
+      Puppet::FileBucket::File.new(content.content).save
+    end
+  end
+end
```
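In `store_content`, the filebucket is keyed by `"#{type}/#{sum}"`, which is how duplicate content gets deduplicated across the catalog. A rough sketch of the checksum-string parsing (an assumption about what `Puppet::Util::Checksums#sumtype` and `#sumdata` return, not the library code itself):

```ruby
# Assumed behavior, mirroring how store_content uses the helpers:
# a "{md5}<hex>" content string splits into a checksum type and a
# digest, which together form the filebucket lookup key.
def sumtype(checksum)
  checksum[/^\{(\w+)\}/, 1]
end

def sumdata(checksum)
  checksum[/^\{\w+\}(.*)$/, 1]
end

content    = "{md5}d41d8cd98f00b204e9800998ecf8427e"
bucket_key = "#{sumtype(content)}/#{sumdata(content)}"
# bucket_key => "md5/d41d8cd98f00b204e9800998ecf8427e"
```

Because the key is derived solely from the content's digest, two File resources with identical content resolve to the same bucket entry and the content is stored only once.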
