Compare commits

..

89 Commits

Author SHA1 Message Date
Donal McBreen
aa2ceaa92a Bump version for 2.7.0 2025-06-18 10:27:00 +01:00
Donal McBreen
c3e7721da5 Bump version for 2025-06-18 10:24:55 +01:00
Donal McBreen
0656e02375 Doc update from @acidtib in https://github.com/basecamp/kamal-site/pull/174 2025-06-17 15:42:15 +01:00
Donal McBreen
aed77a78fb Formatting fixes for docs 2025-06-17 15:34:27 +01:00
Donal McBreen
9244247389 Merge pull request #1584 from basecamp/all-fields-one-password-refactor
OnePassword adapter refactor
2025-06-17 11:52:20 +01:00
Donal McBreen
6e517665e8 OnePassword adapter refactor
- fix rubocop offenses
- extract fields_map
- no early return
- include fields in error message
2025-06-17 11:37:30 +01:00
Donal McBreen
4b0afdf42b Merge pull request #1567 from capripot/add_all_fields_one_password_retrieval
feat: Add allowing retrieving all fields for an item
2025-06-17 11:22:10 +01:00
Donal McBreen
5aa3f7bd4c Merge pull request #1583 from basecamp/custom-ssl-per-role
Custom certs per role
2025-06-17 11:12:21 +01:00
Donal McBreen
ccbcbbc8c5 Custom certs per role
- Upload the cert with `sshkit.upload!`
- Use the role name to create a directory for each role's certs
- Add an integration test for the custom certs
2025-06-17 10:26:57 +01:00
Donal McBreen
8a7260d1e9 Merge pull request #1531 from acidtib/feat/custom-ssl
feat: Add support for custom certificates
2025-06-17 09:25:15 +01:00
Donal McBreen
89c56910c9 Merge pull request #1551 from ACPK/kamal-proxy-path-prefix
Add support for kamal-proxy's path-prefix
2025-06-16 11:07:23 +01:00
Donal McBreen
52e06c1351 Merge pull request #1570 from nickcoyne/bws-secrets
Request Bitwarden Secrets Manager secrets as JSON
2025-06-16 10:59:53 +01:00
Donal McBreen
9bcc953cd6 Stub bws project list correctly 2025-06-16 10:58:57 +01:00
Donal McBreen
e2015b47f9 Merge pull request #1422 from acidtib/feat/secrets-add-passbolt-adapter
feat(secrets): add Passbolt adapter
2025-06-16 09:14:07 +01:00
Donal McBreen
23f2bf71f9 Fix rubocop whitespace issues 2025-06-16 09:00:04 +01:00
Donal McBreen
054a85d3c0 Merge pull request #916 from nickhammond/buildpacks
Add pack option to the builder options for cloud native buildpacks
2025-06-16 08:57:27 +01:00
Donal McBreen
5a0da160b4 Merge pull request #1440 from ursm/bws
Fix Bitwarden Secrets Manager authentication checks
2025-06-16 08:56:24 +01:00
Donal McBreen
72d9fcbaaa Merge pull request #1579 from basecamp/dependabot/bundler/bundler-b051ec43b1
Bump rack from 3.1.14 to 3.1.16 in the bundler group across 1 directory
2025-06-16 07:52:26 +01:00
Donal McBreen
a201a6ca68 Merge pull request #1544 from prullmann/kamal-exec-piping
Allow piping into kamal exec #1485
2025-06-16 07:52:03 +01:00
dependabot[bot]
1d81d9ec15 Bump rack from 3.1.14 to 3.1.16 in the bundler group across 1 directory
Bumps the bundler group with 1 update in the / directory: [rack](https://github.com/rack/rack).


Updates `rack` from 3.1.14 to 3.1.16
- [Release notes](https://github.com/rack/rack/releases)
- [Changelog](https://github.com/rack/rack/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rack/rack/compare/v3.1.14...v3.1.16)

---
updated-dependencies:
- dependency-name: rack
  dependency-version: 3.1.16
  dependency-type: indirect
  dependency-group: bundler
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-12 08:05:12 +00:00
Donal McBreen
aa67564dc5 Merge pull request #1543 from basecamp/dependabot/bundler/bundler-457f06d3c7
Bump rack-session from 2.0.0 to 2.1.1 in the bundler group across 1 directory
2025-06-12 09:04:24 +01:00
Donal McBreen
fd6ac4f84b Merge pull request #1539 from miguno/issue-1538
Fix: correctly parse git remote origin urls for calling Octokit
2025-06-12 09:04:07 +01:00
Donal McBreen
c8f232b64f Merge pull request #1541 from polarctos/install-docker-cli-only
Install only docker-cli for 30% smaller kamal docker image
2025-06-12 09:03:18 +01:00
Donal McBreen
7f3dd59a73 Merge pull request #1576 from nickhammond/validate-labels
Validate destination, role, and service are not set as labels on roles and accessories
2025-06-11 08:13:47 +01:00
Nick Hammond
6672e3e77d Remove blank line 2025-06-09 19:47:26 -07:00
Nick Hammond
b164d50ff1 Check for label presence in the validation, don't validate labels on simple role setup 2025-06-09 19:36:27 -07:00
Nick Hammond
1d88281fee Validate that destination, role, and service are not set as labels on roles and accessories 2025-06-09 19:08:20 -07:00
Nick Coyne
a004232ffc Request secrets as json 2025-06-02 09:06:05 +12:00
Nick Hammond
487aa306c9 Merge branch 'basecamp:main' into buildpacks 2025-05-23 10:59:05 -07:00
capripot
cbf94fa7f5 feat: Add allowing retrieving all fields for an item
With 1Password, there is a way to retrieve all fields
of a given item directly without having to enumerate them.

Allowing this when passing no arguments for secrets fetch
command.
2025-05-22 22:21:51 -07:00
Donal McBreen
344e2d7995 Merge pull request #1564 from basecamp/add-kamal-host-var-mop-up
KAMAL_HOST env var mop-up
2025-05-20 14:32:57 +01:00
Donal McBreen
b387df0e4f KAMAL_HOST env var mop-up
- Ensure tests pass
- Switch from -e to --env everywhere
- Check KAMAL_HOST env var in integration tests
2025-05-20 14:10:50 +01:00
Donal McBreen
9c8a44eec4 Merge pull request #1471 from jakeprem/jakeprem/add-kamal-host-var
feat: Add KAMAL_HOST to app and accessory containers
2025-05-20 13:48:35 +01:00
Dainel Vera
99f763d742 Merge branch 'main' into feat/custom-ssl 2025-05-19 15:38:33 -06:00
Nick Hammond
4bd1f0536c Merge branch 'basecamp:main' into buildpacks 2025-05-16 15:21:49 -07:00
Donal McBreen
e217332cde Merge pull request #1561 from basecamp/drop-ruby-3.1
Drop Ruby 3.1 from the test matrix
2025-05-15 16:20:32 +01:00
Donal McBreen
30d630ce4d Drop Ruby 3.1 from the test matrix
It is EOL since 2025-03-26.
2025-05-15 15:21:13 +01:00
Andrew Kelley
1331e7b9c7 Added path_prefix and strip_path_prefix 2025-05-13 19:31:54 -04:00
Nick Hammond
c5e5f5d7cc Merge branch 'basecamp:main' into buildpacks 2025-05-13 09:34:13 -07:00
Keita Urashima
6a573c19a6 Fix Bitwarden Secrets Manager authentication checks 2025-05-13 20:33:46 +09:00
Nick Hammond
0ab0649d07 Merge branch 'basecamp:main' into buildpacks 2025-05-10 12:54:29 -07:00
Peter Rullmann
d62c35e63e Add UT for new interactive behaviour
also adding helpers to simulate STDIN being tty or file
2025-05-08 20:24:50 +02:00
dependabot[bot]
9a14fbb048 Bump rack-session in the bundler group across 1 directory
Bumps the bundler group with 1 update in the / directory: [rack-session](https://github.com/rack/rack-session).


Updates `rack-session` from 2.0.0 to 2.1.1
- [Release notes](https://github.com/rack/rack-session/releases)
- [Changelog](https://github.com/rack/rack-session/blob/v2.1.1/releases.md)
- [Commits](https://github.com/rack/rack-session/compare/v2.0.0...v2.1.1)

---
updated-dependencies:
- dependency-name: rack-session
  dependency-version: 2.1.1
  dependency-type: indirect
  dependency-group: bundler
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-08 14:47:52 +00:00
Peter Rullmann
092ca425d7 Allow piping into kamal exec #1485 2025-05-08 12:41:44 +02:00
Nick Hammond
68404e2673 Merge branch 'basecamp:main' into buildpacks 2025-05-08 02:34:40 -07:00
polarctos
681439f122 Install docker-cli only for leaner image
As only the docker client is executed in the image and not the docker daemon, only the docker-cli package is needed
2025-05-07 13:43:27 +02:00
Michael G. Noll
a1c6ac41d0 Fix: correctly parse git remote origin urls for calling Octokit 2025-05-06 09:24:09 +02:00
acidtib
9219b87630 remove chown for TLS certificates in proxy container 2025-04-29 19:57:41 -06:00
acidtib
1f847299c0 improve custom SSL certificate documentation 2025-04-28 13:33:03 -06:00
acidtib
a525d45b4d allow defining certificates directly within ssl hash instead of at the proxy root level 2025-04-28 00:34:24 -06:00
acidtib
045410368d add support for custom certificates 2025-04-26 01:03:15 -06:00
Nick Hammond
045da87219 Merge branch 'basecamp:main' into buildpacks 2025-04-21 09:33:40 -07:00
Nick Hammond
fc67cdea33 Merge branch 'basecamp:main' into buildpacks 2025-04-18 07:47:38 -07:00
Nick Hammond
38cfc4488b Merge branch 'basecamp:main' into buildpacks 2025-03-28 11:47:43 -07:00
Jake Prem
0e453a02de Add KAMAL_HOST to app and accessory containers
Adds the host the container is being deployed to as KAMAL_HOST.
My use case is to more easily tag the host for metrics tagging,
but there might be other uses as well.
2025-03-25 22:49:00 -04:00
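This commit injects the deploy target host into app and accessory containers as `KAMAL_HOST`. A minimal sketch of the metrics-tagging use case the author describes, assuming nothing beyond standard Ruby (the `service: "web"` base tag is purely illustrative):

```ruby
# Build a metrics tag set, adding the deploy host when KAMAL_HOST is set.
# `env` defaults to the real environment but accepts any Hash-like object.
def metric_tags(env = ENV)
  tags = { service: "web" } # illustrative base tag
  tags[:host] = env["KAMAL_HOST"] if env["KAMAL_HOST"]
  tags
end
```

Inside a container booted by this version of Kamal, `ENV["KAMAL_HOST"]` holds the host the container was deployed to.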
acidtib
aa12dc1d12 remove unnecessary blank lines 2025-02-21 17:52:17 -07:00
acidtib
8acd35c4b7 test: add fetch functionality for nested folders and secrets 2025-02-21 17:04:46 -07:00
acidtib
104914bf14 refactor: improve retrieval logic for nested folders 2025-02-21 17:04:04 -07:00
acidtib
913f07bbf2 add PassboltAdapter tests 2025-02-21 00:34:10 -07:00
acidtib
9b63ad5cb8 feat: add Passbolt adapter 2025-02-20 22:38:07 -07:00
Nick Hammond
8c17b1ebc6 Add export_action support for pack 2025-02-07 13:07:33 -07:00
Nick Hammond
f8f7c6ec57 Catch up with 2.5.1 2025-02-06 07:12:05 -07:00
Nick Hammond
da26457d52 Merge branch 'buildpacks' of github.com:nickhammond/kamal into buildpacks 2025-01-20 09:53:40 -07:00
Nick Hammond
95b606a427 Catch up with main 2025-01-20 09:53:16 -07:00
Nick Hammond
d249b9a431 Merge branch 'basecamp:main' into buildpacks 2025-01-05 15:31:24 -07:00
Nick Hammond
9f6660dfbf Catch up with main 2024-11-26 07:36:54 -07:00
Nick Hammond
9ac3d57b29 Add default creation time to now for image 2024-10-30 06:25:12 -07:00
Nick Hammond
8354fbee06 Merge branch 'buildpacks' of github.com:nickhammond/kamal into buildpacks 2024-10-28 08:26:55 -07:00
Nick Hammond
cde5c7abbf Catch up with main 2024-10-28 08:26:40 -07:00
Nick Hammond
1ebc8b8daa Merge branch 'basecamp:main' into buildpacks 2024-10-17 07:58:35 -07:00
Nick Hammond
145b73c4f0 Add a no-op remove method for pack 2024-10-17 07:54:17 -07:00
Nick Hammond
d538447973 Add validator for buildpack arch 2024-10-17 07:46:45 -07:00
Nick Hammond
4822a9d950 Merge branch 'basecamp:main' into buildpacks 2024-10-14 16:58:16 -07:00
Nick Hammond
1d55c5941b Add in pack builder inspect for configured builder 2024-10-14 16:57:51 -07:00
Nick Hammond
89b44153bb Ensure build args and secrets are used with pack 2024-10-02 09:55:57 -07:00
Nick Hammond
5482052e19 Merge branch 'basecamp:main' into buildpacks 2024-10-02 08:59:41 -07:00
Nick Hammond
dda8efe39a Point to project.toml in docs 2024-10-01 14:08:26 -07:00
Nick Hammond
c60124188f Merge branch 'basecamp:main' into buildpacks 2024-10-01 13:59:22 -07:00
Nick Hammond
f7147e07d4 Merge branch 'basecamp:main' into buildpacks 2024-09-27 18:46:49 -04:00
Nick Hammond
71741742ff Merge branch 'basecamp:main' into buildpacks 2024-09-27 00:19:45 -04:00
Nick Hammond
e252004eef Use argumentize for secrets with pack 2024-09-23 20:16:06 -07:00
Nick Hammond
85a5a09aac Merge branch 'basecamp:main' into buildpacks 2024-09-22 08:47:08 -07:00
Nick Hammond
548452aa12 Merge branch 'basecamp:main' into buildpacks 2024-09-16 18:11:33 -07:00
Nick Hammond
2c5f2a7ce0 Don't need to inspect the builder if pack 2024-09-05 22:25:50 -07:00
Nick Hammond
ae68193f99 pack arch no longer needed, update builder name in tests 2024-09-05 22:17:28 -07:00
Nick Hammond
24f4308372 Catch up with main 2024-09-05 21:55:11 -07:00
Nick Hammond
d0ffb850da Utilize repository name for pack name 2024-09-04 09:42:40 -07:00
Nick Hammond
826308aabd Clean things up via Rubocop 2024-08-27 22:52:06 -07:00
Nick Hammond
897b3b4e46 Add a pack option to the builder options 2024-08-27 22:25:56 -07:00
57 changed files with 1465 additions and 137 deletions

View File

@@ -26,16 +26,12 @@ jobs:
       fail-fast: false
       matrix:
         ruby-version:
-          - "3.1"
           - "3.2"
           - "3.3"
           - "3.4"
         gemfile:
           - Gemfile
           - gemfiles/rails_edge.gemfile
-        exclude:
-          - ruby-version: "3.1"
-            gemfile: gemfiles/rails_edge.gemfile
     name: ${{ format('Tests (Ruby {0})', matrix.ruby-version) }}
     runs-on: ubuntu-latest
     env:

View File

@@ -13,8 +13,7 @@ COPY Gemfile Gemfile.lock kamal.gemspec ./
 COPY lib/kamal/version.rb /kamal/lib/kamal/version.rb

 # Install system dependencies
-RUN apk add --no-cache build-base git docker openrc openssh-client-default yaml-dev \
-    && rc-update add docker boot \
+RUN apk add --no-cache build-base git docker-cli openssh-client-default yaml-dev \
     && gem install bundler --version=2.6.5 \
     && bundle install

View File

@@ -1,7 +1,7 @@
 PATH
   remote: .
   specs:
-    kamal (2.6.1)
+    kamal (2.7.0)
       activesupport (>= 7.0)
       base64 (~> 0.2)
       bcrypt_pbkdf (~> 1.0)
@@ -101,8 +101,9 @@ GEM
       date
       stringio
     racc (1.8.1)
-    rack (3.1.12)
-    rack-session (2.0.0)
+    rack (3.1.16)
+    rack-session (2.1.1)
+      base64 (>= 0.1.0)
       rack (>= 3.0.0)
     rack-test (2.1.0)
       rack (>= 1.3)

View File

@@ -24,11 +24,11 @@ class Kamal::Cli::Accessory < Kamal::Cli::Base
           directories(name)
           upload(name)

-          on(hosts) do
+          on(hosts) do |host|
             execute *KAMAL.auditor.record("Booted #{name} accessory"), verbosity: :debug
             execute *accessory.ensure_env_directory
             upload! accessory.secrets_io, accessory.secrets_path, mode: "0600"
-            execute *accessory.run
+            execute *accessory.run(host: host)

             if accessory.running_proxy?
               target = capture_with_info(*accessory.container_id_for(container_name: accessory.service_name, only_running: true)).strip

View File

@@ -12,6 +12,7 @@ class Kamal::Cli::App < Kamal::Cli::Base
         KAMAL.roles_on(host).each do |role|
           Kamal::Cli::App::Assets.new(host, role, self).run
+          Kamal::Cli::App::SslCertificates.new(host, role, self).run
         end
       end

View File

@@ -0,0 +1,28 @@
+class Kamal::Cli::App::SslCertificates
+  attr_reader :host, :role, :sshkit
+  delegate :execute, :info, :upload!, to: :sshkit
+
+  def initialize(host, role, sshkit)
+    @host = host
+    @role = role
+    @sshkit = sshkit
+  end
+
+  def run
+    if role.running_proxy? && role.proxy.custom_ssl_certificate?
+      info "Writing SSL certificates for #{role.name} on #{host}"
+      execute *app.create_ssl_directory
+      if cert_content = role.proxy.certificate_pem_content
+        upload!(StringIO.new(cert_content), role.proxy.host_tls_cert, mode: "0644")
+      end
+      if key_content = role.proxy.private_key_pem_content
+        upload!(StringIO.new(key_content), role.proxy.host_tls_key, mode: "0644")
+      end
+    end
+  end
+
+  private
+    def app
+      @app ||= KAMAL.app(role: role, host: host)
+    end
+end

View File

@@ -43,7 +43,7 @@ class GithubStatusChecks
   attr_reader :remote_url, :git_sha, :github_client, :combined_status

   def initialize
-    @remote_url = `git config --get remote.origin.url`.strip.delete_prefix("https://github.com/")
+    @remote_url = github_repo_from_remote_url
     @git_sha = `git rev-parse HEAD`.strip
     @github_client = Octokit::Client.new(access_token: ENV["GITHUB_TOKEN"])
     refresh!
@@ -77,6 +77,18 @@ class GithubStatusChecks
       "Build not started..."
     end
   end
+
+  private
+    def github_repo_from_remote_url
+      url = `git config --get remote.origin.url`.strip.delete_suffix(".git")
+      if url.start_with?("https://github.com/")
+        url.delete_prefix("https://github.com/")
+      elsif url.start_with?("git@github.com:")
+        url.delete_prefix("git@github.com:")
+      else
+        url
+      end
+    end
 end
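The URL normalization in this fix is easy to exercise in isolation. This sketch mirrors the new method, with the `git config --get remote.origin.url` shell-out replaced by a plain argument so it can be tested without a git checkout:

```ruby
# Mirror of github_repo_from_remote_url from the diff above, parameterized
# so both HTTPS and SSH remote forms can be checked directly.
def github_repo_from_remote_url(url)
  url = url.strip.delete_suffix(".git")

  if url.start_with?("https://github.com/")
    url.delete_prefix("https://github.com/")
  elsif url.start_with?("git@github.com:")
    url.delete_prefix("git@github.com:")
  else
    url
  end
end

github_repo_from_remote_url("https://github.com/basecamp/kamal.git") # => "basecamp/kamal"
github_repo_from_remote_url("git@github.com:basecamp/kamal.git")     # => "basecamp/kamal"
```

Both remote forms now yield the `owner/repo` slug that Octokit expects, which is the bug #1538 fixed.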

View File

@@ -12,7 +12,7 @@ class Kamal::Commands::Accessory < Kamal::Commands::Base
     @accessory_config = config.accessory(name)
   end

-  def run
+  def run(host: nil)
     docker :run,
       "--name", service_name,
       "--detach",
@@ -20,6 +20,7 @@ class Kamal::Commands::Accessory < Kamal::Commands::Base
       *network_args,
       *config.logging_args,
       *publish_args,
+      *([ "--env", "KAMAL_HOST=\"#{host}\"" ] if host),
       *env_args,
       *volume_args,
       *label_args,
@@ -55,14 +56,14 @@ class Kamal::Commands::Accessory < Kamal::Commands::Base
   def execute_in_existing_container(*command, interactive: false)
     docker :exec,
-      ("-it" if interactive),
+      (docker_interactive_args if interactive),
       service_name,
       *command
   end

   def execute_in_new_container(*command, interactive: false)
     docker :run,
-      ("-it" if interactive),
+      (docker_interactive_args if interactive),
       "--rm",
       *network_args,
       *env_args,

View File

@@ -20,8 +20,9 @@ class Kamal::Commands::App < Kamal::Commands::Base
       "--name", container_name,
       "--network", "kamal",
       *([ "--hostname", hostname ] if hostname),
-      "-e", "KAMAL_CONTAINER_NAME=\"#{container_name}\"",
-      "-e", "KAMAL_VERSION=\"#{config.version}\"",
+      "--env", "KAMAL_CONTAINER_NAME=\"#{container_name}\"",
+      "--env", "KAMAL_VERSION=\"#{config.version}\"",
+      "--env", "KAMAL_HOST=\"#{host}\"",
       *role.env_args(host),
       *role.logging_args,
       *config.volume_args,

View File

@@ -1,7 +1,7 @@
 module Kamal::Commands::App::Execution
   def execute_in_existing_container(*command, interactive: false, env:)
     docker :exec,
-      ("-it" if interactive),
+      (docker_interactive_args if interactive),
       *argumentize("--env", env),
       container_name,
       *command
@@ -9,7 +9,7 @@ module Kamal::Commands::App::Execution
   def execute_in_new_container(*command, interactive: false, detach: false, env:)
     docker :run,
-      ("-it" if interactive),
+      (docker_interactive_args if interactive),
       ("--detach" if detach),
       ("--rm" unless detach),
       "--network", "kamal",

View File

@@ -21,6 +21,10 @@ module Kamal::Commands::App::Proxy
     remove_directory config.proxy_boot.app_directory
   end

+  def create_ssl_directory
+    make_directory(File.join(config.proxy_boot.tls_directory, role.name))
+  end
+
   private
     def proxy_exec(*command)
       docker :exec, proxy_container_name, "kamal-proxy", *command

View File

@@ -84,6 +84,10 @@ module Kamal::Commands
       args.compact.unshift :docker
     end

+    def pack(*args)
+      args.compact.unshift :pack
+    end
+
     def git(*args, path: nil)
       [ :git, *([ "-C", path ] if path), *args.compact ]
     end
@@ -122,5 +126,9 @@ module Kamal::Commands
     def ensure_local_buildx_installed
       docker :buildx, "version"
     end
+
+    def docker_interactive_args
+      STDIN.isatty ? "-it" : "-i"
+    end
   end
 end
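The new `docker_interactive_args` helper is what makes piping into `kamal exec` (#1485) work: `-it` requires a real terminal, while a bare `-i` keeps STDIN open for piped input. A parameterized sketch of the same decision:

```ruby
# Parameterized version of the helper above: "-it" only when STDIN is a
# terminal; a plain "-i" still forwards piped input into the container.
def docker_interactive_args(tty: $stdin.isatty)
  tty ? "-it" : "-i"
end
```

So `echo "SELECT 1;" | kamal app exec ...` now produces `docker exec -i ...` instead of failing with Docker's "the input device is not a TTY" error.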

View File

@@ -2,7 +2,7 @@ require "active_support/core_ext/string/filters"

 class Kamal::Commands::Builder < Kamal::Commands::Base
   delegate :create, :remove, :dev, :push, :clean, :pull, :info, :inspect_builder, :validate_image, :first_mirror, to: :target
-  delegate :local?, :remote?, :cloud?, to: "config.builder"
+  delegate :local?, :remote?, :pack?, :cloud?, to: "config.builder"

   include Clone

@@ -17,6 +17,8 @@ class Kamal::Commands::Builder < Kamal::Commands::Base
       else
         remote
       end
+    elsif pack?
+      pack
     elsif cloud?
       cloud
     else
@@ -36,6 +38,10 @@ class Kamal::Commands::Builder < Kamal::Commands::Base
     @hybrid ||= Kamal::Commands::Builder::Hybrid.new(config)
   end

+  def pack
+    @pack ||= Kamal::Commands::Builder::Pack.new(config)
+  end
+
   def cloud
     @cloud ||= Kamal::Commands::Builder::Cloud.new(config)
   end

View File

@@ -6,6 +6,7 @@ class Kamal::Commands::Builder::Base < Kamal::Commands::Base
   delegate :argumentize, to: Kamal::Utils
   delegate \
     :args, :secrets, :dockerfile, :target, :arches, :local_arches, :remote_arches, :remote,
+    :pack?, :pack_builder, :pack_buildpacks,
     :cache_from, :cache_to, :ssh, :provenance, :sbom, :driver, :docker_driver?,
     to: :builder_config

View File

@@ -0,0 +1,46 @@
+class Kamal::Commands::Builder::Pack < Kamal::Commands::Builder::Base
+  def push(export_action = "registry")
+    combine \
+      build,
+      export(export_action)
+  end
+
+  def remove; end
+
+  def info
+    pack :builder, :inspect, pack_builder
+  end
+  alias_method :inspect_builder, :info
+
+  private
+    def build
+      pack(:build,
+        config.repository,
+        "--platform", platform,
+        "--creation-time", "now",
+        "--builder", pack_builder,
+        buildpacks,
+        "-t", config.absolute_image,
+        "-t", config.latest_image,
+        "--env", "BP_IMAGE_LABELS=service=#{config.service}",
+        *argumentize("--env", args),
+        *argumentize("--env", secrets, sensitive: true),
+        "--path", build_context)
+    end
+
+    def export(export_action)
+      return unless export_action == "registry"
+
+      combine \
+        docker(:push, config.absolute_image),
+        docker(:push, config.latest_image)
+    end
+
+    def platform
+      "linux/#{local_arches.first}"
+    end
+
+    def buildpacks
+      (pack_buildpacks << "paketo-buildpacks/image-labels").map { |buildpack| [ "--buildpack", buildpack ] }
+    end
+end
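The `buildpacks` method in this new builder turns the configured list into repeated `--buildpack` flags and always appends `paketo-buildpacks/image-labels` (so the service label set via `BP_IMAGE_LABELS` is applied). A small standalone illustration, using a non-mutating `+` in place of the `<<` above and the `heroku/*` names from the docs as example values:

```ruby
# How configured buildpacks expand into pack CLI flags; the image-labels
# buildpack is always appended, matching the diff above.
def buildpack_flags(pack_buildpacks)
  (pack_buildpacks + [ "paketo-buildpacks/image-labels" ])
    .flat_map { |buildpack| [ "--buildpack", buildpack ] }
end

buildpack_flags([ "heroku/ruby", "heroku/procfile" ])
# => ["--buildpack", "heroku/ruby", "--buildpack", "heroku/procfile",
#     "--buildpack", "paketo-buildpacks/image-labels"]
```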

View File

@@ -63,7 +63,7 @@ class Kamal::Configuration
       @env = Env.new(config: @raw_config.env || {}, secrets: secrets)

       @logging = Logging.new(logging_config: @raw_config.logging)
-      @proxy = Proxy.new(config: self, proxy_config: @raw_config.proxy)
+      @proxy = Proxy.new(config: self, proxy_config: @raw_config.proxy, secrets: secrets)
       @proxy_boot = Proxy::Boot.new(config: self)
       @ssh = Ssh.new(config: self)
       @sshkit = Sshkit.new(config: self)

View File

@@ -125,7 +125,8 @@ class Kamal::Configuration::Accessory
         Kamal::Configuration::Proxy.new \
           config: config,
           proxy_config: accessory_config["proxy"],
-          context: "accessories/#{name}/proxy"
+          context: "accessories/#{name}/proxy",
+          secrets: config.secrets
       end

     def initialize_registry

View File

@@ -61,6 +61,10 @@ class Kamal::Configuration::Builder
     !!builder_config["cache"]
   end

+  def pack?
+    !!builder_config["pack"]
+  end
+
   def args
     builder_config["args"] || {}
   end
@@ -85,6 +89,14 @@ class Kamal::Configuration::Builder
     builder_config.fetch("driver", "docker-container")
   end

+  def pack_builder
+    builder_config["pack"]["builder"] if pack?
+  end
+
+  def pack_buildpacks
+    builder_config["pack"]["buildpacks"] if pack?
+  end
+
   def local_disabled?
     builder_config["local"] == false
   end

View File

@@ -31,6 +31,19 @@ builder:
   # Defaults to true:
   local: true

+  # Buildpack configuration
+  #
+  # The build configuration for using pack to build a Cloud Native Buildpack image.
+  #
+  # For additional buildpack customization options, you can create a project descriptor
+  # file (project.toml) that the Pack CLI will automatically use.
+  # See https://buildpacks.io/docs/for-app-developers/how-to/build-inputs/use-project-toml/ for more information.
+  pack:
+    builder: heroku/builder:24
+    buildpacks:
+      - heroku/ruby
+      - heroku/procfile
   # Builder cache
   #
   # The type must be either 'gha' or 'registry'.

View File

@@ -45,7 +45,27 @@ proxy:
   # unless you explicitly set `forward_headers: true`
   #
   # Defaults to `false`:
-  ssl: true
+  ssl: ...
+
+  # Custom SSL certificate
+  #
+  # In some cases, using Let's Encrypt for automatic certificate management is not an
+  # option, for example if you are running from more than one host. Or you may already
+  # have SSL certificates issued by a different Certificate Authority (CA).
+  # Kamal supports loading custom SSL certificates
+  # directly from secrets.
+  #
+  # Examples:
+  #   ssl: true   # Enable SSL with Let's Encrypt
+  #   ssl: false  # Disable SSL
+  #   ssl:        # Enable custom SSL
+  #     certificate_pem: CERTIFICATE_PEM
+  #     private_key_pem: PRIVATE_KEY_PEM
+  #
+  # ### Notes
+  # - If the certificate or key is missing or invalid, kamal-proxy will fail to start.
+  # - Always handle SSL certificates and private keys securely. Avoid hard-coding them in deploy.yml files or source control.
+  # - For automated certificate management, consider using the built-in Let's Encrypt integration instead.

   # SSL redirect
   #
@@ -69,6 +89,17 @@ proxy:
   # How long to wait for requests to complete before timing out, defaults to 30 seconds:
   response_timeout: 10

+  # Path-based routing
+  #
+  # For applications that split their traffic to different services based on the request path,
+  # you can use path-based routing to mount services under different path prefixes.
+  path_prefix: '/api'
+  # By default, the path prefix will be stripped from the request before it is forwarded upstream.
+  # So in the example above, a request to /api/users/123 will be forwarded to web-1 as /users/123.
+  # To instead forward the request with the original path (including the prefix),
+  # specify `strip_path_prefix: false`:
+  strip_path_prefix: false
   # Healthcheck
   #
   # When deploying, the proxy will by default hit `/up` once every second until we hit
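The default stripping behaviour documented in that hunk can be illustrated with a short sketch (this is not Kamal's code; kamal-proxy implements it, but the transformation is simple):

```ruby
# Illustration of the documented default: with path_prefix "/api", the
# prefix is removed before the request is forwarded upstream, unless
# strip_path_prefix is set to false.
def upstream_path(path, prefix, strip_path_prefix: true)
  strip_path_prefix ? path.delete_prefix(prefix) : path
end

upstream_path("/api/users/123", "/api")                           # => "/users/123"
upstream_path("/api/users/123", "/api", strip_path_prefix: false) # => "/api/users/123"
```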

View File

@@ -6,12 +6,14 @@ class Kamal::Configuration::Proxy
   delegate :argumentize, :optionize, to: Kamal::Utils

-  attr_reader :config, :proxy_config
+  attr_reader :config, :proxy_config, :role_name, :secrets

-  def initialize(config:, proxy_config:, context: "proxy")
+  def initialize(config:, proxy_config:, role_name: nil, secrets:, context: "proxy")
     @config = config
     @proxy_config = proxy_config
     @proxy_config = {} if @proxy_config.nil?
+    @role_name = role_name
+    @secrets = secrets

     validate! @proxy_config, with: Kamal::Configuration::Validator::Proxy, context: context
   end
@@ -27,10 +29,46 @@ class Kamal::Configuration::Proxy
     proxy_config["hosts"] || proxy_config["host"]&.split(",") || []
   end

+  def custom_ssl_certificate?
+    ssl = proxy_config["ssl"]
+    return false unless ssl.is_a?(Hash)
+
+    ssl["certificate_pem"].present? && ssl["private_key_pem"].present?
+  end
+
+  def certificate_pem_content
+    ssl = proxy_config["ssl"]
+    return nil unless ssl.is_a?(Hash)
+
+    secrets[ssl["certificate_pem"]]
+  end
+
+  def private_key_pem_content
+    ssl = proxy_config["ssl"]
+    return nil unless ssl.is_a?(Hash)
+
+    secrets[ssl["private_key_pem"]]
+  end
+
+  def host_tls_cert
+    tls_path(config.proxy_boot.tls_directory, "cert.pem")
+  end
+
+  def host_tls_key
+    tls_path(config.proxy_boot.tls_directory, "key.pem")
+  end
+
+  def container_tls_cert
+    tls_path(config.proxy_boot.tls_container_directory, "cert.pem")
+  end
+
+  def container_tls_key
+    tls_path(config.proxy_boot.tls_container_directory, "key.pem") if custom_ssl_certificate?
+  end
+
   def deploy_options
     {
       host: hosts,
-      tls: proxy_config["ssl"].presence,
+      tls: ssl? ? true : nil,
+      "tls-certificate-path": container_tls_cert,
+      "tls-private-key-path": container_tls_key,
       "deploy-timeout": seconds_duration(config.deploy_timeout),
       "drain-timeout": seconds_duration(config.drain_timeout),
       "health-check-interval": seconds_duration(proxy_config.dig("healthcheck", "interval")),
@@ -42,6 +80,8 @@ class Kamal::Configuration::Proxy
       "buffer-memory": proxy_config.dig("buffering", "memory"),
       "max-request-body": proxy_config.dig("buffering", "max_request_body"),
       "max-response-body": proxy_config.dig("buffering", "max_response_body"),
+      "path-prefix": proxy_config.dig("path_prefix"),
+      "strip-path-prefix": proxy_config.dig("strip_path_prefix"),
       "forward-headers": proxy_config.dig("forward_headers"),
       "tls-redirect": proxy_config.dig("ssl_redirect"),
       "log-request-header": proxy_config.dig("logging", "request_headers") || DEFAULT_LOG_REQUEST_HEADERS,
@@ -66,10 +106,14 @@ class Kamal::Configuration::Proxy
   end

   def merge(other)
-    self.class.new config: config, proxy_config: proxy_config.deep_merge(other.proxy_config)
+    self.class.new config: config, proxy_config: other.proxy_config.deep_merge(proxy_config), role_name: role_name, secrets: secrets
   end

   private
+    def tls_path(directory, filename)
+      File.join([ directory, role_name, filename ].compact) if custom_ssl_certificate?
+    end
+
     def seconds_duration(value)
       value ? "#{value}s" : nil
     end
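The reversed `deep_merge` in this hunk (together with the role.rb change in this comparison) makes role-level proxy settings override the global `proxy:` section instead of the other way around. A minimal stand-in for ActiveSupport's `Hash#deep_merge` (which Kamal uses) demonstrating the precedence:

```ruby
# Minimal deep_merge: values from `override` win on conflicts; nested
# hashes are merged recursively. ActiveSupport's Hash#deep_merge behaves
# the same way for this example.
def deep_merge(base, override)
  base.merge(override) do |_key, old, new|
    old.is_a?(Hash) && new.is_a?(Hash) ? deep_merge(old, new) : new
  end
end

global     = { "ssl" => true, "response_timeout" => 30 }
role_level = { "ssl" => { "certificate_pem" => "CERTIFICATE_PEM" } }

# other.proxy_config.deep_merge(proxy_config): the role's settings win.
deep_merge(global, role_level)
# => { "ssl" => { "certificate_pem" => "CERTIFICATE_PEM" }, "response_timeout" => 30 }
```

So a role that supplies a custom certificate hash is no longer clobbered by a global `ssl: true`.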

View File

@@ -100,6 +100,14 @@ class Kamal::Configuration::Proxy::Boot
     File.join app_container_directory, "error_pages"
   end

+  def tls_directory
+    File.join app_directory, "tls"
+  end
+
+  def tls_container_directory
+    File.join app_container_directory, "tls"
+  end
+
   private
     def ensure_valid_bind_ips(bind_ips)
       bind_ips.present? && bind_ips.each do |ip|

View File

@@ -68,7 +68,7 @@ class Kamal::Configuration::Role
   end

   def proxy
-    @proxy ||= config.proxy.merge(specialized_proxy) if running_proxy?
+    @proxy ||= specialized_proxy.merge(config.proxy) if running_proxy?
   end

   def running_proxy?
@@ -150,8 +150,8 @@ class Kamal::Configuration::Role
   end

   def ensure_one_host_for_ssl
-    if running_proxy? && proxy.ssl? && hosts.size > 1
-      raise Kamal::ConfigurationError, "SSL is only supported on a single server, found #{hosts.size} servers for role #{name}"
+    if running_proxy? && proxy.ssl? && hosts.size > 1 && !proxy.custom_ssl_certificate?
+      raise Kamal::ConfigurationError, "SSL is only supported on a single server unless you provide custom certificates, found #{hosts.size} servers for role #{name}"
     end
   end
@@ -173,6 +173,8 @@ class Kamal::Configuration::Role
       @specialized_proxy = Kamal::Configuration::Proxy.new \
         config: config,
         proxy_config: proxy_config,
+        secrets: config.secrets,
+        role_name: name,
         context: "servers/#{name}/proxy"
     end
 end
end end


@@ -24,7 +24,9 @@ class Kamal::Configuration::Validator
         example_value = example[key]

         if example_value == "..."
-          unless key.to_s == "proxy" && boolean?(value.class)
+          if key.to_s == "ssl"
+            validate_type! value, TrueClass, FalseClass, Hash
+          elsif key.to_s != "proxy" || !boolean?(value.class)
             validate_type! value, *(Array if key == :servers), Hash
           end
         elsif key == "hosts"
@@ -169,6 +171,18 @@ class Kamal::Configuration::Validator
       unknown_keys_error unknown_keys if unknown_keys.present?
     end

+    def validate_labels!(labels)
+      return true if labels.blank?
+
+      with_context("labels") do
+        labels.each do |key, _|
+          with_context(key) do
+            error "invalid label. destination, role, and service are reserved labels" if %w[destination role service].include?(key)
+          end
+        end
+      end
+    end
+
     def validate_docker_options!(options)
       if options
         error "Cannot set restart policy in docker options, unless-stopped is required" if options["restart"]


@@ -6,6 +6,8 @@ class Kamal::Configuration::Validator::Accessory < Kamal::Configuration::Validat
       error "specify one of `host`, `hosts`, `role`, `roles`, `tag` or `tags`"
     end

+    validate_labels!(config["labels"])
+
     validate_docker_options!(config["options"])
   end
 end


@@ -8,6 +8,8 @@ class Kamal::Configuration::Validator::Builder < Kamal::Configuration::Validator
     error "Builder arch not set" unless config["arch"].present?

+    error "buildpacks only support building for one arch" if config["pack"] && config["arch"].is_a?(Array) && config["arch"].size > 1
+
     error "Cannot disable local builds, no remote is set" if config["local"] == false && config["remote"].blank?
   end
 end


@@ -10,6 +10,16 @@ class Kamal::Configuration::Validator::Proxy < Kamal::Configuration::Validator
     if (config.keys & [ "host", "hosts" ]).size > 1
       error "Specify one of 'host' or 'hosts', not both"
     end
+
+    if config["ssl"].is_a?(Hash)
+      if config["ssl"]["certificate_pem"].present? && config["ssl"]["private_key_pem"].blank?
+        error "Missing private_key_pem setting (required when certificate_pem is present)"
+      end
+
+      if config["ssl"]["private_key_pem"].present? && config["ssl"]["certificate_pem"].blank?
+        error "Missing certificate_pem setting (required when private_key_pem is present)"
+      end
+    end
   end
 end
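The proxy validator enforces that inline certificates come as a pair: `certificate_pem` without `private_key_pem` (or vice versa) is an error, while `ssl: true` still means proxy-managed certificates. A standalone sketch of that both-or-neither rule (`ssl_pair_errors` is a hypothetical helper returning messages instead of calling the validator's `error`):

```ruby
# Both-or-neither check for inline cert material; non-Hash ssl values
# (true/false) are left alone.
def ssl_pair_errors(ssl)
  return [] unless ssl.is_a?(Hash)

  errors = []
  errors << "Missing private_key_pem setting (required when certificate_pem is present)" if ssl["certificate_pem"] && !ssl["private_key_pem"]
  errors << "Missing certificate_pem setting (required when private_key_pem is present)" if ssl["private_key_pem"] && !ssl["certificate_pem"]
  errors
end

puts ssl_pair_errors("certificate_pem" => "PEM").length # 1
puts ssl_pair_errors(true).length # 0
```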


@@ -6,6 +6,7 @@ class Kamal::Configuration::Validator::Role < Kamal::Configuration::Validator
       validate_servers!(config)
     else
       super
+      validate_labels!(config["labels"])
       validate_docker_options!(config["options"])
     end
   end


@@ -6,8 +6,8 @@ class Kamal::Secrets::Adapters::BitwardenSecretsManager < Kamal::Secrets::Adapte
   private
     LIST_ALL_SELECTOR = "all"
     LIST_ALL_FROM_PROJECT_SUFFIX = "/all"
-    LIST_COMMAND = "secret list -o env"
-    GET_COMMAND = "secret get -o env"
+    LIST_COMMAND = "secret list"
+    GET_COMMAND = "secret get"

     def fetch_secrets(secrets, from:, account:, session:)
       raise RuntimeError, "You must specify what to retrieve from Bitwarden Secrets Manager" if secrets.length == 0
@@ -18,17 +18,17 @@ class Kamal::Secrets::Adapters::BitwardenSecretsManager < Kamal::Secrets::Adapte
       {}.tap do |results|
         if command.nil?
           secrets.each do |secret_uuid|
-            secret = run_command("#{GET_COMMAND} #{secret_uuid.shellescape}")
+            item_json = run_command("#{GET_COMMAND} #{secret_uuid.shellescape}")
             raise RuntimeError, "Could not read #{secret_uuid} from Bitwarden Secrets Manager" unless $?.success?
-            key, value = parse_secret(secret)
-            results[key] = value
+            item_json = JSON.parse(item_json)
+            results[item_json["key"]] = item_json["value"]
           end
         else
-          secrets = run_command(command)
+          items_json = run_command(command)
           raise RuntimeError, "Could not read secrets from Bitwarden Secrets Manager" unless $?.success?
-          secrets.split("\n").each do |secret|
-            key, value = parse_secret(secret)
-            results[key] = value
+          JSON.parse(items_json).each do |item_json|
+            results[item_json["key"]] = item_json["value"]
           end
         end
       end
@@ -45,19 +45,13 @@ class Kamal::Secrets::Adapters::BitwardenSecretsManager < Kamal::Secrets::Adapte
       end
     end

-    def parse_secret(secret)
-      key, value = secret.split("=", 2)
-      value = value.gsub(/^"|"$/, "")
-      [ key, value ]
-    end
-
     def run_command(command, session: nil)
       full_command = [ "bws", command ].join(" ")
       `#{full_command}`
     end

     def login(account)
-      run_command("run 'echo OK'")
+      run_command("project list")
       raise RuntimeError, "Could not authenticate to Bitwarden Secrets Manager. Did you set a valid access token?" unless $?.success?
     end
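With `-o env` dropped, the adapter now parses `bws` JSON output instead of `KEY="value"` lines. A sketch of both code paths against sample payloads (shaped like the `"key"`/`"value"` objects the diff reads; the ids and values themselves are made up):

```ruby
require "json"

# Illustrative stand-ins for `bws secret get <uuid>` and `bws secret list`.
get_output  = '{"id":"8f9b0000-0000-0000-0000-000000000000","key":"DB_PASSWORD","value":"s3cret"}'
list_output = '[{"key":"DB_PASSWORD","value":"s3cret"},{"key":"API_KEY","value":"abc123"}]'

results = {}

# `secret get` path: one JSON object per secret.
item = JSON.parse(get_output)
results[item["key"]] = item["value"]

# `secret list` path: one JSON array for everything.
JSON.parse(list_output).each do |item_json|
  results[item_json["key"]] = item_json["value"]
end

puts results.inspect # {"DB_PASSWORD"=>"s3cret", "API_KEY"=>"abc123"}
```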


@@ -16,20 +16,36 @@ class Kamal::Secrets::Adapters::OnePassword < Kamal::Secrets::Adapters::Base
     end

     def fetch_secrets(secrets, from:, account:, session:)
+      if secrets.blank?
+        fetch_all_secrets(from: from, account: account, session: session)
+      else
+        fetch_specified_secrets(secrets, from: from, account: account, session: session)
+      end
+    end
+
+    def fetch_specified_secrets(secrets, from:, account:, session:)
       {}.tap do |results|
         vaults_items_fields(prefixed_secrets(secrets, from: from)).map do |vault, items|
           items.each do |item, fields|
-            fields_json = JSON.parse(op_item_get(vault, item, fields, account: account, session: session))
+            fields_json = JSON.parse(op_item_get(vault, item, fields: fields, account: account, session: session))
             fields_json = [ fields_json ] if fields.one?
-            fields_json.each do |field_json|
-              # The reference is in the form `op://vault/item/field[/field]`
-              field = field_json["reference"].delete_prefix("op://").delete_suffix("/password")
-              results[field] = field_json["value"]
-            end
+            results.merge!(fields_map(fields_json))
           end
         end
       end
     end

+    def fetch_all_secrets(from:, account:, session:)
+      {}.tap do |results|
+        vault_items(from).each do |vault, items|
+          items.each do |item|
+            fields_json = JSON.parse(op_item_get(vault, item, account: account, session: session)).fetch("fields")
+            results.merge!(fields_map(fields_json))
+          end
+        end
+      end
+    end
+
     def to_options(**options)
@@ -50,12 +66,30 @@ class Kamal::Secrets::Adapters::OnePassword < Kamal::Secrets::Adapters::Base
       end
     end

-    def op_item_get(vault, item, fields, account:, session:)
-      labels = fields.map { |field| "label=#{field}" }.join(",")
-      options = to_options(vault: vault, fields: labels, format: "json", account: account, session: session.presence)
-
-      `op item get #{item.shellescape} #{options}`.tap do
-        raise RuntimeError, "Could not read #{fields.join(", ")} from #{item} in the #{vault} 1Password vault" unless $?.success?
+    def vault_items(from)
+      from = from.delete_prefix("op://")
+      vault, item = from.split("/")
+      { vault => [ item ] }
+    end
+
+    def fields_map(fields_json)
+      fields_json.to_h do |field_json|
+        # The reference is in the form `op://vault/item/field[/field]`
+        field = field_json["reference"].delete_prefix("op://").delete_suffix("/password")
+        [ field, field_json["value"] ]
+      end
+    end
+
+    def op_item_get(vault, item, fields: nil, account:, session:)
+      options = { vault: vault, format: "json", account: account, session: session.presence }
+
+      if fields.present?
+        labels = fields.map { |field| "label=#{field}" }.join(",")
+        options.merge!(fields: labels)
+      end
+
+      `op item get #{item.shellescape} #{to_options(**options)}`.tap do
+        raise RuntimeError, "Could not read #{"#{fields.join(", ")} " if fields.present?}from #{item} in the #{vault} 1Password vault" unless $?.success?
       end
     end
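The extracted `fields_map` turns `op item get --format json` field entries into a hash keyed by the `op://` reference, minus the prefix and a trailing `/password`. It can be run standalone against illustrative entries shaped like the objects the adapter reads:

```ruby
# Same logic as the adapter's fields_map, lifted out of the class.
def fields_map(fields_json)
  fields_json.to_h do |field_json|
    # The reference is in the form `op://vault/item/field[/field]`
    field = field_json["reference"].delete_prefix("op://").delete_suffix("/password")
    [ field, field_json["value"] ]
  end
end

fields = [
  { "reference" => "op://Vault/App/SECRET_KEY", "value" => "abc" },
  { "reference" => "op://Vault/App/db/password", "value" => "s3cret" }
]

puts fields_map(fields).inspect
# {"Vault/App/SECRET_KEY"=>"abc", "Vault/App/db"=>"s3cret"}
```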


@@ -0,0 +1,130 @@
class Kamal::Secrets::Adapters::Passbolt < Kamal::Secrets::Adapters::Base
  def requires_account?
    false
  end

  private
    def login(*)
      `passbolt verify`
      raise RuntimeError, "Failed to login to Passbolt" unless $?.success?
    end

    def fetch_secrets(secrets, from:, **)
      secrets = prefixed_secrets(secrets, from: from)
      raise ArgumentError, "No secrets given to fetch" if secrets.empty?

      secret_names = secrets.collect { |s| s.split("/").last }
      folders = secrets_get_folders(secrets)

      # build filter conditions for each secret with its corresponding folder
      filter_conditions = []
      secrets.each do |secret|
        parts = secret.split("/")
        secret_name = parts.last

        if parts.size > 1
          # get the folder path without the secret name
          folder_path = parts[0..-2]

          # find the most nested folder for this path
          current_folder = nil
          current_path = []

          folder_path.each do |folder_name|
            current_path << folder_name
            matching_folders = folders.select { |f| get_folder_path(f, folders) == current_path.join("/") }
            current_folder = matching_folders.first if matching_folders.any?
          end

          if current_folder
            filter_conditions << "(Name == #{secret_name.shellescape.inspect} && FolderParentID == #{current_folder["id"].shellescape.inspect})"
          end
        else
          # for root level secrets (no folders)
          filter_conditions << "Name == #{secret_name.shellescape.inspect}"
        end
      end

      filter_condition = filter_conditions.any? ? "--filter '#{filter_conditions.join(" || ")}'" : ""
      items = `passbolt list resources #{filter_condition} #{folders.map { |item| "--folder #{item["id"]}" }.join(" ")} --json`
      raise RuntimeError, "Could not read #{secrets} from Passbolt" unless $?.success?

      items = JSON.parse(items)
      found_names = items.map { |item| item["name"] }
      missing_secrets = secret_names - found_names
      raise RuntimeError, "Could not find the following secrets in Passbolt: #{missing_secrets.join(", ")}" if missing_secrets.any?

      items.to_h { |item| [ item["name"], item["password"] ] }
    end

    def secrets_get_folders(secrets)
      # extract all folder paths (both parent and nested)
      folder_paths = secrets
        .select { |s| s.include?("/") }
        .map { |s| s.split("/")[0..-2] } # get all parts except the secret name
        .uniq

      return [] if folder_paths.empty?

      all_folders = []

      # first get all top-level folders
      parent_folders = folder_paths.map(&:first).uniq
      filter_condition = "--filter '#{parent_folders.map { |name| "Name == #{name.shellescape.inspect}" }.join(" || ")}'"
      fetch_folders = `passbolt list folders #{filter_condition} --json`
      raise RuntimeError, "Could not read folders from Passbolt" unless $?.success?

      parent_folder_items = JSON.parse(fetch_folders)
      all_folders.concat(parent_folder_items)

      # get nested folders for each parent
      folder_paths.each do |path|
        next if path.size <= 1 # skip non-nested folders

        parent = path[0]
        parent_folder = parent_folder_items.find { |f| f["name"] == parent }
        next unless parent_folder

        # for each nested level, get the folders using the parent's ID
        current_parent = parent_folder
        path[1..-1].each do |folder_name|
          filter_condition = "--filter 'Name == #{folder_name.shellescape.inspect} && FolderParentID == #{current_parent["id"].shellescape.inspect}'"
          fetch_nested = `passbolt list folders #{filter_condition} --json`
          next unless $?.success?

          nested_folders = JSON.parse(fetch_nested)
          break if nested_folders.empty?

          all_folders.concat(nested_folders)
          current_parent = nested_folders.first
        end
      end

      # check if we found all required folders
      found_paths = all_folders.map { |f| get_folder_path(f, all_folders) }
      missing_paths = folder_paths.map { |path| path.join("/") } - found_paths
      raise RuntimeError, "Could not find the following folders in Passbolt: #{missing_paths.join(", ")}" if missing_paths.any?

      all_folders
    end

    def get_folder_path(folder, all_folders, path = [])
      path.unshift(folder["name"])
      return path.join("/") if folder["folder_parent_id"].to_s.empty?

      parent = all_folders.find { |f| f["id"] == folder["folder_parent_id"] }
      return path.join("/") unless parent

      get_folder_path(parent, all_folders, path)
    end

    def check_dependencies!
      raise RuntimeError, "Passbolt CLI is not installed" unless cli_installed?
    end

    def cli_installed?
      `passbolt --version 2> /dev/null`
      $?.success?
    end
end
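`get_folder_path` walks the `folder_parent_id` links upward until it reaches a root folder, rebuilding the `parent/child` path that the filter conditions match against. Standalone, with made-up folder records shaped like the Passbolt CLI's JSON:

```ruby
# Same recursion as the adapter's get_folder_path, lifted out of the class.
def get_folder_path(folder, all_folders, path = [])
  path.unshift(folder["name"])
  return path.join("/") if folder["folder_parent_id"].to_s.empty?

  parent = all_folders.find { |f| f["id"] == folder["folder_parent_id"] }
  return path.join("/") unless parent

  get_folder_path(parent, all_folders, path)
end

folders = [
  { "id" => "1", "name" => "production", "folder_parent_id" => "" },
  { "id" => "2", "name" => "databases",  "folder_parent_id" => "1" }
]

puts get_folder_path(folders[1], folders) # production/databases
puts get_folder_path(folders[0], folders) # production
```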


@@ -1,3 +1,3 @@
 module Kamal
-  VERSION = "2.6.1"
+  VERSION = "2.7.0"
 end


@@ -15,7 +15,7 @@ class CliAccessoryTest < CliTestCase
     run_command("boot", "mysql").tap do |output|
       assert_match "docker login private.registry -u [REDACTED] -p [REDACTED] on 1.1.1.3", output
-      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env MYSQL_ROOT_HOST=\"%\" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
+      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env KAMAL_HOST=\"1.1.1.3\" --env MYSQL_ROOT_HOST=\"%\" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
     end
   end
@@ -35,10 +35,10 @@ class CliAccessoryTest < CliTestCase
       assert_match /docker network create kamal.*on 1.1.1.1/, output
       assert_match /docker network create kamal.*on 1.1.1.2/, output
       assert_match /docker network create kamal.*on 1.1.1.3/, output
-      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env MYSQL_ROOT_HOST=\"%\" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
+      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env KAMAL_HOST=\"1.1.1.3\" --env MYSQL_ROOT_HOST=\"%\" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
-      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
+      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
-      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.2", output
+      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env KAMAL_HOST=\"1.1.1.2\" --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.2", output
-      assert_match "docker run --name custom-box --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --env-file .kamal/apps/app/env/accessories/busybox.env --label service=\"custom-box\" other.registry/busybox:latest on 1.1.1.3", output
+      assert_match "docker run --name custom-box --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --env KAMAL_HOST=\"1.1.1.3\" --env-file .kamal/apps/app/env/accessories/busybox.env --label service=\"custom-box\" other.registry/busybox:latest on 1.1.1.3", output
     end
   end
@@ -215,8 +215,8 @@ class CliAccessoryTest < CliTestCase
     run_command("boot", "redis", "--hosts", "1.1.1.1").tap do |output|
       assert_match "docker login private.registry -u [REDACTED] -p [REDACTED] on 1.1.1.1", output
       assert_no_match "docker login private.registry -u [REDACTED] -p [REDACTED] on 1.1.1.2", output
-      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
+      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
-      assert_no_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.2", output
+      assert_no_match /docker run --name app-redis .* on 1.1.1.2/, output
     end
   end
@@ -227,8 +227,8 @@ class CliAccessoryTest < CliTestCase
     run_command("boot", "redis", "--hosts", "1.1.1.1,1.1.1.3").tap do |output|
      assert_match "docker login private.registry -u [REDACTED] -p [REDACTED] on 1.1.1.1", output
       assert_no_match "docker login private.registry -u [REDACTED] -p [REDACTED] on 1.1.1.3", output
-      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
+      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
-      assert_no_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.3", output
+      assert_no_match /docker run --name app-redis .* on 1.1.1.3/, output
     end
   end
@@ -237,7 +237,7 @@ class CliAccessoryTest < CliTestCase
       assert_match "Upgrading all accessories on 1.1.1.3,1.1.1.1,1.1.1.2...", output
       assert_match "docker network create kamal on 1.1.1.3", output
       assert_match "docker container stop app-mysql on 1.1.1.3", output
-      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env MYSQL_ROOT_HOST="%" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
+      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env KAMAL_HOST=\"1.1.1.3\" --env MYSQL_ROOT_HOST="%" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
       assert_match "Upgraded all accessories on 1.1.1.3,1.1.1.1,1.1.1.2...", output
     end
   end
@@ -247,15 +247,15 @@ class CliAccessoryTest < CliTestCase
       assert_match "Upgrading all accessories on 1.1.1.3...", output
       assert_match "docker network create kamal on 1.1.1.3", output
       assert_match "docker container stop app-mysql on 1.1.1.3", output
-      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env MYSQL_ROOT_HOST="%" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
+      assert_match "docker run --name app-mysql --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 3306:3306 --env KAMAL_HOST=\"1.1.1.3\" --env MYSQL_ROOT_HOST="%" --env-file .kamal/apps/app/env/accessories/mysql.env --volume $PWD/app-mysql/etc/mysql/my.cnf:/etc/mysql/my.cnf --volume $PWD/app-mysql/data:/var/lib/mysql --label service=\"app-mysql\" private.registry/mysql:5.7 on 1.1.1.3", output
       assert_match "Upgraded all accessories on 1.1.1.3", output
     end
   end

   test "boot with web role filter" do
     run_command("boot", "redis", "-r", "web").tap do |output|
-      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
+      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.1", output
-      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.2", output
+      assert_match "docker run --name app-redis --detach --restart unless-stopped --network kamal --log-opt max-size=\"10m\" --publish 6379:6379 --env KAMAL_HOST=\"1.1.1.2\" --env-file .kamal/apps/app/env/accessories/redis.env --volume $PWD/app-redis/data:/data --label service=\"app-redis\" redis:latest on 1.1.1.2", output
     end
   end


@@ -104,7 +104,7 @@ class CliAppTest < CliTestCase
     run_command("boot", config: :with_env_tags).tap do |output|
       assert_match "docker tag dhh/app:latest dhh/app:latest", output
-      assert_match %r{docker run --detach --restart unless-stopped --name app-web-latest --network kamal --hostname 1.1.1.1-[0-9a-f]{12} -e KAMAL_CONTAINER_NAME="app-web-latest" -e KAMAL_VERSION="latest" --env TEST="root" --env EXPERIMENT="disabled" --env SITE="site1"}, output
+      assert_match %r{docker run --detach --restart unless-stopped --name app-web-latest --network kamal --hostname 1.1.1.1-[0-9a-f]{12} --env KAMAL_CONTAINER_NAME="app-web-latest" --env KAMAL_VERSION="latest" --env KAMAL_HOST="1.1.1.1" --env TEST="root" --env EXPERIMENT="disabled" --env SITE="site1"}, output
       assert_match "docker container ls --all --filter name=^app-web-123$ --quiet | xargs docker stop", output
     end
   end
@@ -220,6 +220,21 @@ class CliAppTest < CliTestCase
     end
   end

+  test "boot with custom ssl certificate" do
+    Kamal::Configuration::Proxy.any_instance.stubs(:custom_ssl_certificate?).returns(true)
+    Kamal::Configuration::Proxy.any_instance.stubs(:certificate_pem_content).returns("CERTIFICATE CONTENT")
+    Kamal::Configuration::Proxy.any_instance.stubs(:private_key_pem_content).returns("PRIVATE KEY CONTENT")
+
+    stub_running
+    run_command("boot", config: :with_proxy).tap do |output|
+      assert_match "Writing SSL certificates for web on 1.1.1.1", output
+      assert_match "mkdir -p .kamal/proxy/apps-config/app/tls", output
+      assert_match "Uploading \"CERTIFICATE CONTENT\" to .kamal/proxy/apps-config/app/tls/web/cert.pem", output
+      assert_match "--tls-certificate-path=\"/home/kamal-proxy/.apps-config/app/tls/web/cert.pem\"", output
+      assert_match "--tls-private-key-path=\"/home/kamal-proxy/.apps-config/app/tls/web/key.pem\"", output
+    end
+  end
+
   test "start" do
     SSHKit::Backend::Abstract.any_instance.stubs(:capture_with_info).returns("999") # old version
@@ -361,6 +376,7 @@ class CliAppTest < CliTestCase
     SSHKit::Backend::Abstract.any_instance.expects(:exec)
       .with("ssh -t root@1.1.1.1 -p 22 'docker run -it --rm --network kamal --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" dhh/app:latest ruby -v'")

+    stub_stdin_tty do
       run_command("exec", "-i", "ruby -v").tap do |output|
         assert_hook_ran "pre-connect", output
         assert_match "docker login -u [REDACTED] -p [REDACTED]", output
@@ -368,12 +384,14 @@ class CliAppTest < CliTestCase
         assert_match "Launching interactive command with version latest via SSH from new container on 1.1.1.1...", output
       end
+    end
   end

   test "exec interactive with reuse" do
     Kamal::Commands::Hook.any_instance.stubs(:hook_exists?).returns(true)
     SSHKit::Backend::Abstract.any_instance.expects(:exec)
       .with("ssh -t root@1.1.1.1 -p 22 'docker exec -it app-web-999 ruby -v'")

+    stub_stdin_tty do
       run_command("exec", "-i", "--reuse", "ruby -v").tap do |output|
         assert_hook_ran "pre-connect", output
         assert_match "Get current version of running container...", output
@@ -381,6 +399,20 @@ class CliAppTest < CliTestCase
         assert_match "Launching interactive command with version 999 via SSH from existing container on 1.1.1.1...", output
       end
     end
+  end
+
+  test "exec interactive with pipe on STDIN" do
+    Kamal::Commands::Hook.any_instance.stubs(:hook_exists?).returns(true)
+    SSHKit::Backend::Abstract.any_instance.expects(:exec)
+      .with("ssh -t root@1.1.1.1 -p 22 'docker exec -i app-web-999 ruby -v'")
+
+    stub_stdin_file do
+      run_command("exec", "-i", "--reuse", "ruby -v").tap do |output|
+        assert_hook_ran "pre-connect", output
+        assert_match "Launching interactive command with version 999 via SSH from existing container on 1.1.1.1...", output
+      end
+    end
+  end

   test "containers" do
     run_command("containers").tap do |output|
@@ -474,7 +506,7 @@ class CliAppTest < CliTestCase
     run_command("boot", config: :with_proxy).tap do |output|
       assert_match /Renaming container .* to .* as already deployed on 1.1.1.1/, output # Rename
       assert_match /docker rename app-web-latest app-web-latest_replaced_[0-9a-f]{16}/, output
-      assert_match /docker run --detach --restart unless-stopped --name app-web-latest --network kamal --hostname 1.1.1.1-[0-9a-f]{12} -e KAMAL_CONTAINER_NAME="app-web-latest" -e KAMAL_VERSION="latest" --env-file .kamal\/apps\/app\/env\/roles\/web.env --log-opt max-size="10m" --label service="app" --label role="web" --label destination dhh\/app:latest/, output
+      assert_match /docker run --detach --restart unless-stopped --name app-web-latest --network kamal --hostname 1.1.1.1-[0-9a-f]{12} --env KAMAL_CONTAINER_NAME="app-web-latest" --env KAMAL_VERSION="latest" --env KAMAL_HOST="1.1.1.1" --env-file .kamal\/apps\/app\/env\/roles\/web.env --log-opt max-size="10m" --label service="app" --label role="web" --label destination dhh\/app:latest/, output
       assert_match /docker exec kamal-proxy kamal-proxy deploy app-web --target="123:80"/, output
       assert_match "docker container ls --all --filter name=^app-web-123$ --quiet | xargs docker stop", output
     end


@@ -182,7 +182,7 @@ class CliProxyTest < CliTestCase
assert_match %r{docker rename app-web-latest app-web-latest_replaced_.*}, output
assert_match "/usr/bin/env mkdir -p .kamal/apps/app/env/roles", output
assert_match "Uploading \"\\n\" to .kamal/apps/app/env/roles/web.env", output
assert_match %r{docker run --detach --restart unless-stopped --name app-web-latest --network kamal --hostname 1.1.1.1-.* --env KAMAL_CONTAINER_NAME="app-web-latest" --env KAMAL_VERSION="latest" --env KAMAL_HOST="1.1.1.1" --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size="10m" --label service="app" --label role="web" --label destination dhh/app:latest}, output
assert_match "docker exec kamal-proxy kamal-proxy deploy app-web --target=\"12345678:80\" --deploy-timeout=\"6s\" --drain-timeout=\"30s\" --buffer-requests --buffer-responses --log-request-header=\"Cache-Control\" --log-request-header=\"Last-Modified\" --log-request-header=\"User-Agent\"", output
assert_match "docker container ls --all --filter name=^app-web-12345678$ --quiet | xargs docker stop", output
assert_match "docker tag dhh/app:latest dhh/app:latest", output


@@ -118,14 +118,21 @@ class CommandsAccessoryTest < ActiveSupport::TestCase
test "execute in new container over ssh" do
new_command(:mysql).stub(:run_over_ssh, ->(cmd) { cmd.join(" ") }) do
assert_match %r{docker run -it --rm --network kamal --env MYSQL_ROOT_HOST=\"%\" --env-file .kamal/apps/app/env/accessories/mysql.env private.registry/mysql:8.0 mysql -u root},
stub_stdin_tty { new_command(:mysql).execute_in_new_container_over_ssh("mysql", "-u", "root") }
end
end
test "execute in existing container over ssh" do
new_command(:mysql).stub(:run_over_ssh, ->(cmd) { cmd.join(" ") }) do
assert_match %r{docker exec -it app-mysql mysql -u root},
stub_stdin_tty { new_command(:mysql).execute_in_existing_container_over_ssh("mysql", "-u", "root") }
end
end
test "execute in existing container with piped input over ssh" do
new_command(:mysql).stub(:run_over_ssh, ->(cmd) { cmd.join(" ") }) do
assert_match %r{docker exec -i app-mysql mysql -u root},
stub_stdin_file { new_command(:mysql).execute_in_existing_container_over_ssh("mysql", "-u", "root") }
end
end
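The paired assertions above pin down a single decision: `docker exec -it` when STDIN is an interactive terminal, `docker exec -i` when input is piped in. A minimal sketch of that check, assuming a hypothetical `exec_flags` helper rather than Kamal's actual internals:

```ruby
require "stringio"

# Hypothetical helper (not Kamal's API) showing the flag choice the
# tests above assert: "-it" for an interactive terminal, "-i" for
# piped input.
def exec_flags(stdin = $stdin)
  stdin.tty? ? "-it" : "-i"
end

# StringIO stands in for piped input; its tty? is always false.
piped = StringIO.new("mysql -u root\n")
flags = exec_flags(piped)
```

The `stub_stdin_tty` and `stub_stdin_file` test helpers simulate exactly these two cases.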


@@ -13,13 +13,13 @@ class CommandsAppTest < ActiveSupport::TestCase
test "run" do
assert_equal \
"docker run --detach --restart unless-stopped --name app-web-999 --network kamal --env KAMAL_CONTAINER_NAME=\"app-web-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" --label service=\"app\" --label role=\"web\" --label destination dhh/app:999",
new_command.run.join(" ")
end
test "run with hostname" do
assert_equal \
"docker run --detach --restart unless-stopped --name app-web-999 --network kamal --hostname myhost --env KAMAL_CONTAINER_NAME=\"app-web-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" --label service=\"app\" --label role=\"web\" --label destination dhh/app:999",
new_command.run(hostname: "myhost").join(" ")
end
@@ -27,14 +27,14 @@ class CommandsAppTest < ActiveSupport::TestCase
@config[:volumes] = [ "/local/path:/container/path" ]
assert_equal \
"docker run --detach --restart unless-stopped --name app-web-999 --network kamal --env KAMAL_CONTAINER_NAME=\"app-web-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" --volume /local/path:/container/path --label service=\"app\" --label role=\"web\" --label destination dhh/app:999",
new_command.run.join(" ")
end
test "run with custom options" do
@config[:servers] = { "web" => [ "1.1.1.1" ], "jobs" => { "hosts" => [ "1.1.1.2" ], "cmd" => "bin/jobs", "options" => { "mount" => "somewhere", "cap-add" => true } } }
assert_equal \
"docker run --detach --restart unless-stopped --name app-jobs-999 --network kamal --env KAMAL_CONTAINER_NAME=\"app-jobs-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.2\" --env-file .kamal/apps/app/env/roles/jobs.env --log-opt max-size=\"10m\" --label service=\"app\" --label role=\"jobs\" --label destination --mount \"somewhere\" --cap-add dhh/app:999 bin/jobs",
new_command(role: "jobs", host: "1.1.1.2").run.join(" ")
end
@@ -42,7 +42,7 @@ class CommandsAppTest < ActiveSupport::TestCase
@config[:logging] = { "driver" => "local", "options" => { "max-size" => "100m", "max-file" => "3" } }
assert_equal \
"docker run --detach --restart unless-stopped --name app-web-999 --network kamal --env KAMAL_CONTAINER_NAME=\"app-web-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/roles/web.env --log-driver \"local\" --log-opt max-size=\"100m\" --log-opt max-file=\"3\" --label service=\"app\" --label role=\"web\" --label destination dhh/app:999",
new_command.run.join(" ")
end
@@ -51,7 +51,7 @@ class CommandsAppTest < ActiveSupport::TestCase
@config[:servers] = { "web" => { "hosts" => [ "1.1.1.1" ], "logging" => { "driver" => "local", "options" => { "max-size" => "100m" } } } }
assert_equal \
"docker run --detach --restart unless-stopped --name app-web-999 --network kamal --env KAMAL_CONTAINER_NAME=\"app-web-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.1\" --env-file .kamal/apps/app/env/roles/web.env --log-driver \"local\" --log-opt max-size=\"100m\" --log-opt max-file=\"3\" --label service=\"app\" --label role=\"web\" --label destination dhh/app:999",
new_command.run.join(" ")
end
@@ -60,7 +60,7 @@ class CommandsAppTest < ActiveSupport::TestCase
@config[:env]["tags"] = { "tag1" => { "ENV1" => "value1" } }
assert_equal \
"docker run --detach --restart unless-stopped --name app-web-999 --network kamal --env KAMAL_CONTAINER_NAME=\"app-web-999\" --env KAMAL_VERSION=\"999\" --env KAMAL_HOST=\"1.1.1.1\" --env ENV1=\"value1\" --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" --label service=\"app\" --label role=\"web\" --label destination dhh/app:999",
new_command.run.join(" ")
end
@@ -149,8 +149,6 @@ class CommandsAppTest < ActiveSupport::TestCase
new_command.remove.join(" ")
end
test "logs" do
assert_equal \
"sh -c 'docker ps --latest --quiet --filter label=service=app --filter label=destination= --filter label=role=web --filter status=running --filter status=restarting --filter ancestor=$(docker image ls --filter reference=dhh/app:latest --format '\\''{{.ID}}'\\'') ; docker ps --latest --quiet --filter label=service=app --filter label=destination= --filter label=role=web --filter status=running --filter status=restarting' | head -1 | xargs docker logs --timestamps 2>&1",
@@ -288,7 +286,7 @@ class CommandsAppTest < ActiveSupport::TestCase
test "execute in new container over ssh" do
assert_match %r{docker run -it --rm --network kamal --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size="10m" dhh/app:999 bin/rails c},
stub_stdin_tty { new_command.execute_in_new_container_over_ssh("bin/rails", "c", env: {}) }
end
test "execute in new container over ssh with tags" do
@@ -296,18 +294,23 @@ class CommandsAppTest < ActiveSupport::TestCase
@config[:env]["tags"] = { "tag1" => { "ENV1" => "value1" } }
assert_equal "ssh -t root@1.1.1.1 -p 22 'docker run -it --rm --network kamal --env ENV1=\"value1\" --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" dhh/app:999 bin/rails c'",
stub_stdin_tty { new_command.execute_in_new_container_over_ssh("bin/rails", "c", env: {}) }
end
test "execute in new container with custom options over ssh" do
@config[:servers] = { "web" => { "hosts" => [ "1.1.1.1" ], "options" => { "mount" => "somewhere", "cap-add" => true } } }
assert_match %r{docker run -it --rm --network kamal --env-file .kamal/apps/app/env/roles/web.env --log-opt max-size=\"10m\" --mount \"somewhere\" --cap-add dhh/app:999 bin/rails c},
stub_stdin_tty { new_command.execute_in_new_container_over_ssh("bin/rails", "c", env: {}) }
end
test "execute in existing container over ssh" do
assert_match %r{docker exec -it app-web-999 bin/rails c},
stub_stdin_tty { new_command.execute_in_existing_container_over_ssh("bin/rails", "c", env: {}) }
end
test "execute in existing container with piped input over ssh" do
assert_match %r{docker exec -i app-web-999 bin/rails c},
stub_stdin_file { new_command.execute_in_existing_container_over_ssh("bin/rails", "c", env: {}) }
end
test "run over ssh" do


@@ -61,6 +61,32 @@ class CommandsBuilderTest < ActiveSupport::TestCase
builder.push.join(" ")
end
test "target pack when pack is set" do
builder = new_builder_command(image: "dhh/app", builder: { "arch" => "amd64", "pack" => { "builder" => "heroku/builder:24", "buildpacks" => [ "heroku/ruby", "heroku/procfile" ] } })
assert_equal "pack", builder.name
assert_equal \
"pack build dhh/app --platform linux/amd64 --creation-time now --builder heroku/builder:24 --buildpack heroku/ruby --buildpack heroku/procfile --buildpack paketo-buildpacks/image-labels -t dhh/app:123 -t dhh/app:latest --env BP_IMAGE_LABELS=service=app --path . && docker push dhh/app:123 && docker push dhh/app:latest",
builder.push.join(" ")
end
test "pack build args passed as env" do
builder = new_builder_command(image: "dhh/app", builder: { "args" => { "a" => 1, "b" => 2 }, "arch" => "amd64", "pack" => { "builder" => "heroku/builder:24", "buildpacks" => [ "heroku/ruby", "heroku/procfile" ] } })
assert_equal \
"pack build dhh/app --platform linux/amd64 --creation-time now --builder heroku/builder:24 --buildpack heroku/ruby --buildpack heroku/procfile --buildpack paketo-buildpacks/image-labels -t dhh/app:123 -t dhh/app:latest --env BP_IMAGE_LABELS=service=app --env a=\"1\" --env b=\"2\" --path . && docker push dhh/app:123 && docker push dhh/app:latest",
builder.push.join(" ")
end
test "pack build secrets as env" do
with_test_secrets("secrets" => "token_a=foo\ntoken_b=bar") do
builder = new_builder_command(image: "dhh/app", builder: { "secrets" => [ "token_a", "token_b" ], "arch" => "amd64", "pack" => { "builder" => "heroku/builder:24", "buildpacks" => [ "heroku/ruby", "heroku/procfile" ] } })
assert_equal \
"pack build dhh/app --platform linux/amd64 --creation-time now --builder heroku/builder:24 --buildpack heroku/ruby --buildpack heroku/procfile --buildpack paketo-buildpacks/image-labels -t dhh/app:123 -t dhh/app:latest --env BP_IMAGE_LABELS=service=app --env token_a=\"foo\" --env token_b=\"bar\" --path . && docker push dhh/app:123 && docker push dhh/app:latest",
builder.push.join(" ")
end
end
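The long expected strings in the pack tests above all follow one template: a `pack build` with platform, builder, buildpacks, tags, and env flags, chained with `docker push` per tag. A rough sketch of assembling that shape — the helper and its parameter names are made up for illustration, not Kamal's internals:

```ruby
# Illustrative assembly of the command shape the assertions above expect.
# `pack_push_command` and its keyword arguments are hypothetical names.
def pack_push_command(image:, arch:, pack_builder:, buildpacks:, tags:, env: {})
  build = [
    "pack build #{image}",
    "--platform linux/#{arch}",
    "--creation-time now",
    "--builder #{pack_builder}",
    *buildpacks.map { |bp| "--buildpack #{bp}" },
    *tags.map { |tag| "-t #{image}:#{tag}" },
    *env.map { |key, value| "--env #{key}=\"#{value}\"" },
    "--path ."
  ].join(" ")

  # Build once, then push every tag, mirroring the `&&` chain above.
  ([ build ] + tags.map { |tag| "docker push #{image}:#{tag}" }).join(" && ")
end

command = pack_push_command(
  image: "dhh/app", arch: "amd64",
  pack_builder: "heroku/builder:24",
  buildpacks: [ "heroku/ruby", "heroku/procfile" ],
  tags: [ "123", "latest" ]
)
```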
test "cloud builder" do
builder = new_builder_command(builder: { "arch" => [ "#{local_arch}" ], "driver" => "cloud docker-org-name/builder-name" })
assert_equal "cloud", builder.name


@@ -16,6 +16,23 @@ class ConfigurationBuilderTest < ActiveSupport::TestCase
assert_equal false, config.builder.remote?
end
test "pack?" do
assert_not config.builder.pack?
end
test "pack? with pack builder" do
@deploy[:builder] = { "arch" => "arm64", "pack" => { "builder" => "heroku/builder:24" } }
assert config.builder.pack?
end
test "pack details" do
@deploy[:builder] = { "arch" => "amd64", "pack" => { "builder" => "heroku/builder:24", "buildpacks" => [ "heroku/ruby", "heroku/procfile" ] } }
assert_equal "heroku/builder:24", config.builder.pack_builder
assert_equal [ "heroku/ruby", "heroku/procfile" ], config.builder.pack_buildpacks
end
test "remote" do
assert_nil config.builder.remote
end


@@ -25,5 +25,7 @@ class ConfigurationProxyBootTest < ActiveSupport::TestCase
assert_equal "/home/kamal-proxy/.apps-config/app", @proxy_boot_config.app_container_directory
assert_equal ".kamal/proxy/apps-config/app/error_pages", @proxy_boot_config.error_pages_directory
assert_equal "/home/kamal-proxy/.apps-config/app/error_pages", @proxy_boot_config.error_pages_container_directory
assert_equal ".kamal/proxy/apps-config/app/tls", @proxy_boot_config.tls_directory
assert_equal "/home/kamal-proxy/.apps-config/app/tls", @proxy_boot_config.tls_container_directory
end
end


@@ -45,6 +45,66 @@ class ConfigurationProxyTest < ActiveSupport::TestCase
end
end
test "ssl with certificate and private key from secrets" do
with_test_secrets("secrets" => "CERT_PEM=certificate\nKEY_PEM=private_key") do
@deploy[:proxy] = {
"ssl" => {
"certificate_pem" => "CERT_PEM",
"private_key_pem" => "KEY_PEM"
},
"host" => "example.com"
}
proxy = config.proxy
assert_equal ".kamal/proxy/apps-config/app/tls/cert.pem", proxy.host_tls_cert
assert_equal ".kamal/proxy/apps-config/app/tls/key.pem", proxy.host_tls_key
assert_equal "/home/kamal-proxy/.apps-config/app/tls/cert.pem", proxy.container_tls_cert
assert_equal "/home/kamal-proxy/.apps-config/app/tls/key.pem", proxy.container_tls_key
end
end
test "deploy options with custom ssl certificates" do
with_test_secrets("secrets" => "CERT_PEM=certificate\nKEY_PEM=private_key") do
@deploy[:proxy] = {
"ssl" => {
"certificate_pem" => "CERT_PEM",
"private_key_pem" => "KEY_PEM"
},
"host" => "example.com"
}
proxy = config.proxy
options = proxy.deploy_options
assert_equal true, options[:tls]
assert_equal "/home/kamal-proxy/.apps-config/app/tls/cert.pem", options[:"tls-certificate-path"]
assert_equal "/home/kamal-proxy/.apps-config/app/tls/key.pem", options[:"tls-private-key-path"]
end
end
test "ssl with certificate and no private key" do
with_test_secrets("secrets" => "CERT_PEM=certificate") do
@deploy[:proxy] = {
"ssl" => {
"certificate_pem" => "CERT_PEM"
},
"host" => "example.com"
}
assert_raises(Kamal::ConfigurationError) { config.proxy.ssl? }
end
end
test "ssl with private key and no certificate" do
with_test_secrets("secrets" => "KEY_PEM=private_key") do
@deploy[:proxy] = {
"ssl" => {
"private_key_pem" => "KEY_PEM"
},
"host" => "example.com"
}
assert_raises(Kamal::ConfigurationError) { config.proxy.ssl? }
end
end
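Taken together, the four tests above describe an all-or-nothing rule: `certificate_pem` and `private_key_pem` must be supplied as a pair, and a lone half raises a configuration error. A hedged sketch of that validation, with a stand-in `ConfigurationError` class and a hypothetical method name:

```ruby
# Stand-in for Kamal::ConfigurationError; `custom_ssl_pair?` below is
# an illustrative name, not the gem's actual implementation.
class ConfigurationError < StandardError; end

def custom_ssl_pair?(ssl_config)
  cert = ssl_config["certificate_pem"]
  key  = ssl_config["private_key_pem"]
  return false if cert.nil? && key.nil?

  # A certificate without its key (or vice versa) is a configuration error.
  unless cert && key
    raise ConfigurationError, "custom SSL requires both certificate_pem and private_key_pem"
  end
  true
end
```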
private
def config
Kamal::Configuration.new(@deploy)


@@ -42,6 +42,7 @@ class ConfigurationValidationTest < ActiveSupport::TestCase
assert_error "servers/web/options: should be a hash", servers: { "web" => { "options" => "" } }
assert_error "servers/web/logging/options: should be a hash", servers: { "web" => { "logging" => { "options" => "" } } }
assert_error "servers/web/logging/driver: should be a string", servers: { "web" => { "logging" => { "driver" => [] } } }
assert_error "servers/web/labels/service: invalid label. destination, role, and service are reserved labels", servers: { "web" => { "labels" => { "service" => "foo" } } }
assert_error "servers/web/labels: should be a hash", servers: { "web" => { "labels" => [] } }
assert_error "servers/web/env: should be a hash", servers: { "web" => { "env" => [] } }
assert_error "servers/web/env: tags are only allowed in the root env", servers: { "web" => { "hosts" => [ "1.1.1.1" ], "env" => { "tags" => {} } } }
@@ -58,6 +59,7 @@ class ConfigurationValidationTest < ActiveSupport::TestCase
assert_error "accessories/accessory1: should be a hash", accessories: { "accessory1" => [] }
assert_error "accessories/accessory1: unknown key: unknown", accessories: { "accessory1" => { "unknown" => "baz" } }
assert_error "accessories/accessory1/options: should be a hash", accessories: { "accessory1" => { "options" => [] } }
assert_error "accessories/accessory1/labels/destination: invalid label. destination, role, and service are reserved labels", accessories: { "accessory1" => { "host" => "host", "labels" => { "destination" => "foo" } } }
assert_error "accessories/accessory1/host: should be a string", accessories: { "accessory1" => { "host" => [] } }
assert_error "accessories/accessory1/env: should be a hash", accessories: { "accessory1" => { "env" => [] } }
assert_error "accessories/accessory1/env: tags are only allowed in the root env", accessories: { "accessory1" => { "host" => "host", "env" => { "tags" => {} } } }
@@ -94,6 +96,7 @@ class ConfigurationValidationTest < ActiveSupport::TestCase
assert_error "builder/arch: should be an array or a string", builder: { "arch" => {} }
assert_error "builder/args: should be a hash", builder: { "args" => [ "foo" ] }
assert_error "builder/cache/options: should be a string", builder: { "cache" => { "options" => [] } }
assert_error "builder: buildpacks only support building for one arch", builder: { "arch" => [ "amd64", "arm64" ], "pack" => { "builder" => "heroku/builder:24" } }
end
private


@@ -386,7 +386,7 @@ class ConfigurationTest < ActiveSupport::TestCase
Kamal::Configuration.new(@deploy_with_roles)
end
assert_equal "SSL is only supported on a single server unless you provide custom certificates, found 2 servers for role workers", exception.message
end
test "two proxy ssl roles with same host" do


@@ -41,6 +41,8 @@ services:
context: docker/vm
volumes:
- shared:/shared
ports:
- "22443:443"
vm2:
privileged: true
@@ -61,6 +63,7 @@ services:
context: docker/load_balancer
ports:
- "12345:80"
- "12443:443"
depends_on:
- vm1
- vm2


@@ -18,6 +18,7 @@ RUN apt-get update --fix-missing && apt-get install -y docker-ce docker-ce-cli c
COPY *.sh .
COPY app/ app/
COPY app_with_custom_certificate/ app_with_custom_certificate/
COPY app_with_roles/ app_with_roles/
COPY app_with_traefik/ app_with_traefik/
COPY app_with_proxied_accessory/ app_with_proxied_accessory/
@@ -29,6 +30,7 @@ RUN mkdir -p /etc/docker/certs.d/registry:4443 && ln -s /shared/certs/domain.crt
RUN git config --global user.email "deployer@example.com"
RUN git config --global user.name "Deployer"
RUN cd app && git init && git add . && git commit -am "Initial version"
RUN cd app_with_custom_certificate && git init && git add . && git commit -am "Initial version"
RUN cd app_with_roles && git init && git add . && git commit -am "Initial version"
RUN cd app_with_traefik && git init && git add . && git commit -am "Initial version"
RUN cd app_with_proxied_accessory && git init && git add . && git commit -am "Initial version"


@@ -0,0 +1,2 @@
CUSTOM_CERT=$(cat certs/cert.pem)
CUSTOM_KEY=$(cat certs/key.pem)


@@ -0,0 +1,10 @@
FROM registry:4443/nginx:1-alpine-slim
COPY default.conf /etc/nginx/conf.d/default.conf
ARG COMMIT_SHA
RUN echo $COMMIT_SHA > /usr/share/nginx/html/version
RUN mkdir -p /usr/share/nginx/html/versions && echo "version" > /usr/share/nginx/html/versions/$COMMIT_SHA
RUN mkdir -p /usr/share/nginx/html/versions && echo "hidden" > /usr/share/nginx/html/versions/.hidden
RUN echo "Up!" > /usr/share/nginx/html/up


@@ -0,0 +1,19 @@
-----BEGIN CERTIFICATE-----
MIIDCzCCAfOgAwIBAgIUJHOADjhddzCAdXFfZvhXAsVMwhowDQYJKoZIhvcNAQEL
BQAwFDESMBAGA1UEAwwJbG9jYWxob3N0MCAXDTI1MDYxNzA5MDYxOVoYDzIxMjUw
NTI0MDkwNjE5WjAUMRIwEAYDVQQDDAlsb2NhbGhvc3QwggEiMA0GCSqGSIb3DQEB
AQUAA4IBDwAwggEKAoIBAQDQaLWwoLZ3/cZdiW/m4pqOe228wCx/CRU9/E2AT9NS
ofuJNtUaxw7QAAFEWIrnf9y3M09lZeox1CNmXe2GADnnx/n906zSGX18SdDmWrxa
L/1t5OZiXl3we5PM3UNvbFPSq1MCnOtvo6jTPM7shIpJ/5/KuuqovyrO31VCnc2+
ycEzJ2BOcKFUFAeyT/8bk9lAI+1971PLqC6ut9dfy8PVHSPyGrxGiQCpStU7NiQj
LUkqte7x9GcIKTJUjMkWIsvGke9oGoGgEl5gEfqxFAs3ZkA1aYkiHhwFtrUkGOOf
O1C6sqfwnnAhtG8LnULGlFYi3GoKALF2XSIagGpaQM5HAgMBAAGjUzBRMB0GA1Ud
DgQWBBQg2m871YSI220bQEG5APeGzeaz4zAfBgNVHSMEGDAWgBQg2m871YSI220b
QEG5APeGzeaz4zAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQBc
yQvjLV+Uym+SI/bmKNKafW7ioWSkWAfTl/bvCB8xCX2OJsSqh1vjiKhkcJ6t0Tcj
cEiYs7Q+2NVC+s+0ztrN1y4Ve8iX9K9D6o/09bD23zTKpftxCMv8NqoBicNVJ7O9
sINcTqzrIPb+jawE47ogNvlorsU1hi1GTmDHtIqVJPQwiNCIWd8frBLf+WfCHCCK
xRJb4hh5wR05v94L0/QdfKQ8qqCRG0VLyoGGcUyQgC8PLLlHRIWIYuwo3xhUK9nN
Gn8WNiACY4ry1wRauqIp54N3fM1a5sgzpgPKc8++KLVBpxhDy8nRoFAD0k6y1iM0
2EoVLhbMvwhYwHOHkktp
-----END CERTIFICATE-----


@@ -0,0 +1,28 @@
-----BEGIN PRIVATE KEY-----
MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSjAgEAAoIBAQDQaLWwoLZ3/cZd
iW/m4pqOe228wCx/CRU9/E2AT9NSofuJNtUaxw7QAAFEWIrnf9y3M09lZeox1CNm
Xe2GADnnx/n906zSGX18SdDmWrxaL/1t5OZiXl3we5PM3UNvbFPSq1MCnOtvo6jT
PM7shIpJ/5/KuuqovyrO31VCnc2+ycEzJ2BOcKFUFAeyT/8bk9lAI+1971PLqC6u
t9dfy8PVHSPyGrxGiQCpStU7NiQjLUkqte7x9GcIKTJUjMkWIsvGke9oGoGgEl5g
EfqxFAs3ZkA1aYkiHhwFtrUkGOOfO1C6sqfwnnAhtG8LnULGlFYi3GoKALF2XSIa
gGpaQM5HAgMBAAECggEAM2dIPRb+uozU8vg1qhCFR5RpBi+uKe0vGJlU8kt+F3kN
hhQIrvCfFi2SIm3mYOAYK/WTZTKkd4LX8mVDcxQ2NBWOcw1VKIMSAOhiBpclsub4
TrUxH90ftXN9in+epOpmqGUKdfAHYANRXjy22v5773GF06aTv2hbYigSqvoqJ57A
PCdpw9q9sTwJqR9reU3f9fHsUyIwLCQpbtFyQc8aU9LHqgs4SAkaogY+4mPmlCrl
pQ5wGljTXmK5g1o/v+mu1WdeGNOzd5//xp0YImkGtyiqh8Ab891MI1wPgivNP5Lo
Ru1wKhegj89XamT/LUCtn6NCcokE/9pqEXrKK7JeVQKBgQD98kGUkdAm+zHjRZsr
KTeQQ/wszFrNcbP9irE5MqnASWskIXcAhGVJrqbtinLPLIeT22BTsJkCUjVJdfX2
MObjiJP0LMrMVpGQC0b+i4boS8W/lY5T4fM97B+ILc3Y1OYiUedg0gVsFspSR4ef
luNfbKbmdzYYqFz6a/q5vExqBQKBgQDSGC2MJXYAewRJ9Mk3fNvll/6yz73rGCct
tljwNXUgC7y2nEabDverPd74olSxojQwus/kA8JrMVa2IkXo+lKAwLV+nyj3PGHw
3szTeAVWrGIRveWuW6IQ5zOP2IGkX5Jm+XSPVihnMz7SZA6k6qCtWVVywfBubSpi
1dMNWAhs2wKBgBvMVw1yYLzDppRgXDn/SwvJxWMKA66VkcRhWEEQoLBh2Q6dcy9l
TskgCznZe/PdxgGTdBn1LOqqIRcniIMomz2xB7Ek7hYsK8b+1QisMVpgYQc10dyw
0TWoEVOQ4AWqWH7NRGy+0MUiQYd8OQZpN/6MIED+L7fHRlZLV6jZSewZAoGBAJwo
bHJmxbbFuQJfd9BOdgPJXf76emdrpHNNvf2NPml7T+FLdw95qI0Xh8u2nM0Li09N
C4inYrLaEWF/SAdLSFd65WwgUQqzTvkCIaxs4UrzBlG5nCZk5ak6sBCTFIlgoCj5
8bE4kP9kD6XByUC7RIKUi/aoQFVTvtWHqT+Z12lRAoGAAVoZVxE+xPAfzVyAatpH
M8WwgB23r07thNDiJCUMOQUT8LRFKg/Hyj6jB2W7gj669G/Bvoar++nXJVw7QCiv
MlOk1pfaKuW82rCPnTeUzJwf2KQ8Jg2avasD4GFWZBJVvlHN1ONySViIpb67hhAK
1OcbfGutFiGWhUwXNVkVc4U=
-----END PRIVATE KEY-----


@@ -0,0 +1,36 @@
service: app_with_custom_certificate
image: app_with_custom_certificate
servers:
  web:
    hosts:
      - vm1
      - vm2
  workers:
    hosts:
      - vm3
    cmd: sleep infinity
deploy_timeout: 2
drain_timeout: 2
readiness_delay: 0
proxy:
  host: localhost
  ssl:
    certificate_pem: CUSTOM_CERT
    private_key_pem: CUSTOM_KEY
  healthcheck:
    interval: 1
    timeout: 1
    path: "/up"
asset_path: /usr/share/nginx/html/versions
registry:
  server: registry:4443
  username: root
  password: root
builder:
  driver: docker
  arch: <%= Kamal::Utils.docker_arch %>
  args:
    COMMIT_SHA: <%= `git rev-parse HEAD` %>


@@ -0,0 +1,17 @@
server {
    listen 80;
    listen [::]:80;
    server_name localhost;

    location / {
        root /usr/share/nginx/html;
        index index.html index.htm;
    }

    # redirect server error pages to the static page /50x.html
    #
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }
}


@@ -63,8 +63,8 @@ class IntegrationTest < ActiveSupport::TestCase
    assert_match message, response.body.strip if message
  end

-  def assert_app_is_up(version: nil, app: @app)
-    response = app_response(app: app)
+  def assert_app_is_up(version: nil, app: @app, cert: nil)
+    response = app_response(app: app, cert: cert)
    debug_response_code(response, "200")
    assert_equal "200", response.code
    assert_app_version(version, response) if version
@@ -82,8 +82,14 @@ class IntegrationTest < ActiveSupport::TestCase
    assert_equal up_times, up_count
  end

-  def app_response(app: @app)
-    Net::HTTP.get_response(URI.parse("http://#{app_host(app)}:12345/version"))
+  def app_response(app: @app, cert: nil)
+    uri = cert ? URI.parse("https://#{app_host(app)}:22443/version") : URI.parse("http://#{app_host(app)}:12345/version")
+
+    if cert
+      https_response_with_cert(uri, cert)
+    else
+      Net::HTTP.get_response(uri)
+    end
  end

  def update_app_rev
@@ -186,4 +192,19 @@ class IntegrationTest < ActiveSupport::TestCase
      "localhost"
    end
  end

+  def https_response_with_cert(uri, cert)
+    host = uri.host
+    port = uri.port
+
+    http = Net::HTTP.new(uri.host, uri.port)
+    http.use_ssl = true
+
+    store = OpenSSL::X509::Store.new
+    store.add_cert(OpenSSL::X509::Certificate.new(File.read(cert)))
+    http.cert_store = store
+
+    request = Net::HTTP::Get.new(uri)
+    http.request(request)
+  end
end
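The pinned-trust-store idea in `https_response_with_cert` (trust only one certificate, ignore the system CAs) can be exercised without a live server. This is a standalone sketch using only the Ruby stdlib, with a throwaway self-signed certificate standing in for the `cert.pem` fixture:

```ruby
require "openssl"

# Generate a throwaway self-signed certificate (illustrative stand-in for cert.pem).
key = OpenSSL::PKey::RSA.new(2048)
cert = OpenSSL::X509::Certificate.new
cert.version = 2
cert.serial = 1
cert.subject = cert.issuer = OpenSSL::X509::Name.parse("/CN=localhost")
cert.public_key = key.public_key
cert.not_before = Time.now
cert.not_after = Time.now + 3600
cert.sign(key, OpenSSL::Digest::SHA256.new)

# Trust only this one certificate, as the test helper does with the file on disk.
store = OpenSSL::X509::Store.new
store.add_cert(cert)

# The self-signed cert acts as its own root, so it verifies against the store.
puts store.verify(cert)
```

Assigning such a store to `Net::HTTP#cert_store` (with `use_ssl = true`) makes the client reject any server certificate not rooted in it.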


@@ -142,8 +142,19 @@ class MainTest < IntegrationTest
    assert_app_is_up version: first_version
  end

+  test "deploy with a custom certificate" do
+    @app = "app_with_custom_certificate"
+
+    first_version = latest_app_version
+
+    kamal :setup
+    assert_app_is_up version: first_version, cert: "test/integration/docker/deployer/app_with_custom_certificate/certs/cert.pem"
+  end
+
  private
    def assert_envs(version:)
+      assert_env :KAMAL_HOST, "vm1", version: version, vm: :vm1
      assert_env :CLEAR_TOKEN, "4321", version: version, vm: :vm1
      assert_env :HOST_TOKEN, "abcd", version: version, vm: :vm1
      assert_env :SECRET_TOKEN, "1234 with \"中文\"", version: version, vm: :vm1

@@ -15,57 +15,111 @@ class BitwardenSecretsManagerAdapterTest < SecretAdapterTestCase
    stub_ticks.with("bws --version 2> /dev/null")
    stub_login

    stub_ticks
-      .with("bws secret list -o env")
-      .returns("KAMAL_REGISTRY_PASSWORD=\"some_password\"\nMY_OTHER_SECRET=\"my=weird\"secret\"")
+      .with("bws secret list")
+      .returns(<<~JSON)
+        [
+          {
+            "key": "KAMAL_REGISTRY_PASSWORD",
+            "value": "some_password"
+          },
+          {
+            "key": "MY_OTHER_SECRET",
+            "value": "my=wierd\\"secret"
+          }
+        ]
+      JSON

-    expected = '{"KAMAL_REGISTRY_PASSWORD":"some_password","MY_OTHER_SECRET":"my\=weird\"secret"}'
-    actual = shellunescape(run_command("fetch", "all"))
-    assert_equal expected, actual
+    json = JSON.parse(shellunescape(run_command("fetch", "all")))
+
+    expected_json = {
+      "KAMAL_REGISTRY_PASSWORD"=>"some_password",
+      "MY_OTHER_SECRET"=>"my=wierd\"secret"
+    }
+
+    assert_equal expected_json, json
  end

  test "fetch all with from" do
    stub_ticks.with("bws --version 2> /dev/null")
    stub_login

    stub_ticks
-      .with("bws secret list -o env 82aeb5bd-6958-4a89-8197-eacab758acce")
-      .returns("KAMAL_REGISTRY_PASSWORD=\"some_password\"\nMY_OTHER_SECRET=\"my=weird\"secret\"")
+      .with("bws secret list 82aeb5bd-6958-4a89-8197-eacab758acce")
+      .returns(<<~JSON)
+        [
+          {
+            "key": "KAMAL_REGISTRY_PASSWORD",
+            "value": "some_password"
+          },
+          {
+            "key": "MY_OTHER_SECRET",
+            "value": "my=wierd\\"secret"
+          }
+        ]
+      JSON

-    expected = '{"KAMAL_REGISTRY_PASSWORD":"some_password","MY_OTHER_SECRET":"my\=weird\"secret"}'
-    actual = shellunescape(run_command("fetch", "all", "--from", "82aeb5bd-6958-4a89-8197-eacab758acce"))
-    assert_equal expected, actual
+    json = JSON.parse(shellunescape(run_command("fetch", "all", "--from", "82aeb5bd-6958-4a89-8197-eacab758acce")))
+
+    expected_json = {
+      "KAMAL_REGISTRY_PASSWORD"=>"some_password",
+      "MY_OTHER_SECRET"=>"my=wierd\"secret"
+    }
+
+    assert_equal expected_json, json
  end

  test "fetch item" do
    stub_ticks.with("bws --version 2> /dev/null")
    stub_login

    stub_ticks
-      .with("bws secret get -o env 82aeb5bd-6958-4a89-8197-eacab758acce")
-      .returns("KAMAL_REGISTRY_PASSWORD=\"some_password\"")
+      .with("bws secret get 82aeb5bd-6958-4a89-8197-eacab758acce")
+      .returns(<<~JSON)
+        {
+          "key": "KAMAL_REGISTRY_PASSWORD",
+          "value": "some_password"
+        }
+      JSON

-    expected = '{"KAMAL_REGISTRY_PASSWORD":"some_password"}'
-    actual = shellunescape(run_command("fetch", "82aeb5bd-6958-4a89-8197-eacab758acce"))
-    assert_equal expected, actual
+    json = JSON.parse(shellunescape(run_command("fetch", "82aeb5bd-6958-4a89-8197-eacab758acce")))
+
+    expected_json = {
+      "KAMAL_REGISTRY_PASSWORD"=>"some_password"
+    }
+
+    assert_equal expected_json, json
  end

  test "fetch with multiple items" do
    stub_ticks.with("bws --version 2> /dev/null")
    stub_login

    stub_ticks
-      .with("bws secret get -o env 82aeb5bd-6958-4a89-8197-eacab758acce")
-      .returns("KAMAL_REGISTRY_PASSWORD=\"some_password\"")
+      .with("bws secret get 82aeb5bd-6958-4a89-8197-eacab758acce")
+      .returns(<<~JSON)
+        {
+          "key": "KAMAL_REGISTRY_PASSWORD",
+          "value": "some_password"
+        }
+      JSON

    stub_ticks
-      .with("bws secret get -o env 6f8cdf27-de2b-4c77-a35d-07df8050e332")
-      .returns("MY_OTHER_SECRET=\"my=weird\"secret\"")
+      .with("bws secret get 6f8cdf27-de2b-4c77-a35d-07df8050e332")
+      .returns(<<~JSON)
+        {
+          "key": "MY_OTHER_SECRET",
+          "value": "my=wierd\\"secret"
+        }
+      JSON

-    expected = '{"KAMAL_REGISTRY_PASSWORD":"some_password","MY_OTHER_SECRET":"my\=weird\"secret"}'
-    actual = shellunescape(run_command("fetch", "82aeb5bd-6958-4a89-8197-eacab758acce", "6f8cdf27-de2b-4c77-a35d-07df8050e332"))
-    assert_equal expected, actual
+    json = JSON.parse(shellunescape(run_command("fetch", "82aeb5bd-6958-4a89-8197-eacab758acce", "6f8cdf27-de2b-4c77-a35d-07df8050e332")))
+
+    expected_json = {
+      "KAMAL_REGISTRY_PASSWORD"=>"some_password",
+      "MY_OTHER_SECRET"=>"my=wierd\"secret"
+    }
+
+    assert_equal expected_json, json
  end

  test "fetch all empty" do
    stub_ticks.with("bws --version 2> /dev/null")
    stub_login
-    stub_ticks_with("bws secret list -o env", succeed: false).returns("Error:\n0: Received error message from server")
+    stub_ticks_with("bws secret list", succeed: false).returns("Error:\n0: Received error message from server")

    error = assert_raises RuntimeError do
      (shellunescape(run_command("fetch", "all")))
@@ -76,8 +130,8 @@ class BitwardenSecretsManagerAdapterTest < SecretAdapterTestCase
  test "fetch nonexistent item" do
    stub_ticks.with("bws --version 2> /dev/null")
    stub_login
-    stub_ticks_with("bws secret get -o env 82aeb5bd-6958-4a89-8197-eacab758acce", succeed: false)
-      .returns("ERROR (RuntimeError): Could not read 82aeb5bd-6958-4a89-8197-eacab758acce from Bitwarden Secrets Manager")
+    stub_ticks_with("bws secret get 82aeb5bd-6958-4a89-8197-eacab758acce", succeed: false)
+      .returns("Error:\n0: Received error message from server")

    error = assert_raises RuntimeError do
      (shellunescape(run_command("fetch", "82aeb5bd-6958-4a89-8197-eacab758acce")))
@@ -85,9 +139,29 @@ class BitwardenSecretsManagerAdapterTest < SecretAdapterTestCase
    assert_equal("Could not read 82aeb5bd-6958-4a89-8197-eacab758acce from Bitwarden Secrets Manager", error.message)
  end

+  test "fetch item with linebreak in value" do
+    stub_ticks.with("bws --version 2> /dev/null")
+    stub_login
+
+    stub_ticks
+      .with("bws secret get 82aeb5bd-6958-4a89-8197-eacab758acce")
+      .returns(<<~JSON)
+        {
+          "key": "SSH_PRIVATE_KEY",
+          "value": "some_key\\nwith_linebreak"
+        }
+      JSON
+
+    json = JSON.parse(shellunescape(run_command("fetch", "82aeb5bd-6958-4a89-8197-eacab758acce")))
+
+    expected_json = {
+      "SSH_PRIVATE_KEY"=>"some_key\nwith_linebreak"
+    }
+
+    assert_equal expected_json, json
+  end
+
  test "fetch with no access token" do
    stub_ticks.with("bws --version 2> /dev/null")
-    stub_ticks_with("bws run 'echo OK'", succeed: false)
+    stub_ticks_with("bws project list", succeed: false)

    error = assert_raises RuntimeError do
      (shellunescape(run_command("fetch", "all")))
@@ -106,7 +180,7 @@ class BitwardenSecretsManagerAdapterTest < SecretAdapterTestCase
  private
    def stub_login
-      stub_ticks.with("bws run 'echo OK'").returns("OK")
+      stub_ticks.with("bws project list").returns("OK")
    end

    def run_command(*command)
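The switch above from `bws secret list -o env` to JSON output avoids the quoting problems the old env format had with embedded quotes and linebreaks. A minimal sketch of the parsing side (hypothetical code, not the adapter's internals) reduces the array of `{key, value}` objects to a flat hash:

```ruby
require "json"

# Sample payload in the shape `bws secret list` returns (values are illustrative).
payload = <<~JSON
  [
    { "key": "KAMAL_REGISTRY_PASSWORD", "value": "some_password" },
    { "key": "MY_OTHER_SECRET", "value": "my=wierd\\"secret" }
  ]
JSON

# Reduce the array of {key, value} objects to a flat name => secret hash.
secrets = JSON.parse(payload).to_h { |item| [ item["key"], item["value"] ] }

puts secrets["MY_OTHER_SECRET"]  # the embedded quote survives intact: my=wierd"secret
```

Because the values arrive JSON-encoded, characters like `"`, `=`, and `\n` round-trip without the ad-hoc escaping the env format required.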


@@ -6,7 +6,7 @@ class SecretsOnePasswordAdapterTest < SecretAdapterTestCase
    stub_ticks.with("op account get --account myaccount 2> /dev/null")

    stub_ticks
-      .with("op item get myitem --vault \"myvault\" --fields \"label=section.SECRET1,label=section.SECRET2,label=section2.SECRET3\" --format \"json\" --account \"myaccount\"")
+      .with("op item get myitem --vault \"myvault\" --format \"json\" --account \"myaccount\" --fields \"label=section.SECRET1,label=section.SECRET2,label=section2.SECRET3\"")
      .returns(<<~JSON)
        [
          {
@@ -61,7 +61,7 @@ class SecretsOnePasswordAdapterTest < SecretAdapterTestCase
    stub_ticks.with("op account get --account myaccount 2> /dev/null")

    stub_ticks
-      .with("op item get myitem --vault \"myvault\" --fields \"label=section.SECRET1,label=section.SECRET2\" --format \"json\" --account \"myaccount\"")
+      .with("op item get myitem --vault \"myvault\" --format \"json\" --account \"myaccount\" --fields \"label=section.SECRET1,label=section.SECRET2\"")
      .returns(<<~JSON)
        [
          {
@@ -90,7 +90,7 @@ class SecretsOnePasswordAdapterTest < SecretAdapterTestCase
      JSON

    stub_ticks
-      .with("op item get myitem2 --vault \"myvault\" --fields \"label=section2.SECRET3\" --format \"json\" --account \"myaccount\"")
+      .with("op item get myitem2 --vault \"myvault\" --format \"json\" --account \"myaccount\" --fields \"label=section2.SECRET3\"")
      .returns(<<~JSON)
        {
          "id": "aaaaaaaaaaaaaaaaaaaaaaaaaa",
@@ -116,6 +116,63 @@ class SecretsOnePasswordAdapterTest < SecretAdapterTestCase
    assert_equal expected_json, json
  end

+  test "fetch all fields" do
+    stub_ticks.with("op --version 2> /dev/null")
+    stub_ticks.with("op account get --account myaccount 2> /dev/null")
+
+    stub_ticks
+      .with("op item get myitem --vault \"myvault\" --format \"json\" --account \"myaccount\"")
+      .returns(<<~JSON)
+        {
+          "id": "ucbtiii777",
+          "title": "A title",
+          "version": 45,
+          "vault": {
+            "id": "vu7ki98do",
+            "name": "Vault"
+          },
+          "category": "LOGIN",
+          "last_edited_by": "ABCT3684BC",
+          "created_at": "2025-05-22T06:47:01Z",
+          "updated_at": "2025-05-22T00:36:48.02598-07:00",
+          "additional_information": "",
+          "fields": [
+            {
+              "id": "aaaaaaaaaaaaaaaaaaaaaaaaaa",
+              "section": {
+                "id": "cccccccccccccccccccccccccc",
+                "label": "section"
+              },
+              "type": "CONCEALED",
+              "label": "SECRET1",
+              "value": "VALUE1",
+              "reference": "op://myvault/myitem/section/SECRET1"
+            },
+            {
+              "id": "bbbbbbbbbbbbbbbbbbbbbbbbbb",
+              "section": {
+                "id": "cccccccccccccccccccccccccc",
+                "label": "section"
+              },
+              "type": "CONCEALED",
+              "label": "SECRET2",
+              "value": "VALUE2",
+              "reference": "op://myvault/myitem/section/SECRET2"
+            }
+          ]
+        }
+      JSON
+
+    json = JSON.parse(shellunescape(run_command("fetch", "--from", "op://myvault/myitem")))
+
+    expected_json = {
+      "myvault/myitem/section/SECRET1"=>"VALUE1",
+      "myvault/myitem/section/SECRET2"=>"VALUE2"
+    }
+
+    assert_equal expected_json, json
+  end
+
  test "fetch with signin, no session" do
    stub_ticks.with("op --version 2> /dev/null")
@@ -123,7 +180,7 @@ class SecretsOnePasswordAdapterTest < SecretAdapterTestCase
    stub_ticks_with("op signin --account \"myaccount\" --force --raw", succeed: true).returns("")

    stub_ticks
-      .with("op item get myitem --vault \"myvault\" --fields \"label=section.SECRET1\" --format \"json\" --account \"myaccount\"")
+      .with("op item get myitem --vault \"myvault\" --format \"json\" --account \"myaccount\" --fields \"label=section.SECRET1\"")
      .returns(single_item_json)

    json = JSON.parse(shellunescape(run_command("fetch", "--from", "op://myvault/myitem", "section/SECRET1")))
@@ -142,7 +199,7 @@ class SecretsOnePasswordAdapterTest < SecretAdapterTestCase
    stub_ticks_with("op signin --account \"myaccount\" --force --raw", succeed: true).returns("1234567890")

    stub_ticks
-      .with("op item get myitem --vault \"myvault\" --fields \"label=section.SECRET1\" --format \"json\" --account \"myaccount\" --session \"1234567890\"")
+      .with("op item get myitem --vault \"myvault\" --format \"json\" --account \"myaccount\" --session \"1234567890\" --fields \"label=section.SECRET1\"")
      .returns(single_item_json)

    json = JSON.parse(shellunescape(run_command("fetch", "--from", "op://myvault/myitem", "section/SECRET1")))
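The expected keys in the `fetch all fields` test follow the `op://vault/item/section/label` reference paths that `op item get --format json` reports per field. A hedged sketch of that flattening (names invented here, not the adapter's internals):

```ruby
require "json"

# Trimmed-down item in the shape `op item get --format "json"` returns.
item = JSON.parse(<<~JSON)
  {
    "title": "A title",
    "fields": [
      { "section": { "label": "section" }, "label": "SECRET1", "value": "VALUE1",
        "reference": "op://myvault/myitem/section/SECRET1" },
      { "section": { "label": "section" }, "label": "SECRET2", "value": "VALUE2",
        "reference": "op://myvault/myitem/section/SECRET2" }
    ]
  }
JSON

# Key each field by its reference path, minus the op:// scheme.
secrets = item["fields"].to_h do |field|
  [ field["reference"].delete_prefix("op://"), field["value"] ]
end

puts secrets  # {"myvault/myitem/section/SECRET1"=>"VALUE1", "myvault/myitem/section/SECRET2"=>"VALUE2"}
```

Keying on the full reference path (rather than just the label) keeps fields from different sections or items from colliding.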


@@ -0,0 +1,474 @@
require "test_helper"
class PassboltAdapterTest < SecretAdapterTestCase
setup do
`true` # Ensure $? is 0
end
test "fetch" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: true)
stub_ticks.with("passbolt verify 2> /dev/null", succeed: true)
stub_ticks
.with("passbolt list resources --filter 'Name == \"SECRET1\" || Name == \"FSECRET1\" || Name == \"FSECRET2\"' --json")
.returns(<<~JSON)
[
{
"id": "4c116996-f6d0-4342-9572-0d676f75b3ac",
"folder_parent_id": "",
"name": "FSECRET1",
"username": "",
"uri": "",
"password": "fsecret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:29Z",
"modified_timestamp": "2025-02-21T06:04:29Z"
},
{
"id": "62949b26-4957-43fe-9523-294d66861499",
"folder_parent_id": "",
"name": "FSECRET2",
"username": "",
"uri": "",
"password": "fsecret2",
"description": "",
"created_timestamp": "2025-02-21T06:04:34Z",
"modified_timestamp": "2025-02-21T06:04:34Z"
},
{
"id": "dd32963c-0db5-4303-a6fc-22c5229dabef",
"folder_parent_id": "",
"name": "SECRET1",
"username": "",
"uri": "",
"password": "secret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:23Z",
"modified_timestamp": "2025-02-21T06:04:23Z"
}
]
JSON
json = JSON.parse(
shellunescape run_command("fetch", "SECRET1", "FSECRET1", "FSECRET2")
)
expected_json = {
"SECRET1"=>"secret1",
"FSECRET1"=>"fsecret1",
"FSECRET2"=>"fsecret2"
}
assert_equal expected_json, json
end
test "fetch with --from" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: true)
stub_ticks.with("passbolt verify 2> /dev/null", succeed: true)
stub_ticks
.with("passbolt list folders --filter 'Name == \"my-project\"' --json")
.returns(<<~JSON)
[
{
"id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"folder_parent_id": "",
"name": "my-project",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
}
]
JSON
stub_ticks
.with("passbolt list resources --filter '(Name == \"SECRET1\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\") || (Name == \"FSECRET1\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\") || (Name == \"FSECRET2\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\")' --folder dcbe0e39-42d8-42db-9637-8256b9f2f8e3 --json")
.returns(<<~JSON)
[
{
"id": "4c116996-f6d0-4342-9572-0d676f75b3ac",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "FSECRET1",
"username": "",
"uri": "",
"password": "fsecret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:29Z",
"modified_timestamp": "2025-02-21T06:04:29Z"
},
{
"id": "62949b26-4957-43fe-9523-294d66861499",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "FSECRET2",
"username": "",
"uri": "",
"password": "fsecret2",
"description": "",
"created_timestamp": "2025-02-21T06:04:34Z",
"modified_timestamp": "2025-02-21T06:04:34Z"
},
{
"id": "dd32963c-0db5-4303-a6fc-22c5229dabef",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "SECRET1",
"username": "",
"uri": "",
"password": "secret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:23Z",
"modified_timestamp": "2025-02-21T06:04:23Z"
}
]
JSON
json = JSON.parse(
shellunescape run_command("fetch", "--from", "my-project", "SECRET1", "FSECRET1", "FSECRET2")
)
expected_json = {
"SECRET1"=>"secret1",
"FSECRET1"=>"fsecret1",
"FSECRET2"=>"fsecret2"
}
assert_equal expected_json, json
end
test "fetch with folder in secret" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: true)
stub_ticks.with("passbolt verify 2> /dev/null", succeed: true)
stub_ticks
.with("passbolt list folders --filter 'Name == \"my-project\"' --json")
.returns(<<~JSON)
[
{
"id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"folder_parent_id": "",
"name": "my-project",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
}
]
JSON
stub_ticks
.with("passbolt list resources --filter '(Name == \"SECRET1\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\") || (Name == \"FSECRET1\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\") || (Name == \"FSECRET2\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\")' --folder dcbe0e39-42d8-42db-9637-8256b9f2f8e3 --json")
.returns(<<~JSON)
[
{
"id": "4c116996-f6d0-4342-9572-0d676f75b3ac",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "FSECRET1",
"username": "",
"uri": "",
"password": "fsecret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:29Z",
"modified_timestamp": "2025-02-21T06:04:29Z"
},
{
"id": "62949b26-4957-43fe-9523-294d66861499",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "FSECRET2",
"username": "",
"uri": "",
"password": "fsecret2",
"description": "",
"created_timestamp": "2025-02-21T06:04:34Z",
"modified_timestamp": "2025-02-21T06:04:34Z"
},
{
"id": "dd32963c-0db5-4303-a6fc-22c5229dabef",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "SECRET1",
"username": "",
"uri": "",
"password": "secret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:23Z",
"modified_timestamp": "2025-02-21T06:04:23Z"
}
]
JSON
json = JSON.parse(
shellunescape run_command("fetch", "my-project/SECRET1", "my-project/FSECRET1", "my-project/FSECRET2")
)
expected_json = {
"SECRET1"=>"secret1",
"FSECRET1"=>"fsecret1",
"FSECRET2"=>"fsecret2"
}
assert_equal expected_json, json
end
test "fetch from multiple folders" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: true)
stub_ticks.with("passbolt verify 2> /dev/null", succeed: true)
stub_ticks
.with("passbolt list folders --filter 'Name == \"my-project\" || Name == \"other-project\"' --json")
.returns(<<~JSON)
[
{
"id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"folder_parent_id": "",
"name": "my-project",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
},
{
"id": "14e11dd8-b279-4689-8bd9-fa33ebb527da",
"folder_parent_id": "",
"name": "other-project",
"created_timestamp": "2025-02-21T20:00:29Z",
"modified_timestamp": "2025-02-21T20:00:29Z"
}
]
JSON
stub_ticks
.with("passbolt list resources --filter '(Name == \"SECRET1\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\") || (Name == \"FSECRET1\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\") || (Name == \"FSECRET2\" && FolderParentID == \"14e11dd8-b279-4689-8bd9-fa33ebb527da\")' --folder dcbe0e39-42d8-42db-9637-8256b9f2f8e3 --folder 14e11dd8-b279-4689-8bd9-fa33ebb527da --json")
.returns(<<~JSON)
[
{
"id": "4c116996-f6d0-4342-9572-0d676f75b3ac",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "FSECRET1",
"username": "",
"uri": "",
"password": "fsecret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:29Z",
"modified_timestamp": "2025-02-21T06:04:29Z"
},
{
"id": "62949b26-4957-43fe-9523-294d66861499",
"folder_parent_id": "14e11dd8-b279-4689-8bd9-fa33ebb527da",
"name": "FSECRET2",
"username": "",
"uri": "",
"password": "fsecret2",
"description": "",
"created_timestamp": "2025-02-21T06:04:34Z",
"modified_timestamp": "2025-02-21T06:04:34Z"
},
{
"id": "dd32963c-0db5-4303-a6fc-22c5229dabef",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "SECRET1",
"username": "",
"uri": "",
"password": "secret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:23Z",
"modified_timestamp": "2025-02-21T06:04:23Z"
}
]
JSON
json = JSON.parse(
shellunescape run_command("fetch", "my-project/SECRET1", "my-project/FSECRET1", "other-project/FSECRET2")
)
expected_json = {
"SECRET1"=>"secret1",
"FSECRET1"=>"fsecret1",
"FSECRET2"=>"fsecret2"
}
assert_equal expected_json, json
end
test "fetch from nested folder" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: true)
stub_ticks.with("passbolt verify 2> /dev/null", succeed: true)
stub_ticks
.with("passbolt list folders --filter 'Name == \"my-project\"' --json")
.returns(<<~JSON)
[
{
"id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"folder_parent_id": "",
"name": "my-project",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
}
]
JSON
stub_ticks
.with("passbolt list folders --filter 'Name == \"subfolder\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\"' --json")
.returns(<<~JSON)
[
{
"id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "subfolder",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
}
]
JSON
stub_ticks
.with("passbolt list resources --filter '(Name == \"SECRET1\" && FolderParentID == \"6a3f21fc-aa40-4ba9-852c-7477fdd0310d\") || (Name == \"FSECRET1\" && FolderParentID == \"6a3f21fc-aa40-4ba9-852c-7477fdd0310d\") || (Name == \"FSECRET2\" && FolderParentID == \"6a3f21fc-aa40-4ba9-852c-7477fdd0310d\")' --folder dcbe0e39-42d8-42db-9637-8256b9f2f8e3 --folder 6a3f21fc-aa40-4ba9-852c-7477fdd0310d --json")
.returns(<<~JSON)
[
{
"id": "4c116996-f6d0-4342-9572-0d676f75b3ac",
"folder_parent_id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"name": "FSECRET1",
"username": "",
"uri": "",
"password": "fsecret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:29Z",
"modified_timestamp": "2025-02-21T06:04:29Z"
},
{
"id": "62949b26-4957-43fe-9523-294d66861499",
"folder_parent_id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"name": "FSECRET2",
"username": "",
"uri": "",
"password": "fsecret2",
"description": "",
"created_timestamp": "2025-02-21T06:04:34Z",
"modified_timestamp": "2025-02-21T06:04:34Z"
},
{
"id": "dd32963c-0db5-4303-a6fc-22c5229dabef",
"folder_parent_id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"name": "SECRET1",
"username": "",
"uri": "",
"password": "secret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:23Z",
"modified_timestamp": "2025-02-21T06:04:23Z"
}
]
JSON
json = JSON.parse(
shellunescape run_command("fetch", "--from", "my-project/subfolder", "SECRET1", "FSECRET1", "FSECRET2")
)
expected_json = {
"SECRET1"=>"secret1",
"FSECRET1"=>"fsecret1",
"FSECRET2"=>"fsecret2"
}
assert_equal expected_json, json
end
test "fetch from nested folder in secret" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: true)
stub_ticks.with("passbolt verify 2> /dev/null", succeed: true)
stub_ticks
.with("passbolt list folders --filter 'Name == \"my-project\"' --json")
.returns(<<~JSON)
[
{
"id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"folder_parent_id": "",
"name": "my-project",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
}
]
JSON
stub_ticks
.with("passbolt list folders --filter 'Name == \"subfolder\" && FolderParentID == \"dcbe0e39-42d8-42db-9637-8256b9f2f8e3\"' --json")
.returns(<<~JSON)
[
{
"id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"folder_parent_id": "dcbe0e39-42d8-42db-9637-8256b9f2f8e3",
"name": "subfolder",
"created_timestamp": "2025-02-21T19:52:50Z",
"modified_timestamp": "2025-02-21T19:52:50Z"
}
]
JSON
stub_ticks
.with("passbolt list resources --filter '(Name == \"SECRET1\" && FolderParentID == \"6a3f21fc-aa40-4ba9-852c-7477fdd0310d\") || (Name == \"FSECRET1\" && FolderParentID == \"6a3f21fc-aa40-4ba9-852c-7477fdd0310d\") || (Name == \"FSECRET2\" && FolderParentID == \"6a3f21fc-aa40-4ba9-852c-7477fdd0310d\")' --folder dcbe0e39-42d8-42db-9637-8256b9f2f8e3 --folder 6a3f21fc-aa40-4ba9-852c-7477fdd0310d --json")
.returns(<<~JSON)
[
{
"id": "4c116996-f6d0-4342-9572-0d676f75b3ac",
"folder_parent_id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"name": "FSECRET1",
"username": "",
"uri": "",
"password": "fsecret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:29Z",
"modified_timestamp": "2025-02-21T06:04:29Z"
},
{
"id": "62949b26-4957-43fe-9523-294d66861499",
"folder_parent_id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"name": "FSECRET2",
"username": "",
"uri": "",
"password": "fsecret2",
"description": "",
"created_timestamp": "2025-02-21T06:04:34Z",
"modified_timestamp": "2025-02-21T06:04:34Z"
},
{
"id": "dd32963c-0db5-4303-a6fc-22c5229dabef",
"folder_parent_id": "6a3f21fc-aa40-4ba9-852c-7477fdd0310d",
"name": "SECRET1",
"username": "",
"uri": "",
"password": "secret1",
"description": "",
"created_timestamp": "2025-02-21T06:04:23Z",
"modified_timestamp": "2025-02-21T06:04:23Z"
}
]
JSON
json = JSON.parse(
shellunescape run_command("fetch", "my-project/subfolder/SECRET1", "my-project/subfolder/FSECRET1", "my-project/subfolder/FSECRET2")
)
expected_json = {
"SECRET1"=>"secret1",
"FSECRET1"=>"fsecret1",
"FSECRET2"=>"fsecret2"
}
assert_equal expected_json, json
end
test "fetch without CLI installed" do
stub_ticks_with("passbolt --version 2> /dev/null", succeed: false)
error = assert_raises RuntimeError do
JSON.parse(shellunescape(run_command("fetch", "HOST", "PORT")))
end
assert_equal "Passbolt CLI is not installed", error.message
end
private
def run_command(*command)
stdouted do
Kamal::Cli::Secrets.start \
[ *command,
"-c", "test/fixtures/deploy_with_accessories.yml",
"--adapter", "passbolt" ]
end
end
end
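The `--filter` expressions stubbed in the tests above are plain boolean strings over `Name` and `FolderParentID`. Building one for a set of resource names can be sketched like this (the helper name is invented for illustration; this is not the adapter's actual code):

```ruby
# Build a Passbolt CLI filter matching any of the given resource names,
# optionally scoping each clause to a parent folder ID.
def passbolt_filter(names, folder_id: nil)
  clauses = names.map do |name|
    if folder_id
      "(Name == \"#{name}\" && FolderParentID == \"#{folder_id}\")"
    else
      "Name == \"#{name}\""
    end
  end
  clauses.join(" || ")
end

puts passbolt_filter(%w[SECRET1 FSECRET1])
# Name == "SECRET1" || Name == "FSECRET1"

puts passbolt_filter(%w[SECRET1], folder_id: "dcbe0e39-42d8-42db-9637-8256b9f2f8e3")
# (Name == "SECRET1" && FolderParentID == "dcbe0e39-42d8-42db-9637-8256b9f2f8e3")
```

OR-ing the per-name clauses lets a single `passbolt list resources` call fetch every requested secret, matching the stubbed commands above.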


@@ -3,6 +3,7 @@ require "active_support/test_case"
require "active_support/testing/autorun"
require "active_support/testing/stream"
require "rails/test_unit/line_filtering"
+require "pty"
require "debug"
require "mocha/minitest" # using #stubs that can alter returns
require "minitest/autorun" # using #stub that take args
@@ -48,6 +49,27 @@ class ActiveSupport::TestCase
    capture(:stderr) { yield }.strip
  end

+  def stub_stdin_tty
+    PTY.open do |master, slave|
+      stub_stdin(master) { yield }
+    end
+  end
+
+  def stub_stdin_file
+    File.open("/dev/null", "r") do |file|
+      stub_stdin(file) { yield }
+    end
+  end
+
+  def stub_stdin(io)
+    original_stdin = STDIN.dup
+    STDIN.reopen(io)
+
+    yield
+  ensure
+    STDIN.reopen(original_stdin)
+    original_stdin.close
+  end
+
  def with_test_secrets(**files)
    setup_test_secrets(**files)
    yield
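The `stub_stdin` helper added above swaps the process-level standard input by duplicating and reopening the `STDIN` descriptor. The same trick works standalone with just the stdlib (a sketch; `with_stdin_from` is an illustrative name, not the helper itself):

```ruby
require "tempfile"

# Redirect STDIN to another IO, run the block, then restore the original descriptor.
def with_stdin_from(io)
  original_stdin = STDIN.dup
  STDIN.reopen(io)
  yield
ensure
  STDIN.reopen(original_stdin)
  original_stdin.close
end

Tempfile.create("stdin") do |file|
  file.puts "hello from fake stdin"
  file.rewind

  with_stdin_from(file) do
    puts STDIN.gets  # reads the tempfile contents instead of the terminal
  end
end
```

Reopening the descriptor (rather than reassigning the `$stdin` global) means even code that calls `STDIN` directly, or C extensions reading fd 0, see the substituted input, which is why the test helper takes this route.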