
Conversation

@eugeneswalker
Contributor

@spackbot-triage spackbot-triage bot added the ci Issues related to Continuous Integration label Jan 8, 2026
@zackgalbreath
Contributor

It looks like we have these stacks disabled both within .gitlab-ci.yml and within the GitLab web UI. I'm going to run a new pipeline for this branch with the darwin stacks re-enabled to verify whether they still build successfully. If not, I would argue that we should also fix any regressions in these stacks as part of this PR.

@zackgalbreath
Contributor

Looks like we'll need to update some tags too?

[Screenshot 2026-01-08 at 9:57:18 AM]


@zackgalbreath zackgalbreath left a comment


need to update tags as well, see screenshot

@eugeneswalker
Contributor Author

> need to update tags as well, see screenshot

Just pushed this update. Thanks for pointing this out.

@eugeneswalker
Contributor Author

Our Tahoe upgrade seems to be causing a problem:

...
$ spack -v  --color=always ci generate --check-index-only -j ${SPACK_CONCRETIZE_JOBS} --forward-variable SPACK_CHECKOUT_VERSION --forward-variable SPACK_CHECKOUT_REPO --forward-variable SPACK_CI_PACKAGES_ROOT --forward-variable SPACK_CI_SPACK_ROOT --artifacts-root "${CI_PROJECT_DIR}/jobs_scratch_dir/${SPACK_CI_STACK_NAME}" --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/${SPACK_CI_STACK_NAME}/cloud-ci-pipeline.yml"
...
Traceback (most recent call last):
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/bin/spack", line 49, in <module>
    sys.exit(main())
             ~~~~^^
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/main.py", line 1123, in main
    return _main(argv)
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/main.py", line 1075, in _main
    return finish_parse_and_run(parser, cmd_name, args, env_format_error)
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/main.py", line 1106, in finish_parse_and_run
    return _invoke_command(command, parser, args, unknown)
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/main.py", line 649, in _invoke_command
    return_val = command(parser, args)
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/cmd/ci.py", line 891, in ci
    return args.func(args)
           ~~~~~~~~~^^^^^^
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/cmd/ci.py", line 257, in ci_generate
    spack_ci.generate_pipeline(env, args)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/ci/__init__.py", line 477, in generate_pipeline
    env.concretize()
    ~~~~~~~~~~~~~~^^
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/environment/environment.py", line 1566, in concretize
    return self._concretize_separately(tests=tests)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/environment/environment.py", line 1716, in _concretize_separately
    concretized_specs = spack.concretize.concretize_separately(to_concretize, tests=tests)
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/concretize.py", line 157, in concretize_separately
    for j, (i, concrete, duration) in enumerate(
                                      ~~~~~~~~~^
        spack.util.parallel.imap_unordered(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            _concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        )
        ^
    ):
    ^
  File "/Users/gitlab-runner-p-1/builds/t1_ZipHk_/0/spack/spack-packages/.ci/tmp/spack/lib/spack/spack/util/parallel.py", line 92, in imap_unordered
    raise RuntimeError(result.stacktrace if debug else str(result))
RuntimeError: failed to concretize `py-tensorflow-probability` for the following reasons:
     1. bazel: 'os=tahoe' conflicts with '@:7.6.1,8:8.4.1'
     2. bazel: 'os=tahoe' conflicts with '@:7.6,8:8.4.1'
     3. bazel: 'os=tahoe' conflicts with '@:7.6.1,8:8.4.1'
        required because conflict constraint @:7.6.1,8:8.4.1 
          required because py-tensorflow-probability depends on bazel@3.2:6 
            required because py-tensorflow-probability requested explicitly 
        required because conflict is triggered when os=tahoe 
          required because py-tensorflow-probability depends on bazel@3.2:6 
            required because py-tensorflow-probability requested explicitly 
     4. bazel: 'os=tahoe' conflicts with '@:7.6,8:8.4.1'
        required because conflict is triggered when os=tahoe 
          required because py-tensorflow-probability depends on bazel@3.2:6 
            required because py-tensorflow-probability requested explicitly 
        required because conflict constraint @:7.6,8:8.4.1 
          required because py-tensorflow-probability depends on bazel@3.2:6 
            required because py-tensorflow-probability requested explicitly 
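For context, the conflict reads as follows: on os=tahoe, bazel versions in the union of ranges `:7.6.1` and `8:8.4.1` are excluded, while py-tensorflow-probability pins `bazel@3.2:6`, which lies entirely inside the excluded range, so no bazel version can satisfy both constraints. A minimal sketch of how such a Spack-style range-union constraint matches (hypothetical helper, not Spack's actual solver):

```python
# Hypothetical helper illustrating how a Spack-style version-range union
# such as ":7.6.1,8:8.4.1" excludes versions. Not Spack's actual code.
def parse_tuple(version):
    return tuple(int(part) for part in version.split("."))

def in_ranges(version, ranges):
    """True if `version` falls in any "lo:hi" range (either bound may be empty)."""
    v = parse_tuple(version)
    for rng in ranges.split(","):
        lo, _, hi = rng.partition(":")
        if (not lo or parse_tuple(lo) <= v) and (not hi or v <= parse_tuple(hi)):
            return True
    return False

# py-tensorflow-probability requires bazel@3.2:6, but on os=tahoe every
# version matching ":7.6.1,8:8.4.1" conflicts -- and 3.2:6 lies entirely
# inside the first excluded range, so concretization has no solution.
```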

@adamjstewart adamjstewart self-assigned this Jan 12, 2026
@zackgalbreath
Contributor

running another manual pipeline for this branch because the mac stacks are still disabled in both places (.gitlab-ci.yml and the GitLab web UI)

@eugeneswalker
Contributor Author

All the darwin jobs, irrespective of stack, are failing with this type of error:

...
$ echo "$REMOTE_SCRIPT_CLONE_SPACK_SHA256  clone_spack.sh" | sha256sum -c
usage: sha256sum [-bctwz] [files ...]
Running after_script 00:00
Uploading artifacts for failed job 00:00
Uploading artifacts...
Runtime platform                                    arch=arm64 os=darwin pid=60250 revision=374d34fd version=17.6.0
WARNING: jobs_scratch_dir/ml-darwin-aarch64-mps/logs: no matching files. Ensure that the artifact path is relative to the working directory (/Users/gitlab-runner-1/builds/t1_4RcQ4y/0/spack/spack-packages) 
WARNING: jobs_scratch_dir/ml-darwin-aarch64-mps/reproduction: no matching files. Ensure that the artifact path is relative to the working directory (/Users/gitlab-runner-1/builds/t1_4RcQ4y/0/spack/spack-packages) 
WARNING: jobs_scratch_dir/ml-darwin-aarch64-mps/tests: no matching files. Ensure that the artifact path is relative to the working directory (/Users/gitlab-runner-1/builds/t1_4RcQ4y/0/spack/spack-packages) 
WARNING: jobs_scratch_dir/ml-darwin-aarch64-mps/user_data: no matching files. Ensure that the artifact path is relative to the working directory (/Users/gitlab-runner-1/builds/t1_4RcQ4y/0/spack/spack-packages) 
ERROR: No files to upload            
...
$ time ./spack/bin/spack python ${SPACK_CI_PACKAGES_ROOT}/.ci/gitlab/scripts/common/aggregate_package_logs.spack.py --prefix /home/software/spack:${CI_PROJECT_DIR}/opt/spack --log install_times.json ${SPACK_ARTIFACTS_ROOT}/user_data/install_times.json || true
bash: line 215: ./spack/bin/spack: No such file or directory
real	0m0.002s
user	0m0.000s
sys	0m0.001s
Uploading artifacts for failed job 00:00
Uploading artifacts...
Runtime platform                                    arch=arm64 os=darwin pid=51748 revision=374d34fd version=17.6.0
...

@eugeneswalker
Contributor Author

Looks like we need to rebase and make sure this fix is included:

Doing that next...

@eugeneswalker eugeneswalker force-pushed the ci-enable-darwin-again branch from 93d260d to 15e35fc Compare January 12, 2026 22:40
@eugeneswalker
Contributor Author

@zackgalbreath Can you re-trigger manual run with the right variables?

@zackgalbreath
Contributor

> @zackgalbreath Can you re-trigger manual run with the right variables?

Unfortunately, that didn't work. It looks like the problem is that the macOS version of sha256sum -c can't read a checksum list from stdin. I'll push a fix shortly...
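One portable way around the BSD-vs-GNU `sha256sum` differences is to hash the file directly instead of piping a checksum line to `-c`; a minimal sketch using Python's standard library (hypothetical helper, not necessarily the fix that was pushed):

```python
# Hedged sketch: verify a file's SHA-256 checksum portably with hashlib,
# avoiding platform-specific `sha256sum -c` behavior on macOS runners.
import hashlib

def verify_sha256(path, expected_hex):
    """Return True if the file at `path` hashes to `expected_hex`."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.lower()
```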

@zackgalbreath
Contributor

We got a little bit further that time. New error is:

KeyError: 'tahoe'

at lib/spack/spack/platforms/darwin.py#L47
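A `KeyError` like this is the typical failure mode of a lookup table of macOS code names that predates the newest release; a hypothetical sketch of the pattern (not Spack's actual table in `darwin.py`):

```python
# Hypothetical sketch of the failure mode: a codename->release table that
# predates the newest macOS release raises KeyError when a runner reports
# (or is tagged with) a codename the table does not know about.
MACOS_RELEASES = {"ventura": "13", "sonoma": "14", "sequoia": "15"}

def release_for(codename):
    try:
        return MACOS_RELEASES[codename]
    except KeyError:
        raise KeyError(
            f"unknown macOS codename {codename!r}; "
            f"known codenames: {sorted(MACOS_RELEASES)}"
        ) from None
```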

@zackgalbreath
Contributor

> KeyError: 'tahoe'

This looks like the relevant issue: spack/spack#51589

@zackgalbreath
Contributor

Upon closer inspection, I think the issue is that rho is tagged as tahoe but appears to still be running sequoia. For now I've paused the rho runners and kicked off another pipeline.
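One way to confirm which macOS release a runner is actually on, independent of how its CI tags describe it, is to query the OS directly; a minimal sketch using only the standard library:

```python
# Minimal sketch: report the OS release a host is actually running,
# regardless of how its CI runner happens to be tagged.
import platform

def actual_os():
    mac_release, _, _ = platform.mac_ver()
    if mac_release:  # non-empty only on macOS
        return f"macos {mac_release}"
    return f"{platform.system().lower()} {platform.release()}"
```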

@zackgalbreath
Contributor

We're getting a lot further this time, but I noticed some errors related to gfortran.

@eugeneswalker
Contributor Author

> We're getting a lot further this time, but I noticed some errors related to gfortran.

The issue seems to be that the GNU Fortran compiler version we are using is incompatible with Apple Clang after the upgrade. I'm working on this and will post an update here.

zackgalbreath previously approved these changes Jan 28, 2026
@zackgalbreath zackgalbreath enabled auto-merge (squash) January 28, 2026 22:08
@trws
Contributor

trws commented Jan 29, 2026

How does this play with #3144? They both seem to touch a lot of the same spots.

@adamjstewart
Member

@spackbot run pipeline

@spackbot-app

spackbot-app bot commented Jan 29, 2026

I've started that pipeline for you!

@adamjstewart
Member

> How does this play with #3144? They both seem to touch a lot of the same spots.

#3144 should be merged after this PR. This PR re-enables CI, #3144 reorganizes that CI.

@adamjstewart
Member

#3119 is now merged, you should be able to rebase to get CI finally passing!

@zackgalbreath zackgalbreath force-pushed the ci-enable-darwin-again branch from a024ff2 to b85a5b5 Compare January 30, 2026 12:18
adamjstewart previously approved these changes Jan 30, 2026