User Interface (UI) meta-package

In order to interact with High Throughput Compute (HTC) resources, you should have access to a User Interface, often referred to as a UI. This software environment provides all the tools required to interact with the different middleware stacks, as different sites may be running different Computing Elements (CEs), such as HTCondorCE and ARC-CE (CREAM is a legacy software stack that is no longer officially supported).

The UI contains a suite of clients and APIs that users and applications can use to access High Throughput Compute services.

It will also install the IGTF distribution.

The package relies on packages available in external repositories, such as UMD and EPEL, as shown in the deployment instructions below.

Installing and using the UI

Once the UI is installed, you will need to set it up to be able to interact with the resources available to a given Virtual Organisation (VO).

Deploying the UI

The UI is available as a package in the UMD software distribution, but it also requires additional software and configuration.

In order to help with deploying the UI, different solutions are possible:

  • Deploying the UI manually, using the packages available from the UMD repositories. Once the repositories have been configured by installing the umd-release package, install the ui meta-package, and configure the system to interact with the VOMS servers of the VO to be used (a quick verification sketch follows this list).

    # Install EPEL repository
    $ dnf install -y epel-release
    # Install UMD repositories, look for available UMD release on https://repository.egi.eu/
    # FIXME: As of 2024-08, fall back on WLCG + upstreams repositories in place of UMD repo
    $ dnf install -y https://linuxsoft.cern.ch/wlcg/el9/x86_64/wlcg-repo-1.0.0-1.el9.noarch.rpm
    $ dnf install -y https://research.cs.wisc.edu/htcondor/repo/23.x/htcondor-release-current.el9.noarch.rpm
    $ dnf install -y https://ecsft.cern.ch/dist/cvmfs/cvmfs-release/cvmfs-release-latest.noarch.rpm
    $ dnf config-manager --set-enabled crb
    $ dnf localinstall -y ui-*.rpm
  • Some Ansible roles are available in the EGI Federation GitHub organisation, mainly ansible-role-ui, which should be used together with ansible-role-VOMS-client, providing the software and material required for authentication and authorisation, and ansible-role-umd, which configures the software repositories from which all the software will be installed.

  • The repository ui-deployment provides a Terraform-based deployment that deploys a User Interface (UI) in a Cloud Compute virtual machine. This integrated deployment is based on the Ansible roles mentioned above, and should be adjusted to your environment and needs.
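
After a manual deployment, you may want to check that the meta-package and the client tools it is expected to pull in are actually available. This is a minimal sketch, assuming the ui meta-package installed (among others) the VOMS, HTCondor and ARC clients; adjust the list of commands to the clients you actually need.

# Check that the meta-package is installed
$ rpm -q ui
# Check that the expected client tools are on the PATH
# (the exact set of clients depends on the meta-package dependencies)
$ command -v voms-proxy-init condor_version arcinfo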

Manually configuring for a specific VO

If you have installed the ui meta-package manually from the UMD repository, you need to configure support for the VO(s) you want to use on the UI.

The VOMS configuration pages contain the information required to configure your UI so that it can interact with the VOMS server for your VO.

  • As an example, for the dteam VO, you can find the VOMS server address in the dteam VO ID card.

  • Then, looking at the dteam VOMS Configuration page, you can create:

    • /etc/grid-security/vomsdir/<vo-name>/<voms-hostname>.lsc, adjusting the file name according to the VO.

      • For dteam, the VOMS server is voms2.hellasgrid.gr, so the file would be named /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc with the content for the LSC configuration.

        /C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
        /C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016
        
    • /etc/vomses/<vo-name>-<voms-hostname>, adjusting the file name according to the VO.

      • For dteam, the VOMS server is voms2.hellasgrid.gr, so the file would be named /etc/vomses/dteam-voms2.hellasgrid.gr with the content of the VOMSES string.

        "dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"
        

If you cannot edit content in /etc/vomses and /etc/grid-security/vomsdir, you can respectively use ~/.glite/vomses and ~/.glite/vomsdir. You may have to export X509_VOMSES and X509_VOMS_DIR in your shell, as documented on CERN's twiki:

$ export X509_VOMSES=~/.glite/vomses
$ export X509_VOMS_DIR=~/.glite/vomsdir
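
Once the LSC and vomses files are in place, you can verify the configuration by requesting a VOMS proxy for the VO. This is a minimal sketch, assuming a valid personal grid certificate is installed in ~/.globus and that you are a member of the dteam VO.

# Request a VOMS proxy for the dteam VO
$ voms-proxy-init --voms dteam
# Inspect the proxy, including the VOMS attributes
$ voms-proxy-info --all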

Setting up a UI using Ansible

If you are using Ansible, the roles mentioned above can be used: ansible-role-umd, ansible-role-VOMS-client and ansible-role-ui.

The repository ui-deployment provides a Terraform-based deployment that deploys a User Interface (UI) in a Cloud Compute virtual machine. This integrated deployment is based on these Ansible roles, and should be adjusted to your environment and needs.
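
As an illustration, the roles could be pulled from GitHub and applied to the local machine with a minimal playbook. This is only a sketch: the playbook layout and the absence of role variables are assumptions, so check each role's README for the variables actually required (for example, to select the UMD release or the supported VOs).

# Install the roles from the EGI Federation GitHub organisation
$ ansible-galaxy install git+https://github.com/EGI-Federation/ansible-role-umd.git
$ ansible-galaxy install git+https://github.com/EGI-Federation/ansible-role-VOMS-client.git
$ ansible-galaxy install git+https://github.com/EGI-Federation/ansible-role-ui.git
# Write a minimal playbook applying the roles to the local machine
# (role variables are omitted here; see each role's documentation)
$ cat > ui.yml << 'EOF'
- hosts: localhost
  become: true
  roles:
    - ansible-role-umd
    - ansible-role-VOMS-client
    - ansible-role-ui
EOF
$ ansible-playbook -i "localhost," -c local ui.yml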

Building packages

Building the RPM

The required build dependencies are:

  • rpm-build
  • make
  • rsync
# Checkout tag to be packaged
$ git clone https://github.com/EGI-Federation/ui-metapackage.git
$ cd ui-metapackage
$ git checkout X.X.X
# Building in a container
$ docker run --rm -v $(pwd):/source -it almalinux:9
[root@bc96d4c5a232 /]# dnf install -y rpm-build make rsync rpmlint systemd-rpm-macros
[root@bc96d4c5a232 /]# cd /source && make rpm
[root@bc96d4c5a232 /]# rpmlint --file .rpmlint.ini build/RPMS/x86_64/*.rpm

The RPM will be available in the build/RPMS directory.
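
If you want to double-check the result before publishing it, the built package can be inspected with rpm. This is a small sketch; the exact architecture sub-directory and file name depend on the version being built.

# Show the package metadata of the freshly built RPM
$ rpm -qp --info build/RPMS/x86_64/ui-*.rpm
# List the dependencies pulled in by the meta-package
$ rpm -qp --requires build/RPMS/x86_64/ui-*.rpm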

Preparing a release

  • Prepare a changelog from the last version, including contributors' names (a git sketch for gathering this information follows this list)
  • Prepare a PR with
    • Updating version and changelog in ui.spec
    • Updating version and changelog in CHANGELOG
  • Once the PR has been merged, publish a new release using the GitHub web interface
    • Prefix the tag name to be created with v, like v1.0.0
    • Packages will be built using GitHub Actions and attached to the release page
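
A possible way to gather the changes and contributors since the last release, assuming the previous tag is v1.0.0 (adjust to the actual latest tag):

# List the commits since the last release tag
$ git log --oneline v1.0.0..HEAD
# List the contributors since the last release tag
$ git shortlog -sn v1.0.0..HEAD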

History

This work started under the EGEE project. It is now hosted on GitHub and maintained by the EGI Federation.
