feat: enable generation with postprocessing of multiple service versions (#2342)

* use local synthtool and owlbot

* remove unused files

* remove more unused files

* remove cache files in owlbot

* use java 11 for it

* remove kokoro files

* use glob in owlbot entrypoint

* remove unused files

* do not do post-process IT on mac

* concise entrypoint logic

* cleanup i

* cleanup ii

* cleanup iii

* cleanup iv

* remove templates

* remove protos folder

* remove synthtool

* connect image to owlbot entrypoint

* simplify synthtool docker run command

* install synthtool locally

* install synthtool only once

* use virtualenvs to run python scripts

* install pyenv in action

* remove jar from history

* download google-java-format

* fix pyenv init

* attempt to fix pyenv installation in gh action

* fix manual pyenv installation

* install pyenv in profile

* install pyenv in bashrc as well

* use bash shell explicitly in gh action

* install pyenv in same step as IT

* do not restart shell

* set pyenv path manually

* install pyenv in its own step

* propagate environment variables to other steps

* fix global env var setup

* remove wrong env settings

* explain usage of pyenv in README

* simplify pyenv setup

* add comment to owlbot entrypoint

* rename destination_path to preprocessed_libraries_path

* infer scripts_root in postprocess_library.sh

* use temporary folder for preprocess step

* use owlbot files from workspace

* get rid of output_folder argument

* use common temp dir to clone synthtool into

* lock synthtool to a specific commitish

* fix file transfer

* fix owl-bot-staging unpacking

* remove unnecessary workspace variable

* rename workspace to postprocessing_target

* remove owlbot sha logic

* remove repository_root variable

* cleanup

* correct pyenv comment

* clean temp sources folder on each run

* safety checks for get_proto_path_from_preprocessed_sources

* fix integration test

* disable compute and asset/v1p2beta1 temporarily

they have changes in googleapis that have not been reflected yet in
google-cloud-java

* fix unit tests

* correct comment

* do not install docker for macos

* fix owlbot files check

* fix license headers

* remove unnecessary owlbot_sha

* add explanation on why there are no macos + postprocess ITs

* use `fmt:format` instead of google formatter

* clean templates

* remove more unnecessary elements

* add README entry explaining owlbot maintenance

* remove unnecessary java format version

* wip: create generate_composed_library.sh

* initial implementation of generate_composed_library.sh

* partial implementation of scripts

* wip: fixes after main changes

* partial implementation of scripts ii

* correct arg parsing

* fixes in python utils

* fix generate_library call

* correct argument preparation

* add gapic generator version arg check

* call generate_library successfully

* fix postprocessing step in generate_composed

* IT working for all libraries

* add unit tests

* fix comments in generate_composed_lib

* remove commented code

* prepare tests without postprocessing

* restore test functions

* fix rename utility for building owlbot folder

* correct linter problems

* install realpath on macos

* install utils also in unit tests

* include utilities.sh in showcase scripts

* comment py_util

* Update library_generation/generate_composed_library.sh

Co-authored-by: Tomo Suzuki <[email protected]>

* add comment explaining usage of coreutils in macos workflow

* explain that entrypoint.sh can be used for HW libraries

* use real world example for generate_composed_library.sh

* improve versions.txt explanation in generate_composed_library

* add return type in python utils

* fix versions file inference

* use ggj 2.29 to fix ITs temporarily

* disable asset due to license year mismatch

* improve comment in generate_composed_library

* restore latest generator

* remove wrong WORKSPACE comment

* improve composed_library input comment

* Update library_generation/utilities.py

Co-authored-by: Blake Li <[email protected]>

* remove postprocessing logic in generate_library

* add generated_composed_library.py

* use python script in IT

* iterative fixes

* ignore python cache

* iterative fixes ii

* Update library_generation/generate_composed_library.py

Co-authored-by: Joe Wang <[email protected]>

* Update library_generation/utilities.py

Co-authored-by: Joe Wang <[email protected]>

* Update library_generation/generate_composed_library.py

Co-authored-by: Tomo Suzuki <[email protected]>

* use underscores in configuration yaml

* initial model for gapic config yaml

* add requirements file

* introduce yaml parsing logic

* move parse logic to GenerationConfig

* adapt composed_library script

* fixes i

* set IT to dummy mode

* pass BUILD parse utils to production utils

* fixes ii - constructor calls and composed_library arguments

* fix destination path for partial generations

* adapt IT to process individual libraries

* use proper versions in configuration yaml

* add rest of the libraries to integration test

* add library_type to config yaml

* reference to parent directory in compare_poms

* handle script failures

* use library-specific googleapis_commitish

* fix protobuf version

* install python reqs in Github Action

* fix python version path logic

* add python unit tests

* remove obsolete py_util tests

* add python unit tests in workflow

* correct type hinting for Library

* fix comments

* set enable_postprocessing input to main.py to boolean

* add explanation on library_type

* remove old proto_path_list

* fix comments in IT

* remove WORKSPACE file usage

* add IT configuration yaml for java-bigtable

* fix config yaml for bigtable

* finish tests for HW bigtable

* install python in gh action, lint fix

* update ggj to 2.32.0

* fix showcase test

* remove commented line

* use owlbot cli sha from config yaml

* use python version from config yaml

* use synthtool commitish from config yaml

* add repository_path option

* make destination_path required

* add typing

* use python version from config yaml

* correct workflow indentation

* fix workflow syntax

* use repository_path when postprocessing

* correct runs-on in workflow

* use full path to repo in workflow

* add debug output in workflow

* decompose steps to compute python version

* checkout code in workflow

* fix function name in workflow

* use full path to obtain python version from yaml

* use correct path in python version workflow

* use set-output to share python version

* fix set-output call

* fix output setting in workflow

* correct python version letter case

* remove pyenv setup

* fix repository_path

* fix speech proto_path

* do not wipe out google-cloud-java in IT

* ensure correct version of python compares the poms

* fix diff check

* add return type for GenerationConfig.from_yaml

* add missing python unit tests

* use default values for enable_postprocessing

* use is_monorepo var, constant for google-cloud-java

* fixes for local run

* compute monorepo variable heuristically

* update generation configs

* remove python version

* rename Library to LibraryConfig

* rename GAPIC to GapicConfig

* remove quotes from grpc version

* move ClientInputs to model folder

* parse BUILD file using ClientInputs

* add unit tests for ClientInputs

* fix ClientInputs typo

* fix synthtool version

* disable compute test

* fix unit test

* fix compute test

* fix unit tests

* remove BUILD parsing shell utilities

* update monorepo special treatment comment

* fix typo in shebang

---------

Co-authored-by: Tomo Suzuki <[email protected]>
Co-authored-by: Blake Li <[email protected]>
Co-authored-by: Joe Wang <[email protected]>
4 people authored Jan 29, 2024
1 parent a34d897 commit 363e35e
Showing 32 changed files with 1,107 additions and 618 deletions.
43 changes: 38 additions & 5 deletions .github/workflows/verify_library_generation.yaml
@@ -25,7 +25,7 @@ jobs:
cache: maven
- uses: actions/setup-python@v4
with:
python-version: '3.11'
python-version: 3.11
- name: install pyenv
shell: bash
run: |
@@ -36,10 +36,23 @@
export PATH="$PYENV_ROOT/bin:$PATH"
echo "PYENV_ROOT=${PYENV_ROOT}" >> $GITHUB_ENV
echo "PATH=${PATH}" >> $GITHUB_ENV
# init pyenv
eval "$(pyenv init --path)"
eval "$(pyenv init -)"
set +ex
- name: install python dependencies
shell: bash
run: |
set -ex
pushd library_generation
pip install -r requirements.in
popd
- name: install utils (macos)
if: matrix.os == 'macos-12'
shell: bash
run: |
brew update --preinstall
# we need the `realpath` command to be available
brew install coreutils
- name: install docker (ubuntu)
if: matrix.os == 'ubuntu-22.04'
shell: bash
@@ -69,10 +82,30 @@
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
- name: Run unit tests
- name: install utils (macos)
if: matrix.os == 'macos-12'
shell: bash
run: |
brew update --preinstall
brew install coreutils
- uses: actions/setup-python@v4
with:
python-version: 3.11
- name: install python dependencies
shell: bash
run: |
set -ex
pushd library_generation
pip install -r requirements.in
popd
- name: Run shell unit tests
run: |
set -x
library_generation/test/generate_library_unit_tests.sh
- name: Run python unit tests
run: |
set -x
python -m unittest library_generation/test/unit_tests.py
lint:
runs-on: ubuntu-22.04
steps:
1 change: 1 addition & 0 deletions library_generation/.gitignore
@@ -0,0 +1 @@
__pycache__/
141 changes: 141 additions & 0 deletions library_generation/generate_composed_library.py
@@ -0,0 +1,141 @@
"""
This script allows generation of libraries that are composed of more than one
service version. It is achieved by calling `generate_library.sh` without
postprocessing for all service versions and then calling
postprocess_library.sh at the end, once all libraries are ready.
Prerequisites
- Needs a folder named `output` in the current working directory. This folder
is automatically detected by `generate_library.sh` and this script ensures it
contains the necessary folders and files, specifically:
- A "google" folder found in the googleapis/googleapis repository
- A "grafeas" folder found in the googleapis/googleapis repository
Note: googleapis repo is found in https://github.com/googleapis/googleapis.
"""

import click
import utilities as util
import os
import sys
import subprocess
import json
from model.GenerationConfig import GenerationConfig
from model.LibraryConfig import LibraryConfig
from model.ClientInputs import parse as parse_build_file

script_dir = os.path.dirname(os.path.realpath(__file__))

"""
Main function in charge of generating libraries composed of more than one
service or service version.
Arguments
- config: a GenerationConfig object representing a parsed configuration
yaml
- library: a LibraryConfig object contained inside config, passed here for
convenience and to avoid processing every library in the config
- enable_postprocessing: true if postprocessing should be done on the generated
libraries
- repository_path: path to the repository where the generated files will be
sent. If not specified, it will default to the one defined in the configuration yaml
and will be downloaded. The versions file will be inferred from this folder
"""
def generate_composed_library(
config: GenerationConfig,
library: LibraryConfig,
repository_path: str,
enable_postprocessing: bool = True,
) -> None:
output_folder = util.sh_util('get_output_folder')

print(f'output_folder: {output_folder}')
print('library: ', library)
os.makedirs(output_folder, exist_ok=True)

googleapis_commitish = config.googleapis_commitish
if library.googleapis_commitish is not None:
googleapis_commitish = library.googleapis_commitish
print('using library-specific googleapis commitish: ' + googleapis_commitish)
else:
print('using common googleapis_commitish')

print('removing old googleapis folders and files')
util.delete_if_exists(f'{output_folder}/google')
util.delete_if_exists(f'{output_folder}/grafeas')

print('downloading googleapis')
util.sh_util(f'download_googleapis_files_and_folders "{output_folder}" "{googleapis_commitish}"')

is_monorepo = len(config.libraries) > 1

base_arguments = []
base_arguments += util.create_argument('gapic_generator_version', config)
base_arguments += util.create_argument('grpc_version', config)
base_arguments += util.create_argument('protobuf_version', config)

library_name = f'java-{library.api_shortname}'
library_path = None

versions_file = ''
if is_monorepo:
print('this is a monorepo library')
destination_path = config.destination_path + '/' + library_name
library_folder = destination_path.split('/')[-1]
if repository_path is None:
print(f'sparse_cloning monorepo with {library_name}')
repository_path = f'{output_folder}/{config.destination_path}'
clone_out = util.sh_util(f'sparse_clone "https://github.com/googleapis/{MONOREPO_NAME}.git" "{library_folder} google-cloud-pom-parent google-cloud-jar-parent versions.txt .github"', cwd=output_folder)
print(clone_out)
library_path = f'{repository_path}/{library_name}'
versions_file = f'{repository_path}/versions.txt'
else:
print('this is a HW library')
destination_path = library_name
if repository_path is None:
repository_path = f'{output_folder}/{destination_path}'
util.delete_if_exists(f'{output_folder}/{destination_path}')
clone_out = util.sh_util(f'git clone "https://github.com/googleapis/{destination_path}.git"', cwd=output_folder)
print(clone_out)
library_path = f'{repository_path}'
versions_file = f'{repository_path}/versions.txt'

owlbot_cli_source_folder = util.sh_util('mktemp -d')
for gapic in library.gapic_configs:

effective_arguments = list(base_arguments)
effective_arguments += util.create_argument('proto_path', gapic)

build_file_folder = f'{output_folder}/{gapic.proto_path}'
print(f'build_file_folder: {build_file_folder}')
client_inputs = parse_build_file(build_file_folder, gapic.proto_path)
effective_arguments += [
'--proto_only', client_inputs.proto_only,
'--gapic_additional_protos', client_inputs.additional_protos,
'--transport', client_inputs.transport,
'--rest_numeric_enums', client_inputs.rest_numeric_enum,
'--gapic_yaml', client_inputs.gapic_yaml,
'--service_config', client_inputs.service_config,
'--service_yaml', client_inputs.service_yaml,
'--include_samples', client_inputs.include_samples,
]
service_version = gapic.proto_path.split('/')[-1]
temp_destination_path = f'java-{library.api_shortname}-{service_version}'
effective_arguments += [ '--destination_path', temp_destination_path ]
print('arguments: ')
print(effective_arguments)
print(f'Generating library from {gapic.proto_path} to {destination_path}...')
util.run_process_and_print_output(['bash', '-x', f'{script_dir}/generate_library.sh',
*effective_arguments], 'Library generation')


if enable_postprocessing:
util.sh_util(f'build_owlbot_cli_source_folder "{library_path}"'
+ f' "{owlbot_cli_source_folder}" "{output_folder}/{temp_destination_path}"'
+ f' "{gapic.proto_path}"',
cwd=output_folder)

if enable_postprocessing:
# call postprocess library
util.run_process_and_print_output([f'{script_dir}/postprocess_library.sh',
f'{library_path}', '', versions_file, owlbot_cli_source_folder,
config.owlbot_cli_image, config.synthtool_commitish, str(is_monorepo).lower()], 'Library postprocessing')

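As a standalone illustration (not part of the diff), the per-service staging logic above derives the service version from the last segment of `proto_path` and builds a `java-{api_shortname}-{version}` folder name. A minimal sketch, with a hypothetical helper name chosen here for clarity:

```python
# Hypothetical sketch of the staging-path derivation used in
# generate_composed_library.py; the function name is illustrative only.
def temp_destination_path(api_shortname: str, proto_path: str) -> str:
    # The service version is the final segment of the proto path,
    # e.g. "google/cloud/speech/v2" -> "v2".
    service_version = proto_path.split('/')[-1]
    return f'java-{api_shortname}-{service_version}'

print(temp_destination_path('speech', 'google/cloud/speech/v1'))
# prints: java-speech-v1
```

Each GAPIC config in a library thus gets its own staging folder, which `build_owlbot_cli_source_folder` later merges into a single owl-bot-staging layout for postprocessing.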
50 changes: 8 additions & 42 deletions library_generation/generate_library.sh
@@ -60,18 +60,10 @@ case $key in
include_samples="$2"
shift
;;
--enable_postprocessing)
enable_postprocessing="$2"
shift
;;
--os_architecture)
os_architecture="$2"
shift
;;
--versions_file)
versions_file="$2"
shift
;;
*)
echo "Invalid option: [$1]"
exit 1
@@ -85,6 +77,11 @@ script_dir=$(dirname "$(readlink -f "$0")")
source "${script_dir}"/utilities.sh
output_folder="$(get_output_folder)"

if [ -z "${gapic_generator_version}" ]; then
echo 'missing required argument --gapic_generator_version'
exit 1
fi

if [ -z "${protobuf_version}" ]; then
protobuf_version=$(get_protobuf_version "${gapic_generator_version}")
fi
@@ -125,10 +122,6 @@ if [ -z "${include_samples}" ]; then
include_samples="true"
fi

if [ -z "$enable_postprocessing" ]; then
enable_postprocessing="true"
fi

if [ -z "${os_architecture}" ]; then
os_architecture=$(detect_os_architecture)
fi
@@ -305,34 +298,7 @@ popd # output_folder
pushd "${temp_destination_path}"
rm -rf java_gapic_srcjar java_gapic_srcjar_raw.srcjar.zip java_grpc.jar java_proto.jar temp-codegen.srcjar
popd # destination path
##################### Section 5 #####################
# post-processing
#####################################################
if [ "${enable_postprocessing}" != "true" ];
then
echo "post processing is disabled"
cp -r ${temp_destination_path}/* "${output_folder}/${destination_path}"
rm -rdf "${temp_destination_path}"
exit 0
fi
if [ -z "${versions_file}" ];then
echo "no versions.txt argument provided. Please provide one in order to enable post-processing"
exit 1
fi
workspace="${output_folder}/workspace"
if [ -d "${workspace}" ]; then
rm -rdf "${workspace}"
fi

mkdir -p "${workspace}"

# if destination_path is not empty, it will be used as a starting workspace for
# postprocessing
if [[ $(find "${output_folder}/${destination_path}" -mindepth 1 -maxdepth 1 -type d,f | wc -l) -gt 0 ]];then
workspace="${output_folder}/${destination_path}"
fi

bash -x "${script_dir}/postprocess_library.sh" "${workspace}" \
"${temp_destination_path}" \
"${versions_file}"

cp -r ${temp_destination_path}/* "${output_folder}/${destination_path}"
rm -rdf "${temp_destination_path}"
exit 0
77 changes: 77 additions & 0 deletions library_generation/main.py
@@ -0,0 +1,77 @@
"""
Parses a config yaml and generates libraries via generate_composed_library.py
"""

import click
from generate_composed_library import generate_composed_library
from typing import Dict
from model.GenerationConfig import GenerationConfig
from collections.abc import Sequence
from absl import app

@click.group(invoke_without_command=False)
@click.pass_context
@click.version_option(message="%(version)s")
def main(ctx):
pass

@main.command()
@click.option(
"--generation-config-yaml",
required=True,
type=str,
help="""
Path to generation_config.yaml that contains the metadata about library generation
"""
)
@click.option(
"--enable-postprocessing",
required=False,
default=True,
type=bool,
help="""
Whether to run the postprocessing (owlbot) steps on the generated libraries.
Defaults to true
"""
)
@click.option(
"--target-library-api-shortname",
required=False,
type=str,
help="""
If specified, only the `library` with api_shortname = target-library-api-shortname will
be generated. If not specified, all libraries in the configuration yaml will be generated
"""
)
@click.option(
"--repository-path",
required=False,
type=str,
help="""
If specified, the generated files will be sent to this location. If not specified, the
repository will be pulled into output_folder and move the generated files there
"""
)
def generate_from_yaml(
generation_config_yaml: str,
enable_postprocessing: bool,
target_library_api_shortname: str,
repository_path: str
) -> None:
config = GenerationConfig.from_yaml(generation_config_yaml)
target_libraries = config.libraries
if target_library_api_shortname is not None:
target_libraries = [library for library in config.libraries
if library.api_shortname == target_library_api_shortname]
for library in target_libraries:
print(f'generating library {library.api_shortname}')
generate_composed_library(
config, library, repository_path, enable_postprocessing
)





if __name__ == "__main__":
main()
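For reference, a configuration yaml consumed by `GenerationConfig.from_yaml` might look roughly like the sketch below. This is an assumption pieced together from the fields the code reads (`gapic_generator_version`, `googleapis_commitish`, `owlbot_cli_image`, `synthtool_commitish`, `destination_path`, per-library `api_shortname`, `library_type`, optional `googleapis_commitish` override, and `gapic_configs` with `proto_path`); the exact schema is not shown in this diff:

```yaml
# Hypothetical sketch only -- field names inferred from the code in this
# commit; placeholder values stand in for real commit SHAs and versions.
gapic_generator_version: 2.32.0
protobuf_version: <optional; inferred from the generator when omitted>
grpc_version: <optional>
googleapis_commitish: <googleapis commit sha>
owlbot_cli_image: <owlbot cli image sha>
synthtool_commitish: <synthtool commit sha>
destination_path: google-cloud-java
libraries:
  - api_shortname: speech
    library_type: GAPIC_AUTO
    googleapis_commitish: <optional library-specific override>
    gapic_configs:
      - proto_path: google/cloud/speech/v1
      - proto_path: google/cloud/speech/v2
```

With more than one entry under `libraries`, the tooling treats the run as a monorepo generation (`is_monorepo = len(config.libraries) > 1`).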
