add rootCaCertificate option to SplunkIO
fix test

Add error reporting for BatchConverter match failure (apache#24022)

* add error reporting for BatchConverters

* Test pytorch

* Finish up torch tests

* yapf

* yapf

* Remove else

Update automation to use Go 1.19 (apache#24175)

Co-authored-by: lostluck <[email protected]>

Fix broken json for notebook (apache#24183)

Using Teardown context instead of deprecated finalize (apache#24180)

* Using Teardown context instead of deprecated finalize

* making function public

Co-authored-by: Scott Strong <[email protected]>

[Python] Support pipe operator as Union (PEP 604) (apache#24106)

Fixes apache#21972

Add custom inference function support to the PyTorch model handler (apache#24062)

* Initial type def and function signature

* [Draft] Add custom inference fn support to Pytorch Model Handler

* Formatting

* Split out default

* Remove Keyed version for testing

* Move device optimization

* Make default available for import, add to test classes

* Remove incorrect default from keyed test

* Keyed impl

* Fix device arg

* custom inference test

* formatting

* Add helpers to define custom inference functions using model methods

* Trailing whitespace

* Unit tests

* Fix incorrect getattr syntax

* Type typo

* Fix docstring

* Fix keyed helper, add basic generate route

* Modify generate() to be different than forward()

* formatting

* Remove extra generate() def
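The squashed commits above outline the feature: the model handler can accept a custom inference function instead of always calling `forward()`, with helpers that build one from a named model method (e.g. `generate()`). A framework-free sketch of that pattern — all names here are hypothetical illustrations, not the exact Beam API; check `apache_beam.ml.inference.pytorch_inference` in the released SDK:

```python
# Sketch of the custom-inference-function pattern (hypothetical names):
# a handler normally routes a batch through model.forward(); a pluggable
# inference_fn lets callers route to another method such as generate().

def default_inference_fn(model, batch):
    # Default route: the model's forward pass, one result per element.
    return [model.forward(x) for x in batch]

def make_model_fn(method_name):
    # Mirrors "define custom inference functions using model methods":
    # look up the named method via getattr and call it per element.
    def inference_fn(model, batch):
        fn = getattr(model, method_name)
        return [fn(x) for x in batch]
    return inference_fn

class ToyModel:
    def forward(self, x):
        return x * 2

    def generate(self, x):  # deliberately different from forward()
        return x + 100

model = ToyModel()
print(default_inference_fn(model, [1, 2]))        # [2, 4]
print(make_model_fn("generate")(model, [1, 2]))   # [101, 102]
```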

Strip FGAC database role from changestreams metadata requests (apache#24177)

Co-authored-by: Doug Judd <[email protected]>

Updated README of Interactive Beam

Removed deprecated cache_dir runner param in favor of the cache_root global option.

Minor update

Fix arguments to checkState in BatchViewOverrides

Re-use serializable pipeline options when already available (apache#24192)

Fix Python PostCommit Example CustomPTransformIT on portable (apache#24159)

* Fix Python PostCommit Examples on portable

* Fix custom_ptransform pipeline options getting modified

* Specify flinkConfDir

revert upgrade to go 1.19 for action unit tests (apache#24189)

Use only ValueProviders in SpannerConfig (apache#24156)

[Tour of Beam] [Frontend] Content tree URLs (apache#23776)

* Content tree navigation (apache#23593)

Unit content navigation (apache#23593)

Update URL on node click (apache#23593)

Active unit color (apache#23593)

removeListener in unit (apache#23593)

First unit is opened on group title click (apache#23593)

WIP by Alexey Inkin (apache#23593)

selectedUnitColor (apache#23593)

Unit borderRadius (apache#23593)

RegExp todo (apache#23593)

added referenced collection package to remove warning (apache#23593)

small refinement (apache#23593)

expand on group tap, padding, openNode (apache#23593)

group expansion bug fix (apache#23593)

selected & unselected progress indicators (apache#23593)

* AnimatedBuilders instead of StatefulWidgets in unit & group (apache#23593)

* fixed _getNodeAncestors (apache#23593)

* get sdkId (apache#23593)

* addressing comments (apache#23593)

* sdkId getter & StatelessExpansionTile (apache#23593)

* expand & collapse group (apache#23593)

* StatelessExpansionTile (apache#23593)

* license (apache#23593)

* ValueChanged and ValueKey in StatelessExpansionTile (apache#23593)

Co-authored-by: darkhan.nausharipov <[email protected]>
Co-authored-by: Alexey Inkin <[email protected]>

refs: issue-24196, fix broken hyperlink

Add a reference to Java RunInference example

Python TextIO Performance Test (apache#23951)

* Python TextIO Performance Test

* Add filebasedio_perf_test module as a unified test framework for
  Python file-based IOs

* Fix MetricsReader publishing metrics twice when more than one load
  test is declared, caused by MetricsReader.publishers being a static
  class variable

* Fix pylint

* Distribute Python performance tests at random times of day instead of all at 3 PM

* Add information about length conversion
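The MetricsReader fix above is the classic Python pitfall of keeping mutable state on the class rather than the instance: a list defined in the class body is shared by every instance. A minimal illustration (class names are hypothetical, not the actual Beam code):

```python
class BuggyReader:
    publishers = []  # class attribute: one list shared by ALL instances

    def add_publisher(self, p):
        self.publishers.append(p)  # mutates the shared class-level list

class FixedReader:
    def __init__(self):
        self.publishers = []  # instance attribute: per-reader state

    def add_publisher(self, p):
        self.publishers.append(p)

a, b = BuggyReader(), BuggyReader()
a.add_publisher("influx")
print(len(b.publishers))  # 1 -- b sees a's publisher (duplicate publishing)

c, d = FixedReader(), FixedReader()
c.add_publisher("influx")
print(len(d.publishers))  # 0 -- each reader's publishers are isolated
```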

Fix PythonLint (apache#24219)

Bump loader-utils from 1.4.1 to 1.4.2 in /sdks/python/apache_beam/runners/interactive/extensions/apache-beam-jupyterlab-sidepanel (apache#24191)

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

Temporarily update Python RC validation job

updates

updates

Uses _all to follow alias/datastreams when estimating index size

Fixes apache#24117

Adds test for following aliases when estimating index size

Bump github.com/aws/aws-sdk-go-v2/config from 1.18.0 to 1.18.1 in /sdks (apache#24222)

Bumps [github.com/aws/aws-sdk-go-v2/config](https://github.com/aws/aws-sdk-go-v2) from 1.18.0 to 1.18.1.
- [Release notes](https://github.com/aws/aws-sdk-go-v2/releases)
- [Changelog](https://github.com/aws/aws-sdk-go-v2/blob/main/CHANGELOG.md)
- [Commits](aws/aws-sdk-go-v2@config/v1.18.0...config/v1.18.1)

---
updated-dependencies:
- dependency-name: github.com/aws/aws-sdk-go-v2/config
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

Add enableGzipHttpCompression option to SplunkIO (apache#24197)

add enableBatchLogs as SplunkIO option

spotless

fix issue with setEnableBatchLogs
andreigurau committed Nov 17, 2022
1 parent fee73b1 commit ef37e9b
Showing 92 changed files with 1,975 additions and 312 deletions.
@@ -18,6 +18,6 @@
"SPARK_VERSIONS": ["2", "3"]
},
"GoTestProperties": {
"SUPPORTED_VERSIONS": ["1.18"]
"SUPPORTED_VERSIONS": ["1.19"]
}
}
2 changes: 1 addition & 1 deletion .github/workflows/build_playground_backend.yml
@@ -34,7 +34,7 @@ jobs:
name: Build Playground Backend App
runs-on: ubuntu-latest
env:
GO_VERSION: 1.18.0
GO_VERSION: 1.19.3
BEAM_VERSION: 2.40.0
TERRAFORM_VERSION: 1.0.9
STAND_SUFFIX: ''
2 changes: 1 addition & 1 deletion .github/workflows/go_tests.yml
@@ -74,4 +74,4 @@ jobs:
echo -e "Please address Staticcheck warnings before checking in changes\n"
echo -e "Staticcheck Warnings:\n"
echo -e "$RESULTS" && exit 1
fi
fi
4 changes: 2 additions & 2 deletions .github/workflows/local_env_tests.yml
@@ -42,7 +42,7 @@ jobs:
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
with:
go-version: '1.18'
go-version: '1.19'
- name: "Installing local env dependencies"
run: "sudo ./local-env-setup.sh"
id: local_env_install_ubuntu
@@ -57,7 +57,7 @@
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
with:
go-version: '1.18'
go-version: '1.19'
- name: "Installing local env dependencies"
run: "./local-env-setup.sh"
id: local_env_install_mac
@@ -90,7 +90,7 @@ PhraseTriggeringPostCommitBuilder.postCommitJob(
executeJob(delegate, bqio_read_test)
}

CronJobBuilder.cronJob('beam_PerformanceTests_BiqQueryIO_Read_Python', 'H 15 * * *', this) {
CronJobBuilder.cronJob('beam_PerformanceTests_BiqQueryIO_Read_Python', 'H H * * *', this) {
executeJob(delegate, bqio_read_test)
}

@@ -103,6 +103,6 @@ PhraseTriggeringPostCommitBuilder.postCommitJob(
executeJob(delegate, bqio_write_test)
}

CronJobBuilder.cronJob('beam_PerformanceTests_BiqQueryIO_Write_Python_Batch', 'H 15 * * *', this) {
CronJobBuilder.cronJob('beam_PerformanceTests_BiqQueryIO_Write_Python_Batch', 'H H * * *', this) {
executeJob(delegate, bqio_write_test)
}
81 changes: 81 additions & 0 deletions .test-infra/jenkins/job_PerformanceTests_FileBasedIO_Python.groovy
@@ -0,0 +1,81 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

import CommonJobProperties as common
import LoadTestsBuilder as loadTestsBuilder
import InfluxDBCredentialsHelper

def now = new Date().format("MMddHHmmss", TimeZone.getTimeZone('UTC'))

def jobs = [
[
name : 'beam_PerformanceTests_TextIOIT_Python',
description : 'Runs performance tests for Python TextIOIT',
test : 'apache_beam.io.filebasedio_perf_test',
githubTitle : 'Python TextIO Performance Test',
githubTriggerPhrase: 'Run Python TextIO Performance Test',
pipelineOptions : [
publish_to_big_query : true,
metrics_dataset : 'beam_performance',
metrics_table : 'python_textio_1GB_results',
influx_measurement : 'python_textio_1GB_results',
test_class : 'TextIOPerfTest',
input_options : '\'{' +
'"num_records": 25000000,' +
'"key_size": 9,' +
'"value_size": 21}\'',
dataset_size : '1050000000',
num_workers : '5',
autoscaling_algorithm: 'NONE'
]
]
]

jobs.findAll {
it.name in [
'beam_PerformanceTests_TextIOIT_Python',
]
}.forEach { testJob -> createGCSFileBasedIOITTestJob(testJob) }

private void createGCSFileBasedIOITTestJob(testJob) {
job(testJob.name) {
description(testJob.description)
common.setTopLevelMainJobProperties(delegate)
common.enablePhraseTriggeringFromPullRequest(delegate, testJob.githubTitle, testJob.githubTriggerPhrase)
common.setAutoJob(delegate, 'H H * * *')
InfluxDBCredentialsHelper.useCredentials(delegate)
additionalPipelineArgs = [
influxDatabase: InfluxDBCredentialsHelper.InfluxDBDatabaseName,
influxHost: InfluxDBCredentialsHelper.InfluxDBHostUrl,
]
testJob.pipelineOptions.putAll(additionalPipelineArgs)

def dataflowSpecificOptions = [
runner : 'DataflowRunner',
project : 'apache-beam-testing',
region : 'us-central1',
temp_location : 'gs://temp-storage-for-perf-tests/',
filename_prefix : "gs://temp-storage-for-perf-tests/${testJob.name}/\${BUILD_ID}/",
]

Map allPipelineOptions = dataflowSpecificOptions << testJob.pipelineOptions

loadTestsBuilder.loadTest(
delegate, testJob.name, CommonTestProperties.Runner.DATAFLOW, CommonTestProperties.SDK.PYTHON, allPipelineOptions, testJob.test)
}
}
@@ -70,6 +70,6 @@ PhraseTriggeringPostCommitBuilder.postCommitJob(
executeJob(delegate, psio_test)
}

CronJobBuilder.cronJob('beam_PerformanceTests_PubsubIOIT_Python_Streaming', 'H 15 * * *', this) {
CronJobBuilder.cronJob('beam_PerformanceTests_PubsubIOIT_Python_Streaming', 'H H * * *', this) {
executeJob(delegate, psio_test)
}
@@ -92,7 +92,7 @@ PhraseTriggeringPostCommitBuilder.postCommitJob(
executeJob(delegate, spannerio_read_test_2gb)
}

CronJobBuilder.cronJob('beam_PerformanceTests_SpannerIO_Read_2GB_Python', 'H 15 * * *', this) {
CronJobBuilder.cronJob('beam_PerformanceTests_SpannerIO_Read_2GB_Python', 'H H * * *', this) {
executeJob(delegate, spannerio_read_test_2gb)
}

@@ -105,6 +105,6 @@ PhraseTriggeringPostCommitBuilder.postCommitJob(
executeJob(delegate, spannerio_write_test_2gb)
}

CronJobBuilder.cronJob('beam_PerformanceTests_SpannerIO_Write_2GB_Python_Batch', 'H 15 * * *', this) {
CronJobBuilder.cronJob('beam_PerformanceTests_SpannerIO_Write_2GB_Python_Batch', 'H H * * *', this) {
executeJob(delegate, spannerio_write_test_2gb)
}
@@ -37,6 +37,7 @@ PostcommitJobBuilder.postCommitJob('beam_PostCommit_Python_Examples_Flink',
gradle {
rootBuildScriptDir(commonJobProperties.checkoutDir)
tasks(":sdks:python:test-suites:portable:flinkExamplesPostCommit")
switches("-PflinkConfDir=$WORKSPACE/src/runners/flink/src/test/resources")
commonJobProperties.setGradleSwitches(delegate)
}
}
@@ -482,6 +482,128 @@
"align": false,
"alignLevel": null
}
},
{
"aliasColors": {},
"bars": false,
"cacheTimeout": null,
"dashLength": 10,
"dashes": false,
"datasource": "BeamInfluxDB",
"fill": 1,
"fillGradient": 0,
"gridPos": {
"h": 9,
"w": 12,
"x": 12,
"y": 9
},
"hiddenSeries": false,
"id": 6,
"interval": "24h",
"legend": {
"avg": false,
"current": false,
"max": false,
"min": false,
"show": false,
"total": false,
"values": false
},
"lines": true,
"linewidth": 2,
"links": [],
"nullPointMode": "connected",
"options": {
"dataLinks": []
},
"percentage": false,
"pluginVersion": "6.7.2",
"pointradius": 2,
"points": true,
"renderer": "flot",
"seriesOverrides": [],
"spaceLength": 10,
"stack": false,
"steppedLine": false,
"targets": [
{
"alias": "$tag_metric",
"groupBy": [
{
"params": [
"$__interval"
],
"type": "time"
}
],
"measurement": "",
"orderByTime": "ASC",
"policy": "default",
"query": "SELECT mean(\"value\") FROM \"python_textio_1GB_results\" WHERE \"metric\" = 'read_runtime' OR \"metric\" = 'write_runtime' AND $timeFilter GROUP BY time($__interval), \"metric\"",
"rawQuery": true,
"refId": "A",
"resultFormat": "time_series",
"select": [
[
{
"params": [
"value"
],
"type": "field"
},
{
"params": [],
"type": "mean"
}
]
],
"tags": []
}
],
"thresholds": [],
"timeFrom": null,
"timeRegions": [],
"timeShift": null,
"title": "TextIO | GCS | 1 GB",
"tooltip": {
"shared": true,
"sort": 0,
"value_type": "individual"
},
"transparent": true,
"type": "graph",
"xaxis": {
"buckets": null,
"mode": "time",
"name": null,
"show": true,
"values": []
},
"yaxes": [
{
"$$hashKey": "object:403",
"format": "s",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": true
},
{
"$$hashKey": "object:404",
"format": "short",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": true
}
],
"yaxis": {
"align": false,
"alignLevel": null
}
}
],
"schemaVersion": 22,
@@ -1972,7 +1972,7 @@ class BeamModulePlugin implements Plugin<Project> {
def goRootDir = "${project.rootDir}/sdks/go"

// This sets the whole project Go version.
project.ext.goVersion = "go1.18.1"
project.ext.goVersion = "go1.19.3"

// Minor TODO: Figure out if we can pull out the GOCMD env variable after goPrepare script
// completion, and avoid this GOBIN substitution.
4 changes: 2 additions & 2 deletions examples/notebooks/beam-ml/dataframe_api_preprocessing.ipynb
@@ -1535,7 +1535,7 @@
"numerical_cols = beam_df.select_dtypes(include=np.number).columns.tolist()\n",
"categorical_cols = list(set(beam_df.columns) - set(numerical_cols))"
]
}
},
{
"cell_type": "code",
"execution_count": null,
@@ -3492,4 +3492,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
}
}
20 changes: 15 additions & 5 deletions learning/tour-of-beam/frontend/lib/models/content_tree.dart
@@ -18,19 +18,29 @@

import '../repositories/models/get_content_tree_response.dart';
import 'module.dart';
import 'node.dart';
import 'parent_node.dart';

class ContentTreeModel {
final String sdkId;
class ContentTreeModel extends ParentNodeModel {
final List<ModuleModel> modules;

String get sdkId => id;

@override
List<NodeModel> get nodes => modules;

const ContentTreeModel({
required this.sdkId,
required super.id,
required this.modules,
});
}) : super(
parent: null,
title: '',
nodes: modules,
);

ContentTreeModel.fromResponse(GetContentTreeResponse response)
: this(
sdkId: response.sdkId,
id: response.sdkId,
modules: response.modules
.map(ModuleModel.fromResponse)
.toList(growable: false),
29 changes: 21 additions & 8 deletions learning/tour-of-beam/frontend/lib/models/group.dart
@@ -23,15 +23,28 @@ import 'parent_node.dart';
class GroupModel extends ParentNodeModel {
const GroupModel({
required super.id,
required super.title,
required super.nodes,
required super.parent,
required super.title,
});

GroupModel.fromResponse(GroupResponseModel group)
: super(
id: group.id,
title: group.title,
nodes:
group.nodes.map(NodeModel.fromResponse).toList(growable: false),
);
factory GroupModel.fromResponse(
GroupResponseModel groupResponse,
ParentNodeModel parent,
) {
final group = GroupModel(
id: groupResponse.id,
nodes: [],
parent: parent,
title: groupResponse.title,
);

group.nodes.addAll(
groupResponse.nodes.map<NodeModel>(
(node) => NodeModel.fromResponse(node, group),
),
);

return group;
}
}