Merge pull request #24747: [Website] update links from absolute to relative in md files

aromanenko-dev authored Jan 2, 2023
2 parents 3ee5b86 + 587e41e commit 0f423dd
Showing 99 changed files with 204 additions and 204 deletions.
8 changes: 4 additions & 4 deletions website/www/site/content/en/blog/ApachePlayground.md
@@ -35,13 +35,13 @@ limitations under the License.
* Displays pipeline execution graph (DAG)
* Code editor to modify examples or try your own custom pipeline with a Direct Runner
* Code editor with code highlighting, flexible layout, color schemes, and other features to provide responsive UX in desktop browsers
-* Embedding a Playground example on a web page prompts the web page readers to try the example pipeline in the Playground - e.g., [Playground Quickstart](https://beam.apache.org/get-started/try-beam-playground/) page
+* Embedding a Playground example on a web page prompts the web page readers to try the example pipeline in the Playground - e.g., [Playground Quickstart](/get-started/try-beam-playground/) page


### **What’s Next**
* Try examples in [Apache Beam Playground](https://play.beam.apache.org/)
* Submit your feedback using “Enjoying Playground?” in Apache Beam Playground or via [this form](https://docs.google.com/forms/d/e/1FAIpQLSd5_5XeOwwW2yjEVHUXmiBad8Lxk-4OtNcgG45pbyAZzd4EbA/viewform?usp=pp_url)
-* Join the Beam [users@](https://beam.apache.org/community/contact-us) mailing list
-* Contribute to the Apache Beam Playground codebase by following a few steps in this [Contribution Guide](https://beam.apache.org/contribute)
+* Join the Beam [users@](/community/contact-us) mailing list
+* Contribute to the Apache Beam Playground codebase by following a few steps in this [Contribution Guide](/contribute)

-Please [reach out](https://beam.apache.org/community/contact-us) if you have any feedback or encounter any issues!
+Please [reach out](/community/contact-us) if you have any feedback or encounter any issues!
@@ -148,7 +148,7 @@ class GenerateSequenceTable extends BaseBeamTable implements Serializable {

Now that we have implemented the two basic classes (a `BaseBeamTable`, and a
`TableProvider`), we can start playing with them. After building the
-[SQL CLI](https://beam.apache.org/documentation/dsls/sql/shell/), we
+[SQL CLI](/documentation/dsls/sql/shell/), we
can now perform selections on the table:

4 changes: 2 additions & 2 deletions website/www/site/content/en/blog/beam-2.21.0.md
@@ -46,9 +46,9 @@ for example usage.
for that function.

More details can be found in
-[Ensuring Python Type Safety](https://beam.apache.org/documentation/sdks/python-type-safety/)
+[Ensuring Python Type Safety](/documentation/sdks/python-type-safety/)
and the Python SDK Typing Changes
-[blog post](https://beam.apache.org/blog/python-typing/).
+[blog post](/blog/python-typing/).

* Java SDK: Introducing the concept of options in Beam Schema’s. These options add extra
context to fields and schemas. This replaces the current Beam metadata that is present
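As a side note on the Python typing changes referenced above, the linked type-safety docs cover both decorator-based and annotation-based hints. A minimal, hypothetical sketch (the function and element values are illustrative, not from the release notes) might look like:

```python
# Hypothetical sketch: pipeline type hints via native Python 3 annotations,
# which the 2.21.0 typing changes above let Beam read directly.
import apache_beam as beam


def word_length(word: str) -> int:
    # Beam can infer Map input/output types from these annotations.
    return len(word)


with beam.Pipeline() as p:
    _ = (
        p
        | beam.Create(['hello', 'beam'])
        | beam.Map(word_length)
        | beam.Map(print))
```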
4 changes: 2 additions & 2 deletions website/www/site/content/en/blog/beam-2.25.0.md
@@ -39,9 +39,9 @@ For more information on changes in 2.25.0, check out the

* Support for repeatable fields in JSON decoder for `ReadFromBigQuery` added. (Python) ([BEAM-10524](https://issues.apache.org/jira/browse/BEAM-10524))
* Added an opt-in, performance-driven runtime type checking system for the Python SDK ([BEAM-10549](https://issues.apache.org/jira/browse/BEAM-10549)).
-More details will be in an upcoming [blog post](https://beam.apache.org/blog/python-performance-runtime-type-checking/index.html).
+More details will be in an upcoming [blog post](/blog/python-performance-runtime-type-checking/index.html).
* Added support for Python 3 type annotations on PTransforms using typed PCollections ([BEAM-10258](https://issues.apache.org/jira/browse/BEAM-10258)).
-More details will be in an upcoming [blog post](https://beam.apache.org/blog/python-improved-annotations/index.html).
+More details will be in an upcoming [blog post](/blog/python-improved-annotations/index.html).
* Improved the Interactive Beam API where recording streaming jobs now start a long running background recording job. Running ib.show() or ib.collect() samples from the recording ([BEAM-10603](https://issues.apache.org/jira/browse/BEAM-10603)).
* In Interactive Beam, ib.show() and ib.collect() now have "n" and "duration" as parameters. These mean read only up to "n" elements and up to "duration" seconds of data read from the recording ([BEAM-10603](https://issues.apache.org/jira/browse/BEAM-10603)).
* Initial preview of [Dataframes](https://s.apache.org/simpler-python-pipelines-2020#slide=id.g905ac9257b_1_21) support.
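To make the `ib.show()` / `ib.collect()` item above more concrete, here is a rough, hypothetical notebook sketch of the `n` and `duration` recording limits; the pipeline and PCollection names are placeholders, not taken from the release notes:

```python
# Rough sketch (assumes a notebook environment); `n` caps the number of
# elements read from the recording and `duration` caps the seconds of data read.
import apache_beam as beam
from apache_beam.runners.interactive import interactive_beam as ib
from apache_beam.runners.interactive.interactive_runner import InteractiveRunner

p = beam.Pipeline(InteractiveRunner())
words = p | beam.Create(['interactive', 'beam', 'recording'])

# Display at most 10 elements, reading no more than 60 seconds of recorded data.
ib.show(words, n=10, duration=60)

# Collect a sample under the same limits for further local inspection.
sample = ib.collect(words, n=10, duration=60)
```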
6 changes: 3 additions & 3 deletions website/www/site/content/en/blog/beam-2.32.0.md
@@ -46,9 +46,9 @@ For more information on changes in 2.32.0, check out the [detailed release notes

## Highlights
* The [Beam DataFrame
-API](https://beam.apache.org/documentation/dsls/dataframes/overview/) is no
+API](/documentation/dsls/dataframes/overview/) is no
longer experimental! We've spent the time since the [2.26.0 preview
-announcement](https://beam.apache.org/blog/dataframe-api-preview-available/)
+announcement](/blog/dataframe-api-preview-available/)
implementing the most frequently used pandas operations
([BEAM-9547](https://issues.apache.org/jira/browse/BEAM-9547)), improving
[documentation](https://beam.apache.org/releases/pydoc/current/apache_beam.dataframe.html)
@@ -62,7 +62,7 @@ For more information on changes in 2.32.0, check out the [detailed release notes
Leaving experimental just means that we now have high confidence in the API
and recommend its use for production workloads. We will continue to improve
the API, guided by your
-[feedback](https://beam.apache.org/community/contact-us/).
+[feedback](/community/contact-us/).


## I/Os
2 changes: 1 addition & 1 deletion website/www/site/content/en/blog/beam-2.38.0.md
@@ -29,7 +29,7 @@ See the [download page](/get-started/downloads/#2380-2022-04-20) for this releas
For more information on changes in 2.38.0 check out the [detailed release notes](https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12319527&version=12351169).

## I/Os
-* Introduce projection pushdown optimizer to the Java SDK ([BEAM-12976](https://issues.apache.org/jira/browse/BEAM-12976)). The optimizer currently only works on the [BigQuery Storage API](https://beam.apache.org/documentation/io/built-in/google-bigquery/#storage-api), but more I/Os will be added in future releases. If you encounter a bug with the optimizer, please file a JIRA and disable the optimizer using pipeline option `--experiments=disable_projection_pushdown`.
+* Introduce projection pushdown optimizer to the Java SDK ([BEAM-12976](https://issues.apache.org/jira/browse/BEAM-12976)). The optimizer currently only works on the [BigQuery Storage API](/documentation/io/built-in/google-bigquery/#storage-api), but more I/Os will be added in future releases. If you encounter a bug with the optimizer, please file a JIRA and disable the optimizer using pipeline option `--experiments=disable_projection_pushdown`.
* A new IO for Neo4j graph databases was added. ([BEAM-1857](https://issues.apache.org/jira/browse/BEAM-1857)) It has the ability to update nodes and relationships using UNWIND statements and to read data using cypher statements with parameters.
* `amazon-web-services2` has reached feature parity and is finally recommended over the earlier `amazon-web-services` and `kinesis` modules (Java). These will be deprecated in one of the next releases ([BEAM-13174](https://issues.apache.org/jira/browse/BEAM-13174)).
* Long outstanding write support for `Kinesis` was added ([BEAM-13175](https://issues.apache.org/jira/browse/BEAM-13175)).
2 changes: 1 addition & 1 deletion website/www/site/content/en/blog/beam-2.42.0.md
@@ -32,7 +32,7 @@ For more information on changes in 2.42.0, check out the [detailed release notes

* Added support for stateful DoFns to the Go SDK.
* Added support for [Batched
-  DoFns](https://beam.apache.org/documentation/programming-guide/#batched-dofns)
+  DoFns](/documentation/programming-guide/#batched-dofns)
to the Python SDK.
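As a rough illustration of the Batched DoFns item above, here is a hedged sketch based on the linked programming-guide section; it assumes NumPy-backed batches and is not code from the release notes:

```python
# Hypothetical sketch of a Batched DoFn: Beam may pass whole NumPy batches to
# process_batch instead of calling process once per element.
from typing import Iterator

import numpy as np
import apache_beam as beam


class MultiplyByTwo(beam.DoFn):
    def process_batch(self, batch: np.ndarray) -> Iterator[np.ndarray]:
        # Operates on an entire batch of int64 values at once.
        yield batch * 2


with beam.Pipeline() as p:
    _ = (
        p
        | beam.Create(np.array([1, 2, 3], dtype=np.int64)).with_output_types(np.int64)
        | beam.ParDo(MultiplyByTwo())
        | beam.Map(print))
```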

## New Features / Improvements
2 changes: 1 addition & 1 deletion website/www/site/content/en/blog/beam-2.8.0.md
@@ -50,7 +50,7 @@ For more information on changes in 2.8.0, check out the

### Portability

-* [Python on Flink MVP](https://beam.apache.org/roadmap/portability/#python-on-flink) completed.
+* [Python on Flink MVP](/roadmap/portability/#python-on-flink) completed.

### I/Os

4 changes: 2 additions & 2 deletions website/www/site/content/en/blog/beam-katas-kotlin-release.md
@@ -29,7 +29,7 @@ Today, we are happy to announce a new addition to the Beam Katas family: Kotlin!

<img src="/images/blog/beam-katas-kotlin-release/beam-and-kotlin.png" alt="Apache Beam and Kotlin Shaking Hands" height="330" width="800" >

-You may remember [a post from last year](https://beam.apache.org/blog/beam-kata-release) that informed everyone of the wonderful Beam Katas available on [Stepik](https://stepik.org)
+You may remember [a post from last year](/blog/beam-kata-release) that informed everyone of the wonderful Beam Katas available on [Stepik](https://stepik.org)
for learning more about writing Apache Beam applications, working with its various APIs and programming model
hands-on, all from the comfort of your favorite IDEs. As of today, you can now work through all of the progressive
exercises to learn about the fundamentals of Beam in Kotlin.
@@ -41,7 +41,7 @@ as one of the most beloved programming languages in the annual Stack Overflow De
just our word for it.

The relationship between Apache Beam and Kotlin isn't a new one. You can find examples scattered across the web
-of engineering teams embracing the two technologies including [a series of samples announced on this very blog](https://beam.apache.org/blog/beam-kotlin/).
+of engineering teams embracing the two technologies including [a series of samples announced on this very blog](/blog/beam-kotlin/).
If you are new to Beam or are an experienced veteran looking for a change of pace, we'd encourage you to give
Kotlin a try.

8 changes: 4 additions & 4 deletions website/www/site/content/en/blog/beam-sql-with-notebooks.md
@@ -22,7 +22,7 @@ limitations under the License.

## Intro

-[Beam SQL](https://beam.apache.org/documentation/dsls/sql/overview/) allows a
+[Beam SQL](/documentation/dsls/sql/overview/) allows a
Beam user to query PCollections with SQL statements.
[Interactive Beam](https://github.com/apache/beam/tree/master/sdks/python/apache_beam/runners/interactive#interactive-beam)
provides an integration between Apache Beam and
@@ -174,7 +174,7 @@ element_type like `BeamSchema_...(id: int32, str: str, flt: float64)`.
PCollection because the `beam_sql` magic always implicitly creates a pipeline to
execute your SQL query. To hold the elements with each field's type info, Beam
automatically creates a
-[schema](https://beam.apache.org/documentation/programming-guide/#what-is-a-schema)
+[schema](/documentation/programming-guide/#what-is-a-schema)
as the `element_type` for the created PCollection. You will learn more about
schema-aware PCollections later.

@@ -221,7 +221,7 @@ always check the content of a PCollection by invoking `ib.show(pcoll_name)` or
The `beam_sql` magic provides the flexibility to seamlessly mix SQL and non-SQL
Beam statements to build pipelines and even run them on Dataflow. However, each
PCollection queried by Beam SQL needs to have a
-[schema](https://beam.apache.org/documentation/programming-guide/#what-is-a-schema).
+[schema](/documentation/programming-guide/#what-is-a-schema).
For the `beam_sql` magic, it’s recommended to use `typing.NamedTuple` when a
schema is desired. You can go through the below example to learn more details
about schema-aware PCollections.
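To make the `typing.NamedTuple` recommendation above concrete, here is a rough, hypothetical notebook sketch; the class name `Word`, the PCollection name `words`, the query, and the `-o` output option of the `beam_sql` magic are assumptions for illustration, not taken verbatim from the post:

```python
# Hypothetical notebook sketch: a schema-aware PCollection built from a
# typing.NamedTuple, which the beam_sql magic can then query by name.
import typing
import apache_beam as beam
from apache_beam.runners.interactive.interactive_runner import InteractiveRunner


class Word(typing.NamedTuple):
    word: str
    count: int


# Registering a RowCoder gives the NamedTuple a well-defined schema encoding.
beam.coders.registry.register_coder(Word, beam.coders.RowCoder)

p = beam.Pipeline(InteractiveRunner())
words = p | beam.Create([Word('hello', 2), Word('beam', 5)]).with_output_types(Word)

# In a separate notebook cell, a query over `words` might then look like
# (assuming the -o output-name option of the beam_sql magic):
#   %%beam_sql -o popular
#   SELECT word, count FROM words WHERE count > 2
```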
@@ -788,7 +788,7 @@ you to learn Beam SQL and mix Beam SQL into prototyping and productionizing (
e.g., to Dataflow) your Beam pipelines with minimum setups.

For more details about the Beam SQL syntax, check out the Beam Calcite SQL
-[compatibility](https://beam.apache.org/documentation/dsls/sql/calcite/overview/)
+[compatibility](/documentation/dsls/sql/calcite/overview/)
and the Apache Calcite SQL
[syntax](https://calcite.apache.org/docs/reference.html).

2 changes: 1 addition & 1 deletion website/www/site/content/en/blog/beam-starter-projects.md
@@ -72,6 +72,6 @@ Here are the starter projects; you can choose your favorite language:
* **[Kotlin]** [github.com/apache/beam-starter-kotlin](https://github.com/apache/beam-starter-kotlin) – Adapted to idiomatic Kotlin
* **[Scala]** [github.com/apache/beam-starter-scala](https://github.com/apache/beam-starter-scala) – Coming soon!

-We have updated the [Java quickstart](https://beam.apache.org/get-started/quickstart/java/) to use the new starter project, and we're working on updating the Python and Go quickstarts as well.
+We have updated the [Java quickstart](/get-started/quickstart/java/) to use the new starter project, and we're working on updating the Python and Go quickstarts as well.

We hope you find this useful. Feedback and contributions are always welcome! So feel free to create a GitHub issue, or open a Pull Request to any of the starter project repositories.
4 changes: 2 additions & 2 deletions website/www/site/content/en/blog/beam-summit-europe-2019.md
@@ -54,7 +54,7 @@ and [Stockholm](https://www.meetup.com/Apache-Beam-Stockholm/events/260634514) h

Keep an eye out for a meetup in [Paris](https://www.meetup.com/Paris-Apache-Beam-Meetup).

-If you are interested in starting your own meetup, feel free [to reach out](https://beam.apache.org/community/contact-us)! Good places to start include our Slack channel, the dev and user mailing lists, or the Apache Beam Twitter.
+If you are interested in starting your own meetup, feel free [to reach out](/community/contact-us)! Good places to start include our Slack channel, the dev and user mailing lists, or the Apache Beam Twitter.

Even if you can’t travel to these meetups, you can stay informed on the happenings of the community. The talks and sessions from previous conferences and meetups are archived on the [Apache Beam YouTube channel](https://www.youtube.com/c/ApacheBeamYT). If you want your session added to the channel, don’t hesitate to get in touch!

@@ -63,7 +63,7 @@ The first summit of the year will be held in Berlin:

<img src="https://img.evbuc.com/https%3A%2F%2Fcdn.evbuc.com%2Fimages%2F58635346%2F70962106775%2F1%2Foriginal.20190317-212619?w=800&auto=compress&rect=0%2C115%2C2666%2C1333&s=2680f5036dcad9177b027cce026c0224" alt="Beam Summit Europe Banner" >

-You can find more info on the [website](https://beamsummit.org) and read about the inaugural edition of the Beam Summit Europe [here](https://beam.apache.org/blog/2018/10/31/beam-summit-aftermath.html). At these summits, you have the opportunity to meet with other Apache Beam creators and users, get expert advice, learn from the speaker sessions, and participate in workshops.
+You can find more info on the [website](https://beamsummit.org) and read about the inaugural edition of the Beam Summit Europe [here](/blog/2018/10/31/beam-summit-aftermath.html). At these summits, you have the opportunity to meet with other Apache Beam creators and users, get expert advice, learn from the speaker sessions, and participate in workshops.

We strongly encourage you to get involved again this year! You can participate in the following ways for the upcoming summit in Europe:

@@ -23,7 +23,7 @@ limitations under the License.

We're excited to announce that a preview of the Beam Python SDK's new DataFrame
API is now available in [Beam
-2.26.0](https://beam.apache.org/blog/beam-2.26.0/). Much like `SqlTransform`
+2.26.0](/blog/beam-2.26.0/). Much like `SqlTransform`
([Java](https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/extensions/sql/SqlTransform.html),
[Python](https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.sql.html#apache_beam.transforms.sql.SqlTransform)),
the DataFrame API gives Beam users a way to express complex
@@ -76,7 +76,7 @@ as much as possible.

## DataFrames as a DSL
You may already be aware of [Beam
-SQL](https://beam.apache.org/documentation/dsls/sql/overview/), which is
+SQL](/documentation/dsls/sql/overview/), which is
a Domain-Specific Language (DSL) built with Beam's Java SDK. SQL is
considered a DSL because it's possible to express a full pipeline, including IOs
and complex operations, entirely with SQL. 
@@ -91,7 +91,7 @@ implementations (`pd.read_{csv,parquet,...}` and `pd.DataFrame.to_{csv,parquet,.

Like SQL, it's also possible to embed the DataFrame API into a larger pipeline
by using
-[schemas](https://beam.apache.org/documentation/programming-guide/#what-is-a-schema).
+[schemas](/documentation/programming-guide/#what-is-a-schema).
A schema-aware PCollection can be converted to a DataFrame, processed, and the
result converted back to another schema-aware PCollection. For example, if you
wanted to use traditional Beam IOs rather than one of the DataFrame IOs you
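As a rough, hypothetical sketch of the schema-based round trip described above (the NamedTuple, element values, and the `count > 2` filter are illustrative, not from the post):

```python
# Hedged sketch: schema-aware PCollection -> deferred DataFrame -> PCollection.
import typing
import apache_beam as beam
from apache_beam.dataframe.convert import to_dataframe, to_pcollection


class Word(typing.NamedTuple):
    word: str
    count: int


with beam.Pipeline() as p:
    words = p | beam.Create([Word('hello', 2), Word('beam', 5)]).with_output_types(Word)

    df = to_dataframe(words)          # schema-aware PCollection -> deferred DataFrame
    popular = df[df['count'] > 2]     # pandas-style, deferred filtering
    result = to_pcollection(popular)  # back to a schema-aware PCollection

    _ = result | beam.Map(print)
```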
12 changes: 6 additions & 6 deletions website/www/site/content/en/blog/go-2.40.md
@@ -29,15 +29,15 @@ some of the biggest changes coming with this important release!
2.40 marks the release of one of our most anticipated feature sets yet:
native streaming Go pipelines. This includes adding support for:

-- [Self Checkpointing](https://beam.apache.org/documentation/programming-guide/#user-initiated-checkpoint)
-- [Watermark Estimation](https://beam.apache.org/documentation/programming-guide/#watermark-estimation)
-- [Pipeline Drain/Truncation](https://beam.apache.org/documentation/programming-guide/#truncating-during-drain)
-- [Bundle Finalization](https://beam.apache.org/documentation/programming-guide/#bundle-finalization) (added in 2.39)
+- [Self Checkpointing](/documentation/programming-guide/#user-initiated-checkpoint)
+- [Watermark Estimation](/documentation/programming-guide/#watermark-estimation)
+- [Pipeline Drain/Truncation](/documentation/programming-guide/#truncating-during-drain)
+- [Bundle Finalization](/documentation/programming-guide/#bundle-finalization) (added in 2.39)

With all of these features, it is now possible to write your own streaming
pipeline source DoFns in Go without relying on cross-language transforms
from Java or Python. We encourage you to try out all of these new features
-in your streaming pipelines! The [programming guide](https://beam.apache.org/documentation/programming-guide/#splittable-dofns)
+in your streaming pipelines! The [programming guide](/documentation/programming-guide/#splittable-dofns)
has additional information on getting started with native Go streaming DoFns.

# Generic Registration (Make Your Pipelines 3x Faster)
@@ -61,7 +61,7 @@ gains, check out the [registration doc page](https://pkg.go.dev/github.com/apach

Moving forward, we remain focused on improving the streaming experience and
leveraging generics to improve the SDK. Specific improvements we are considering
-include adding [State & Timers](https://beam.apache.org/documentation/programming-guide/#state-and-timers)
+include adding [State & Timers](/documentation/programming-guide/#state-and-timers)
support, introducing a Go expansion service so that Go DoFns can be used in other
languages, and wrapping more Java and Python IOs so that they can be easily used
in Go. As always, please let us know what changes you would like to see by
4 changes: 2 additions & 2 deletions website/www/site/content/en/blog/gsoc-19.md
@@ -49,8 +49,8 @@ I wanted to explore Data Engineering, so for GSoC, I wanted to work on a project
I had already read the [Streaming Systems book](http://streamingsystems.net/). So, I had an idea of the concepts that Beam is built on, but had never actually used Beam.
Before actually submitting a proposal, I went through a bunch of resources to make sure I had a concrete understanding of Beam.
I read the [Streaming 101](https://www.oreilly.com/ideas/the-world-beyond-batch-streaming-101) and [Streaming 102](https://www.oreilly.com/ideas/the-world-beyond-batch-streaming-102) blogs by Tyler Akidau. They are the perfect introduction to Beam’s unified model for Batch and Streaming.
-In addition, I watched all Beam talks on YouTube. You can find them on the [Beam Website](https://beam.apache.org/get-started/resources/videos-and-podcasts/).
-Beam has really good documentation. The [Programming Guide](https://beam.apache.org/documentation/programming-guide/) lays out all of Beam’s concepts really well. [Beam’s execution model](https://beam.apache.org/documentation/runtime/model) is also documented well and is a must-read to understand how Beam processes data.
+In addition, I watched all Beam talks on YouTube. You can find them on the [Beam Website](/get-started/resources/videos-and-podcasts/).
+Beam has really good documentation. The [Programming Guide](/documentation/programming-guide/) lays out all of Beam’s concepts really well. [Beam’s execution model](/documentation/runtime/model) is also documented well and is a must-read to understand how Beam processes data.
[waitingforcode.com](https://www.waitingforcode.com/apache-beam) also has good blog posts about Beam concepts.
To get a better sense of the Beam codebase, I played around with it and worked on some PRs to understand Beam better and got familiar with the test suite and workflows.
