Add instructions about downloading from release branch (#177)
* Handle a null schema in the sink connector.

* Fix concurrent access during iteration of a synchronized set.

* Reduce synchronization around ackIds iteration to the shortest period needed.

* Clear the list of ackIds as soon as acks are sent. This makes acks best effort, but it also prevents us from forgetting to send some acks if a poll happens while there is an outstanding ack request (sketched below).
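
A minimal Java sketch of the copy-then-clear pattern described in the bullet above (class and field names are illustrative assumptions, not the connector's actual code): the lock is held only long enough to snapshot and clear the pending ackIds, so a concurrent poll neither blocks on the ack RPC nor loses ackIds added while that RPC is in flight.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative sketch only; names are assumptions, not the connector's code.
public class AckTracker {
  private final Set<String> ackIds = new HashSet<>();

  public void addAckId(String ackId) {
    synchronized (ackIds) {
      ackIds.add(ackId);
    }
  }

  // Snapshot and clear under the lock, then send acks outside it. Once the
  // set is cleared, acking is best effort, but no ackId can be skipped by a
  // concurrent poll while the ack request is outstanding.
  public List<String> drainAckIdsToSend() {
    synchronized (ackIds) {
      List<String> toSend = new ArrayList<>(ackIds);
      ackIds.clear();
      return toSend;
    }
  }
}
```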

* Better error message when verifySubscription fails.

* When an exception happens on pull, just return an empty list from poll.

* Ensure partition number is non-negative in CPS source connector.

* Shade the Guava library so the connector can work with newer versions of Kafka that depend on a newer Guava.

* Update the versions of the gRPC libraries used by the Kafka connector. This should fix issue #120.

* Remove the broken Oracle JDK 7 environment from Travis runs.

* Minor formatting fixes.

* Add a new config property to the sink connector, `metadata.publish`. When this property is true, the following attributes are added to a message published to Cloud Pub/Sub via the sink connector (a consumer-side sketch follows the list):

  * `kafka.topic`: the Kafka topic on which the message was originally published
  * `kafka.partition`: the Kafka partition to which the message was originally published
  * `kafka.offset`: the offset of the message in the Kafka topic/partition
  * `kafka.timestamp`: the timestamp that Kafka associated with the message
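
On the consuming side, these attributes can be read back from the received message. A minimal sketch using the Pub/Sub Java message class (the fallback values here are illustrative assumptions):

```java
import com.google.pubsub.v1.PubsubMessage;

public class KafkaMetadataReader {
  // Prints the Kafka origin of a message published by the sink connector with
  // metadata.publish=true; the fallbacks cover messages published without it.
  public static void printOrigin(PubsubMessage message) {
    String topic = message.getAttributesOrDefault("kafka.topic", "<unknown>");
    String partition = message.getAttributesOrDefault("kafka.partition", "-1");
    String offset = message.getAttributesOrDefault("kafka.offset", "-1");
    String timestamp = message.getAttributesOrDefault("kafka.timestamp", "0");
    System.out.printf("origin: %s-%s@%s (ts=%s)%n", topic, partition, offset, timestamp);
  }
}
```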

* Calculate message size at the end rather than along the way.

* Remove the temporary variables for Kafka attributes.

* Periodically recreate gRPC publishers and subscribers in order to avoid GOAWAY errors (sketched below).
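
A rough Java sketch of that strategy (the class name, one-hour interval, and rebuild details are assumptions, not the connector's actual implementation):

```java
import java.io.IOException;
import com.google.cloud.pubsub.v1.Publisher;
import com.google.pubsub.v1.TopicName;

// Illustrative sketch: rebuild the publisher on a fixed schedule so
// long-lived connections are retired before the server sends GOAWAY.
public class RotatingPublisher {
  private static final long MAX_AGE_MS = 60L * 60L * 1000L; // assumed interval

  private final TopicName topic;
  private Publisher publisher;
  private long createdAtMs;

  public RotatingPublisher(TopicName topic) throws IOException {
    this.topic = topic;
    recreate();
  }

  public synchronized Publisher get() throws IOException {
    if (System.currentTimeMillis() - createdAtMs > MAX_AGE_MS) {
      publisher.shutdown(); // flushes outstanding messages before closing
      recreate();
    }
    return publisher;
  }

  private void recreate() throws IOException {
    publisher = Publisher.newBuilder(topic).build();
    createdAtMs = System.currentTimeMillis();
  }
}
```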

* Formatting/syntactic fixes.

* Switch sink connector to client library.

* Remove commented-out code.

* Fix Java 7 compatibility.

* Add timeout parameters for publishes to Cloud Pub/Sub.

* Fix handling of optional struct fields and the error message for missing fields, which previously showed up as a message about unsupported nested types.

* Add test case for optional field where the value is present.

* Set the maximum inbound message size on the source connector to ensure it is possible to receive a message up to the largest Pub/Sub message size (10 MB); a sketch of the idea follows.
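
For context, this is the kind of gRPC channel setting involved; a hedged sketch (the channel target and the exact limit are assumptions, not the connector's values):

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

public class ChannelConfig {
  // Illustrative: raise gRPC's default 4 MB inbound limit so a pull response
  // carrying a maximum-size 10 MB Pub/Sub message is not rejected; 20 MB
  // leaves headroom for encoding overhead.
  static ManagedChannel newPubSubChannel() {
    return ManagedChannelBuilder.forTarget("pubsub.googleapis.com:443")
        .maxInboundMessageSize(20 * 1024 * 1024)
        .build();
  }
}
```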

* Add instructions to the Kafka connector about downloading from a release.
kamalaboulhosn authored Sep 19, 2018
1 parent 6d32f60 commit 462a6b2
Showing 1 changed file with 13 additions and 3 deletions.
16 changes: 13 additions & 3 deletions kafka-connector/README.md
```diff
@@ -9,11 +9,21 @@ source connector (to copy messages from Cloud Pub/Sub to Kafka).
 
 These instructions assume you are using [Maven](https://maven.apache.org/).
 
-1. Clone the repository, ensuring to do so recursively to pick up submodules:
+1. If you want to build the connector from head, clone the repository, ensuring
+   to do so recursively to pick up submodules:
 
-   `git clone --recursive https://github.com/GoogleCloudPlatform/cloud-pubsub-kafka`
+   `git clone --recursive https://github.com/GoogleCloudPlatform/pubsub`
 
-2. Make the jar that contains the connector:
+   If you wish to build from a released version of the connector, download it
+   from the [Releases section](https://github.com/GoogleCloudPlatform/pubsub/releases)
+   in GitHub.
+
+2. Unzip the source code if downloaded from the release version.
+
+3. Go into the kafka-connector directory in the cloned repo or downloaded
+   release.
+
+4. Make the jar that contains the connector:
 
    `mvn package`
 
```

