AWS GLUE: Visual ETL From Kinesis into Snowflake


Hi there,

I am trying to set up ETL processing from our existing AWS Kinesis data stream into Snowflake. We currently use the KCL, which is extremely difficult to maintain because we do not have the means to manage the solution and the infrastructure around it.

Glue seems like a promising approach: an S3 source through a SQL ETL transform into Snowflake works perfectly. Swapping the S3 source for our Kinesis stream, however, causes a few issues. Even after creating the streaming schema for the data stream, the job still fails with the error attached below.

It also seems I cannot preview the data or schema of my data stream from the visual ETL editor. Any assistance would be appreciated.

Traceback (most recent call last):
  File "/opt/amazon/lib/python3.7/site-packages/awsglue/", line 20, in from_catalog
    return self._glue_context.create_data_frame_from_catalog(db, table_name, redshift_tmp_dir, transformation_ctx, push_down_predicate, additional_options, catalog_id, **kwargs)
  File "/opt/amazon/lib/python3.7/site-packages/awsglue/", line 219, in create_data_frame_from_catalog
    source = StreamingDataSource(self._ssql_ctx.getCatalogSource(db, table_name, redshift_tmp_dir, transformation_ctx,
  File "/opt/amazon/spark/python/lib/", line 1321, in __call__
    return_value = get_return_value(
  File "/opt/amazon/lib/python3.7/site-packages/pyspark/sql/", line 190, in deco
    return f(*a, **kw)
  File "/opt/amazon/spark/python/lib/", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o90.getCatalogSource.
: org.antlr.v4.runtime.misc.ParseCancellationException: line 1:0 no viable alternative at input '<EOF>'
  at org.antlr.v4.runtime.ProxyErrorListener.syntaxError(
  at org.antlr.v4.runtime.Parser.notifyErrorListeners(
  at org.antlr.v4.runtime.DefaultErrorStrategy.reportNoViableAlternative(
  at org.antlr.v4.runtime.DefaultErrorStrategy.reportError(
  at $anonfun$getFieldsFromColumns$1(DataCatalogWrapper.scala:615)
  at $(DataCatalogWrapper.scala:614)
  at $(DataCatalogWrapper.scala:648)
  at $(DataCatalogWrapper.scala:631)
  at $(DataCatalogWrapper.scala:1023)
  at $anonfun$getTable$1(DataCatalogWrapper.scala:288)
  at scala.util.Try$.apply(Try.scala:213)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(
  at java.lang.reflect.Method.invoke(
  at py4j.reflection.MethodInvoker.invoke(
  at py4j.reflection.ReflectionEngine.invoke(
  at py4j.Gateway.invoke(
  at py4j.commands.AbstractCommand.invokeMethod(
  at py4j.commands.CallCommand.execute(
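One way to take the catalog table out of the equation is to read the Kinesis stream directly with connection options instead of `create_data_frame_from_catalog`. A minimal sketch, assuming a placeholder stream ARN, JSON-classified records, and a made-up `transformation_ctx` name (none of these come from the original post):

```python
# Hypothetical sketch: read the Kinesis stream directly, bypassing the
# Data Catalog table. The stream ARN below is a placeholder.
kinesis_options = {
    "streamARN": "arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",
    "startingPosition": "TRIM_HORIZON",  # read from the oldest available record
    "classification": "json",            # parse records as JSON
    "inferSchema": "true",               # infer the schema instead of the catalog
}

# Inside the Glue streaming job this dict would be passed as:
# df = glueContext.create_data_frame.from_options(
#     connection_type="kinesis",
#     connection_options=kinesis_options,
#     transformation_ctx="kinesis_source",
# )
```

If this direct read works while the catalog read fails, the problem is almost certainly in the catalog table definition rather than the stream itself.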

  • You seem to be using a catalog table, but you also say you are creating a "streaming schema"? More details about how you are creating the DataFrame would help. I would say there is something wrong with that table schema.
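The ANTLR failure (`line 1:0 no viable alternative at input '<EOF>'`) comes from Glue's schema parser hitting end-of-input at position zero, which suggests a column in the catalog table whose type string is empty. A minimal sketch for spotting such columns; the boto3 call and the database/table names in the comments are assumptions, kept out of the executable code so the helper stays self-contained:

```python
def find_invalid_columns(columns):
    """Return the names of columns whose Type is empty or missing.

    An empty type string fed to Glue's schema grammar would fail with
    "line 1:0 no viable alternative at input '<EOF>'".
    """
    return [c.get("Name", "?") for c in columns if not c.get("Type", "").strip()]

# The column list could be fetched from the Data Catalog roughly like this
# (hypothetical database/table names):
# import boto3
# glue = boto3.client("glue")
# table = glue.get_table(DatabaseName="my_db", Name="my_stream_table")["Table"]
# print(find_invalid_columns(table["StorageDescriptor"]["Columns"]))
```

Any names this prints would point at the schema entry to fix before the catalog read can succeed.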

asked 2 months ago · 492 views
1 Answer

Hello Prashaan,

Answering your question requires details that are non-public information. Please open a support case with AWS using the following link.

AWS
answered 2 months ago
