Hot questions on using Cassandra with JSON

Question:

I am trying to use the DataStax Java driver and retrieve a row as JSON.

I run the classic SELECT JSON * from myTable WHERE id=1, and this returns a JSON-formatted string in cqlsh,

e.g. { "uuid" : "12324567-...." }

This works.

Now, when I try to do the same using the Java driver, I use (in Scala):

val resultSet = session.execute(queryString)

I pick up one row from this result set using resultSet.one(). This has the string I need, but how do I extract it?

Experiment: resultSet.one().getColumnDefinitions.toString

Prints: Columns[ [json] (varchar) ]

Experiment: resultSet.one().toString()

Prints: Row[{"uuid": "3ce19e07-2280-4b31-9475-992bda608e70"}] <- the string I need

How do I pick up a simple string that represents the JSON in my program, without splitting the strings above?


Answer:

As noted in the Cassandra documentation:

The results for SELECT JSON will only include a single column named [json]. This column will contain the same JSON-encoded map representation of a row that is used for INSERT JSON.

In order to access the JSON value of the returned row, you need to use one of the getString methods defined on the Row class to get the value of this column either by index or by name:

Row row = resultSet.one();
String json1 = row.getString(0);        // by column index
String json2 = row.getString("[json]"); // by column name

Question:

Environment:

  • Java 7

  • Cassandra 2.1.2 running as one simple node on my local dev workstation on Windows 8.1

  • driver: cassandra-driver-core-2.1.2

  • runtime environment: Apache Karaf 2.3.8

I am trying to insert a row by passing JSON for a column defined by a user-defined type (coordinates, with x and y).

I build this Java statement:

Statement statement = QueryBuilder
.insertInto("myKeySpace", "myTable")
.value("myKeyColumn", "myKeyValue")
.value("coordinates", "{\"x\":10.4,\"y\":20.3}");

And when executing this:

mySession.execute(statement);

I get the following error (the full stack trace is at the end of this message):

com.datastax.driver.core.exceptions.InvalidQueryException: Not enough bytes to read 0th field java.nio.HeapByteBuffer[pos=0 lim=1 cap=1]

[In the following CQL statements I have anonymized the column names; it is possible that the quotes are wrong, but the problem is in the Java statement above.]

My Cassandra type and table (the type must be created before the table that uses it):

CREATE TYPE IF NOT EXISTS coordinates (
    x double,
    y double
);

CREATE TABLE IF NOT EXISTS "myTable" (
    "myKeyColumn" text,
    coordinates FROZEN<coordinates>,
    PRIMARY KEY ("myKeyColumn")
);

And when executing the following query in DataStax DevCenter, it works fine:

INSERT INTO "myTable"("myKeyColumn","coordinates")
VALUES ('myKeyValue',{"x":10.4,"y":20.3});

Any help would be welcome ! :-)

Full stack trace:

2014-11-27 11:38:07,532 | WARN  | tp1271566160-230 | ServletHandler                   | pse.jetty.servlet.ServletHandler  563 | 135 - org.eclipse.jetty.aggregate.jetty-all-server - 8.1.15.v20140411 | /cxf/rest/myProject/add
java.lang.RuntimeException: org.apache.cxf.interceptor.Fault: com.datastax.driver.core.exceptions.InvalidQueryException: Not enough bytes to read 0th field java.nio.HeapByteBuffer[pos=0 lim=1 cap=1]
    at org.apache.cxf.interceptor.AbstractFaultChainInitiatorObserver.onMessage(AbstractFaultChainInitiatorObserver.java:116)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:336)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:241)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.apache.cxf.transport.servlet.ServletController.invokeDestination(ServletController.java:248)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:222)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.apache.cxf.transport.servlet.ServletController.invoke(ServletController.java:153)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.apache.cxf.transport.servlet.CXFNonSpringServlet.invoke(CXFNonSpringServlet.java:171)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.apache.cxf.transport.servlet.AbstractHTTPServlet.handleRequest(AbstractHTTPServlet.java:286)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.apache.cxf.transport.servlet.AbstractHTTPServlet.doPost(AbstractHTTPServlet.java:206)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:595)[83:org.apache.geronimo.specs.geronimo-servlet_3.0_spec:1.0]
    at org.apache.cxf.transport.servlet.AbstractHTTPServlet.service(AbstractHTTPServlet.java:262)[171:org.apache.cxf.cxf-rt-transports-http:2.7.12]
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:503)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.ops4j.pax.web.service.jetty.internal.HttpServiceServletHandler.doHandle(HttpServiceServletHandler.java:69)[145:org.ops4j.pax.web.pax-web-jetty:3.1.1]
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.ops4j.pax.web.service.jetty.internal.HttpServiceContext.doHandle(HttpServiceContext.java:240)[145:org.ops4j.pax.web.pax-web-jetty:3.1.1]
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:429)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.ops4j.pax.web.service.jetty.internal.JettyServerHandlerCollection.handle(JettyServerHandlerCollection.java:77)[145:org.ops4j.pax.web.pax-web-jetty:3.1.1]
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.Server.handle(Server.java:370)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)[135:org.eclipse.jetty.aggregate.jetty-all-server:8.1.15.v20140411]
    at java.lang.Thread.run(Thread.java:724)[:1.7.0_25]
Caused by: org.apache.cxf.interceptor.Fault: com.datastax.driver.core.exceptions.InvalidQueryException: Not enough bytes to read 0th field java.nio.HeapByteBuffer[pos=0 lim=1 cap=1]
    at org.apache.cxf.service.invoker.AbstractInvoker.createFault(AbstractInvoker.java:170)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:136)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:204)[177:org.apache.cxf.cxf-rt-frontend-jaxrs:2.7.12]
    at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:101)[177:org.apache.cxf.cxf-rt-frontend-jaxrs:2.7.12]
    at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:58)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:94)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:272)[164:org.apache.cxf.cxf-api:2.7.12]
    ... 36 more
Caused by: org.mycompany.service.container.InvocationException: com.datastax.driver.core.exceptions.InvalidQueryException: Not enough bytes to read 0th field java.nio.HeapByteBuffer[pos=0 lim=1 cap=1]
    at org.mycompany.service.container.interceptors.ServiceInterceptor.invoke(ServiceInterceptor.java:73)
    at org.mycompany.service.container.InvocationChain.invokeNext(InvocationChain.java:82)
    at org.mycompany.service.container.interceptors.SecurityInterceptor.invoke(SecurityInterceptor.java:84)
    at org.mycompany.service.container.InvocationChain.invokeNext(InvocationChain.java:82)
    at org.mycompany.service.container.ServiceInvocationHandler.invoke(ServiceInvocationHandler.java:66)
    at com.sun.proxy.$Proxy24.add(Unknown Source)
    at sun.reflect.GeneratedMethodAccessor74.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.7.0_25]
    at java.lang.reflect.Method.invoke(Method.java:606)[:1.7.0_25]
    at org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:188)[164:org.apache.cxf.cxf-api:2.7.12]
    at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:104)[164:org.apache.cxf.cxf-api:2.7.12]
    ... 41 more
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Not enough bytes to read 0th field java.nio.HeapByteBuffer[pos=0 lim=1 cap=1]
    at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:35)
    at com.datastax.driver.core.DefaultResultSetFuture.extractCauseFromExecutionException(DefaultResultSetFuture.java:258)
    at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:174)
    at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:52)
    at mypackage.data.internal.mydao.add(Mydao.java:140)
    ... 51 more
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Not enough bytes to read 0th field java.nio.HeapByteBuffer[pos=0 lim=1 cap=1]
    at com.datastax.driver.core.Responses$Error.asException(Responses.java:97)[120:com.datastax.driver.core:2.1.0]
    at com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:110)[120:com.datastax.driver.core:2.1.0]
    at com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:235)[120:com.datastax.driver.core:2.1.0]
    at com.datastax.driver.core.RequestHandler.onSet(RequestHandler.java:367)[120:com.datastax.driver.core:2.1.0]
    at com.datastax.driver.core.Connection$Dispatcher.messageReceived(Connection.java:584)[120:com.datastax.driver.core:2.1.0]
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:70)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.handler.codec.oneone.OneToOneDecoder.handleUpstream(OneToOneDecoder.java:70)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)[118:org.jboss.netty:3.9.3.Final]
    at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)[118:org.jboss.netty:3.9.3.Final]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)[:1.7.0_25]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)[:1.7.0_25]
    ... 1 more

Answer:

There is a known issue with UDTs in driver v2.1.2 that will be addressed in v2.1.3. In the meantime, could you try the workaround given here:

https://datastax-oss.atlassian.net/browse/JAVA-500

as in:

Cluster.builder().withProtocolVersion(ProtocolVersion.V3)

and see if that helps.
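A minimal sketch of applying that workaround when building the cluster (the contact point and keyspace are placeholders for your own values):

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ProtocolVersion;
import com.datastax.driver.core.Session;

Cluster cluster = Cluster.builder()
        .addContactPoint("127.0.0.1")            // placeholder: your Cassandra host
        .withProtocolVersion(ProtocolVersion.V3) // force native protocol v3 (the JAVA-500 workaround)
        .build();
Session session = cluster.connect("myKeySpace");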

Question:

Using the query select JSON * from table_name in cqlsh, I can get results in JSON format. I want to do the same using the DataStax Java API.

StringBuilder sb = new StringBuilder("SELECT json * FROM ").append("JavaTest.data");
String query = sb.toString();
ResultSet rs = session.execute(query);
List<Row> rows = rs.all();
String q1 = rows.toString();
System.out.println(q1);

But the result is:

[  
   Row   [  
      {  
         "id":1,
         "time":"12",
         "value":"SALAM"
      }
   ],
   Row   [  
      {  
         "id":2,
         "time":" 89",
         "value":" BYE"
      }
   ],
   Row   [  
      {  
         "id":3,
         "time":" 897",
         "value":" HelloWorld"
      }
   ]
]

which is not valid JSON. I know I can get the JSON of a single row, but that way I would have to loop over all results. Searching the Java API docs, I couldn't find a built-in solution for this!


Answer:

You need to iterate over the result set and get the JSON strings as-is:

for (Row row : rs) {
    String json = row.getString(0);
    // ... do something with JSON string
}

If you want to represent them as a list of objects, it could be easier to add square brackets before and after the iteration and put commas between the JSON objects, like this:

    ResultSet rs = session.execute("select json * from test.jtest ;");
    int i = 0;
    System.out.print("[");
    for (Row row : rs) {
        if (i > 0)
            System.out.print(",");
        i++;
        String json = row.getString(0);
        System.out.print(json);
    }
    System.out.println("]");

Or you can write a custom serializer for ResultSet and put the conversion logic into it:

    ObjectMapper mapper = new ObjectMapper();
    SimpleModule module = new SimpleModule();
    module.addSerializer(ResultSet.class, new ResultSetSerializer());
    mapper.registerModule(module);

    rs = session.execute("select * from test.jtest ;");
    String json = mapper.writeValueAsString(rs);
    System.out.println("'" + json + "'");

This is a much more complex task (you need to correctly handle collections, user-defined types, etc.), but you get better control over the serialization.

The full code is available at this gist. Please note that the JSON serializer handles only NULL, boolean, and int types; everything else is treated as a string. But it's enough to convey the idea.
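For reference, a minimal sketch of what such a ResultSetSerializer could look like (assuming the Java driver's ResultSet/Row API and Jackson 2; like the gist, it treats everything except NULL, boolean, and int as a string):

import java.io.IOException;

import com.datastax.driver.core.ColumnDefinitions;
import com.datastax.driver.core.DataType;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Row;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.std.StdSerializer;

public class ResultSetSerializer extends StdSerializer<ResultSet> {

    public ResultSetSerializer() {
        super(ResultSet.class);
    }

    @Override
    public void serialize(ResultSet rs, JsonGenerator gen, SerializerProvider provider)
            throws IOException {
        gen.writeStartArray();
        for (Row row : rs) {
            ColumnDefinitions defs = row.getColumnDefinitions();
            gen.writeStartObject();
            for (int i = 0; i < defs.size(); i++) {
                String name = defs.getName(i);
                if (row.isNull(i)) {
                    gen.writeNullField(name);
                } else if (DataType.cint().equals(defs.getType(i))) {
                    gen.writeNumberField(name, row.getInt(i));
                } else if (DataType.cboolean().equals(defs.getType(i))) {
                    gen.writeBooleanField(name, row.getBool(i));
                } else {
                    // everything else is rendered as a string, as noted above
                    gen.writeStringField(name, String.valueOf(row.getObject(i)));
                }
            }
            gen.writeEndObject();
        }
        gen.writeEndArray();
    }
}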

Question:

I am trying to use the DataStax Java driver to retrieve and return JSON.

ResultSet resultSet = session.execute("SELECT JSON * FROM event");

Row row = resultSet.one();
String json1 = row.getString(0);
String json2 = row.getString("[json]");

System.out.println(resultSet.toString());

returns: ResultSet[ exhausted: false, Columns[[json](varchar)]]

At this point, I know the code to retrieve one row. I wish to return all rows as a JSON string.


Answer:

At this point, I know the code to retrieve one row. I wish to return all rows as a JSON string.

You can use Java 8's StringJoiner (https://docs.oracle.com/javase/8/docs/api/java/util/StringJoiner.html) to build a JSON array from the per-row JSON strings:

StringJoiner jsonString = new StringJoiner(",", "[", "]");
for (Row row : resultSet.all()) {
    String json = row.getString(0);
    jsonString.add(json);
}

return jsonString.toString();

Question:

I have a Spring application that reads data from a Cassandra database. Usually, I define convenient POJOs and use them as the return type of the query methods in my repository classes.

But in my current case, I don't want a mapping to a specific model object. I want the query result as a raw JSON string (or alternatively as a Map).

According to the following documentation, I can request raw JSON by using the JSON keyword in the CQL query: https://docs.datastax.com/en/cql/3.3/cql/cql_using/useQueryJSON.html

And theoretically, Spring Data JPA should support simple types as query results: https://www.petrikainulainen.net/programming/spring-framework/spring-data-jpa-tutorial-introduction-to-query-methods/

@Repository
public interface DataRepository extends CassandraRepository<String> {

  @Query(
      "SELECT JSON uid"
          + ", version"
          + ", timestamp"
          + ", message"
          + " FROM message_data WHERE uid=?0 ALLOW FILTERING")
  Optional<String> findById(UUID id);
}

But instead, I get a mapping error on the Spring application's start-up:

org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'dataController' defined in URL [jar:file:/app.jar!/BOOT-INF/classes!/.../DataController.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataRepository': Invocation of init method failed; nested exception is org.springframework.data.mapping.model.MappingException: Could not lookup mapping metadata for domain class java.lang.String
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:749) ~[spring-beans-4.3.13.RELEASE.jar!/:4.3.13.RELEASE]
    ...
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataRepository': Invocation of init method failed; nested exception is org.springframework.data.mapping.model.MappingException: Could not lookup mapping metadata for domain class java.lang.String
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1628) ~[spring-beans-4.3.13.RELEASE.jar!/:4.3.13.RELEASE]
    ...
Caused by: org.springframework.data.mapping.model.MappingException: Could not lookup mapping metadata for domain class java.lang.String
    at org.springframework.data.cassandra.repository.support.CassandraRepositoryFactory.getEntityInformation(CassandraRepositoryFactory.java:104) ~[spring-data-cassandra-1.5.9.RELEASE.jar!/:na]

What am I missing here?

SOLUTION:

According to the accepted answer, the following code snippet did the trick:

import com.datastax.driver.core.querybuilder.QueryBuilder;
import com.datastax.driver.core.querybuilder.Select;

...
@Autowired
private CassandraOperations cassandraTemplate;
...

private Optional<String> findById(final UUID id) {
  final Select select = QueryBuilder.select().json().from("message_data");
  select.where(QueryBuilder.eq(...))
        .and(QueryBuilder.eq("uid", QueryBuilder.raw(id.toString())));

  return Optional.ofNullable(cassandraTemplate.selectOne(select, String.class));
}

private void insert(final MessageEntity entity) {
  cassandraTemplate.insert(entity);
}

Answer:

Your DataRepository is extending CassandraRepository<String>, which is the problem.

The CassandraRepository declaration goes like CassandraRepository<T, ID>, where:

  • T - the domain type the repository manages

  • ID - the type of the id of the entity the repository manages

In your case, T would be the entity/table class you are using for the message_data table, and ID would be UUID.

If you need to run the query without a repository, use cassandraTemplate:

@Autowired
private CassandraOperations cassandraTemplate;
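For completeness, a minimal sketch of running the JSON query through the template, mirroring the asker's solution above (selectOne returns the single [json] column as a String; the UUID is bound directly here instead of via QueryBuilder.raw):

import com.datastax.driver.core.querybuilder.QueryBuilder;
import com.datastax.driver.core.querybuilder.Select;

Select select = QueryBuilder.select().json().from("message_data");
select.where(QueryBuilder.eq("uid", id));
String json = cassandraTemplate.selectOne(select, String.class);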

Question:

I'm trying to read data from a Cassandra table via Java + SparkSession; the method should return it as JSON.

Here is my DB:

CREATE TABLE user (
    user_id        uuid,
    email          text,
    first_name     text,
    last_name      text,
    user_password  text,
    created_date   timestamp,
    updated_date   timestamp,
    PRIMARY KEY (user_id)
) WITH comment = 'List of all registered and active users';

and here is the code which should return the JSON:

public String getAccountData(UUID userid) throws ClassNotFoundException, SQLException {
    SparkSession sparkSession = config.getSparkSession();
    //"SELECT user.first_name,user.last_name, user.email FROM chat.user where user.id="+userid+";");

    Account account = new Account();
    Encoder<Account> accountEncoder = Encoders.bean(Account.class);

    return sparkSession
            .read()
            .format("org.apache.spark.sql.cassandra")
            .options(new HashMap<String, String>() {
                {
                    put("keyspace", "chat");
                    put("table", "user");
                }
            })
            .load()
            .select("first_name", "last_name", "email")
            .filter("user_id = '" + userid + "'")
            .toJSON()
            .as(accountEncoder)
            .toString();
}

and here is my Account.java file:

package rest.account;

import java.io.Serializable;


public class Account implements Serializable {

   private String user_id;
   private String first_name;
   private String last_name;   
   private String email;

   public Account(){}

   public Account(String user_id, String first_name, String last_name, String email){
      this.user_id = user_id;
      this.first_name = first_name;
      this.last_name = last_name;
      this.email = email;
   }
   //------------------------------
   public String getId() {
      return user_id;
   }

   public void setId(String user_id) {
      this.user_id = user_id;
   }

   //------------------------------
   public String getFirstName() {
      return first_name;
   }

   public void setFirstName(String first_name) {
      this.first_name = first_name;
   }

   //------------------------------
   public String getLastName() {
      return last_name;
   }

   public void setLastName(String lastName) {
      this.last_name = lastName; // fixed: assign the parameter, not the field to itself
   }
   //------------------------------    
   public String getEmail() {
      return email;
   }

   public void setEmail(String email) {
      this.email = email;
   }        
}

And here is the output error:

HTTP Status 500 - org.glassfish.jersey.server.ContainerException: org.apache.spark.sql.AnalysisException: cannot resolve 'email' given input columns: [value];

type Exception report

message org.glassfish.jersey.server.ContainerException: org.apache.spark.sql.AnalysisException: cannot resolve 'email' given input columns: [value];

description The server encountered an internal error that prevented it from fulfilling this request.

exception

javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: org.apache.spark.sql.AnalysisException: cannot resolve 'email' given input columns: [value];
    at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:489)
    at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:427)
    at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
    ... (remaining servlet/filter frames omitted)

root cause

org.glassfish.jersey.server.ContainerException: org.apache.spark.sql.AnalysisException: cannot resolve 'email' given input columns: [value];
    at org.glassfish.jersey.servlet.internal.ResponseWriter.rethrow(ResponseWriter.java:278)
    at org.glassfish.jersey.servlet.internal.ResponseWriter.failure(ResponseWriter.java:260)
    at org.glassfish.jersey.server.ServerRuntime$Responder.process(ServerRuntime.java:509)
    ... (remaining Jersey/servlet frames omitted)

root cause

org.apache.spark.sql.AnalysisException: cannot resolve 'email' given input columns: [value];
    at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
    at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:67)
    at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:58)
    at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.resolveAndBind(ExpressionEncoder.scala:245)
    at org.apache.spark.sql.Dataset.as(Dataset.scala:359)
    at rest.account.AccountManager.getAccountData(AccountManager.java:58)
    at rest.account.AccountService.getAccountData(AccountService.java:28)
    ... (remaining reflection/Jersey dispatch frames omitted)

note: The full stack trace of the root cause is available in the Apache Tomcat/9.0.0.M13 logs.


Without this JSON conversion code, the method returns, for example:

[first_name: string, last_name: string, email: string]

but not the real values like First Name, Last Name, email@email.email.

I will appreciate any help!


Answer:

The problem is that toJSON() converts the Dataset into a Dataset<String> with a single column named value, so the subsequent .as(accountEncoder) can no longer resolve first_name, last_name, or email (hence the "cannot resolve 'email' given input columns: [value]" error). Drop the bean encoder and take the JSON string directly; in your getAccountData method, try the code below:

return sparkSession
        .read()
        .format("org.apache.spark.sql.cassandra")
        .options(new HashMap<String, String>() {
            {
                put("keyspace", "chat");
                put("table", "user");
            }
        })
        .load()
        .select("first_name", "last_name", "email")
        .filter("user_id = '" + userid + "'")
        .toJSON()
        .first();
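If you ever need all matching rows rather than just the first one, you could collect the JSON strings and join them into an array. A sketch (dataset stands for the loaded and filtered Dataset built above; collectAsList() returns a List<String> here because toJSON() yields a Dataset<String>):

List<String> jsonRows = dataset.toJSON().collectAsList(); // one JSON string per row
String jsonArray = "[" + String.join(",", jsonRows) + "]"; // wrap into a JSON array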

Question:

This is the code I have used in my program, but there are errors. Please give some suggestions with corrected code.

session.execute("INSERT INTO users JSON '{'id':'user123' , 'age':21 ,'state':'TX'}';");

The errors point to this one statement, so I thought it was not necessary to present the whole code here. The users table has already been created in the Cassandra database with the columns id, age, and state. I could not find a proper answer to this problem anywhere; I hope it gets solved here.


Answer:

Here is the working query, followed by the Java code where I insert it and the results. Note that JSON requires double-quoted keys and string values (escaped inside the Java string literal); the single quotes inside your JSON payload are what make the statement invalid:

"INSERT INTO users JSON '{\"id\":888 , \"age\":21 ,\"state\":\"TX\"}'";

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

public class CasandarConnect {

    public static void main(String[] args) {
        String serverIP = "127.0.0.1";
        String keyspace = "mykeyspace";

        Cluster cluster = Cluster.builder()
                .addContactPoints(serverIP)
                .build();

        Session session = cluster.connect(keyspace);

        String cqlStatement = "INSERT INTO users JSON '{\"id\":888 , \"age\":21 ,\"state\":\"TX\"}'";
        session.execute(cqlStatement);
    }
}

Result:
cqlsh:mykeyspace> select * from users;

 id   | age | state
------+-----+-------
 1745 |  12 | smith
  123 |  21 |    TX
  888 |  21 |    TX
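Alternatively, you can avoid most of the escaping by binding the JSON document to a prepared statement, since INSERT ... JSON accepts a bind marker for the whole payload. A sketch using the same session as above (the values are hypothetical):

import com.datastax.driver.core.PreparedStatement;

String json = "{\"id\": 999, \"age\": 30, \"state\": \"CA\"}";
PreparedStatement ps = session.prepare("INSERT INTO users JSON ?");
session.execute(ps.bind(json));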

Question:

Is there a way to serialize a general object to JSON and deserialize the JSON back into the object? If the object is an entity, I can use the Jackson 2 library for this. But if the object is a general class, how can I do it?

For example, I'd like to serialize com.datastax.driver.core.querybuilder.Update to JSON and save it to the DB, then fetch and deserialize it back into an Update object, and finally use it as the argument of com.datastax.driver.core.Session.execute(Statement) to execute it.

Is it possible to reproduce the Update object this way?

COMPLEMENT:

My real purpose is to store an Update and then retrieve and execute it. So I thought I could save it as a JSON string, then retrieve it, deserialize it into the original Update instance, and execute it. If that's not a good approach, how can I store an Update and then retrieve and execute it? Though I can store the text query of the Update and later retrieve and execute that text, some other information in the Update may be lost, like the ConsistencyLevel.


Answer:

Just convert the QueryBuilder.update(...) statement into a string using the toString() method.

If you want to store, retrieve, and execute a QueryBuilder update query, you don't have to serialize and deserialize it to JSON:

String query = QueryBuilder.update("exp").with(QueryBuilder.set("data", "This is a test data")).toString();
//Now you can store the text query directly to cassandra
//And retrieve the text query
session.execute(query); //Execute the query
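Regarding the ConsistencyLevel concern from the question: the consistency level is not part of the generated query string, so you would have to store it alongside the text and re-apply it on execution. A minimal sketch using the driver's SimpleStatement:

import com.datastax.driver.core.ConsistencyLevel;
import com.datastax.driver.core.SimpleStatement;
import com.datastax.driver.core.Statement;

Statement stmt = new SimpleStatement(query);       // the stored text query from above
stmt.setConsistencyLevel(ConsistencyLevel.QUORUM); // re-apply the stored level
session.execute(stmt);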

Here is the implementation of toString():

@Override
public String toString() {
    try {
        if (forceNoValues)
            return getQueryString();
        // 1) try first with all values inlined (will not work if some values require custom codecs,
        // or if the required codecs are registered in a different CodecRegistry instance than the default one)
        return maybeAddSemicolon(buildQueryString(null, CodecRegistry.DEFAULT_INSTANCE)).toString();
    } catch (RuntimeException e1) {
        // 2) try next with bind markers for all values to avoid usage of custom codecs
        try {
            return maybeAddSemicolon(buildQueryString(new ArrayList<Object>(), CodecRegistry.DEFAULT_INSTANCE)).toString();
        } catch (RuntimeException e2) {
            // Ugly but we have absolutely no context to get the registry from
            return String.format("built query (could not generate with default codec registry: %s)", e2.getMessage());
        }
    }
}

You can see that it returns the actual query string.

Question:

I got the following List<ColumnFamilyDefinition> data from Cassandra:

ThriftCfDef[
            keyspace=sample_data,
            name=application_data,
            columnType=STANDARD,
            comparatorType=me.prettyprint.hector.api.ddl.ComparatorType@757e1d2f,
            subComparatorType=<null>,
            comparatorTypeAlias=,
            subComparatorTypeAlias=,
            comment=,
            rowCacheSize=0.0,
            rowCacheSavePeriodInSeconds=0,
            keyCacheSize=0.0,
            readRepairChance=0.1,
            columnMetadata=[
                ThriftColumnDef[
                    name=java.nio.HeapByteBuffer[
                        pos=3208 lim=3216 cap=14616
                    ],
                    validationClass=org.apache.cassandra.db.marshal.UTF8Type,
                    indexType=<null>,
                    indexName=<null>
                ],
                ThriftColumnDef[
                    name=java.nio.HeapByteBuffer[
                        pos=3271 lim=3280 cap=14616
                    ],
                    validationClass=org.apache.cassandra.db.marshal.UTF8Type,
                    indexType=<null>,
                    indexName=<null>
                ]
            ],
            gcGraceSeconds=864000,
            keyValidationClass=org.apache.cassandra.db.marshal.UTF8Type,
            keyValidationAlias=,
            defaultValidationClass=org.apache.cassandra.db.marshal.BytesType,
            id=3656,
            maxCompactionThreshold=32,
            minCompactionThreshold=4,
            memtableOperationsInMillions=0.0,
            memtableThroughputInMb=0,
            memtableFlushAfterMins=0,
            keyCacheSavePeriodInSeconds=0,
            replicateOnWrite=true,
            compactionStrategy=org.apache.cassandra.db.compaction.SizeTieredCompactionStrategy,
            compactionStrategyOptions={

            },
            compressionOptions={
                sstable_compression=org.apache.cassandra.io.compress.SnappyCompressor
            },
            mergeShardsChance=0.0,
            rowCacheProvider=<null>,
            keyAlias=<null>,
            rowCacheKeysToSave=0
        ]

I am getting the error below because of ThriftColumnDef:

HttpMessageNotWritableException : Could not write JSON: (was java.nio.BufferUnderflowException) (through reference chain: java.util.HashMap["sample_data"]->java.util.UnmodifiableRandomAccessList[2]->me.prettyprint.cassandra.service.ThriftCfDef["columnMetadata"]->java.util.ArrayList[0]->me.prettyprint.cassandra.service.ThriftColumnDef["name"]->java.nio.HeapByteBuffer["long"]); nested exception is org.codehaus.jackson.map.JsonMappingException: (was java.nio.BufferUnderflowException) (through reference chain: java.util.HashMap["sample_data"]->java.util.UnmodifiableRandomAccessList[2]->me.prettyprint.cassandra.service.ThriftCfDef["columnMetadata"]->java.util.ArrayList[0]->me.prettyprint.cassandra.service.ThriftColumnDef["name"]->java.nio.HeapByteBuffer["long"])

Below is the complete stack trace:

org.springframework.http.converter.HttpMessageNotWritableException: Could not write JSON: (was java.nio.BufferUnderflowException) (through reference chain: java.util.HashMap["sample_data"]->java.util.UnmodifiableRandomAccessList[2]->me.prettyprint.cassandra.service.ThriftCfDef["columnMetadata"]->java.util.ArrayList[0]->me.prettyprint.cassandra.service.ThriftColumnDef["name"]->java.nio.HeapByteBuffer["long"]); nested exception is org.codehaus.jackson.map.JsonMappingException: (was java.nio.BufferUnderflowException) (through reference chain: java.util.HashMap["sample_data"]->java.util.UnmodifiableRandomAccessList[2]->me.prettyprint.cassandra.service.ThriftCfDef["columnMetadata"]->java.util.ArrayList[0]->me.prettyprint.cassandra.service.ThriftColumnDef["name"]->java.nio.HeapByteBuffer["long"])
    at org.springframework.http.converter.json.MappingJacksonHttpMessageConverter.writeInternal(MappingJacksonHttpMessageConverter.java:143)
    at org.springframework.http.converter.AbstractHttpMessageConverter.write(AbstractHttpMessageConverter.java:179)
    at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter$ServletHandlerMethodInvoker.writeWithMessageConverters(AnnotationMethodHandlerAdapter.java:1031)
    at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter$ServletHandlerMethodInvoker.handleResponseBody(AnnotationMethodHandlerAdapter.java:989)
    at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter$ServletHandlerMethodInvoker.getModelAndView(AnnotationMethodHandlerAdapter.java:938)
    at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.invokeHandlerMethod(AnnotationMethodHandlerAdapter.java:438)
    at org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter.handle(AnnotationMethodHandlerAdapter.java:424)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882)
    at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:643)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:723)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:293)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:861)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:606)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:662)
Caused by: org.codehaus.jackson.map.JsonMappingException: (was java.nio.BufferUnderflowException) (through reference chain: java.util.HashMap["sample_data"]->java.util.UnmodifiableRandomAccessList[2]->me.prettyprint.cassandra.service.ThriftCfDef["columnMetadata"]->java.util.ArrayList[0]->me.prettyprint.cassandra.service.ThriftColumnDef["name"]->java.nio.HeapByteBuffer["long"])
    at org.codehaus.jackson.map.JsonMappingException.wrapWithPath(JsonMappingException.java:218)
    at org.codehaus.jackson.map.JsonMappingException.wrapWithPath(JsonMappingException.java:183)
    at org.codehaus.jackson.map.ser.SerializerBase.wrapAndThrow(SerializerBase.java:131)
    at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:183)
    at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
    at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:430)
    at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
    at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
    at org.codehaus.jackson.map.ser.ContainerSerializers$IndexedListSerializer.serializeContents(ContainerSerializers.java:304)
    at org.codehaus.jackson.map.ser.ContainerSerializers$IndexedListSerializer.serializeContents(ContainerSerializers.java:254)
    at org.codehaus.jackson.map.ser.ContainerSerializers$AsArraySerializer.serialize(ContainerSerializers.java:142)
    at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:430)
    at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
    at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
    at org.codehaus.jackson.map.ser.ContainerSerializers$IndexedListSerializer.serializeContents(ContainerSerializers.java:304)
    at org.codehaus.jackson.map.ser.ContainerSerializers$IndexedListSerializer.serializeContents(ContainerSerializers.java:254)
    at org.codehaus.jackson.map.ser.ContainerSerializers$AsArraySerializer.serialize(ContainerSerializers.java:142)
    at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
    at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:212)
    at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:23)
    at org.codehaus.jackson.map.ser.StdSerializerProvider._serializeValue(StdSerializerProvider.java:600)
    at org.codehaus.jackson.map.ser.StdSerializerProvider.serializeValue(StdSerializerProvider.java:280)
    at org.codehaus.jackson.map.ObjectMapper.writeValue(ObjectMapper.java:1352)
    at org.springframework.http.converter.json.MappingJacksonHttpMessageConverter.writeInternal(MappingJacksonHttpMessageConverter.java:140)
    ... 24 more
Caused by: java.nio.BufferUnderflowException
    at java.nio.Buffer.nextGetIndex(Buffer.java:478)
    at java.nio.HeapByteBuffer.getLong(HeapByteBuffer.java:387)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.codehaus.jackson.map.ser.BeanPropertyWriter.get(BeanPropertyWriter.java:467)
    at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:402)
    at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
    ... 44 more

How do I convert this Thrift data to valid JSON to return as a response to the end user?


Answer:

The problem was with the byte data below, which needs to be converted to a String:

name=java.nio.HeapByteBuffer[
  pos=3208 lim=3216 cap=14616
]

The method below can be used to convert the byte data to a String.

import java.nio.ByteBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.Charset;
import java.nio.charset.CharsetDecoder;

Charset charset = Charset.forName("UTF-8");
CharsetDecoder decoder = charset.newDecoder();
String data = "";
int oldPosition = byteBuffer.position();
try {
    // decode the buffer contents (e.g. the column name above) as UTF-8
    data = decoder.decode(byteBuffer).toString();
    // restore the position so the buffer can be read again later
    byteBuffer.position(oldPosition);
    System.out.println(data);
} catch (CharacterCodingException e) {
    e.printStackTrace();
}
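Applied to the Hector metadata above, you could, for example, extract all column names up front instead of letting Jackson touch the raw buffers. A sketch (assuming Hector's ColumnFamilyDefinition/ColumnDefinition API, with UTF-8 names as indicated by the validationClass):

import java.nio.ByteBuffer;
import java.nio.charset.Charset;
import java.util.ArrayList;
import java.util.List;

import me.prettyprint.hector.api.ddl.ColumnDefinition;
import me.prettyprint.hector.api.ddl.ColumnFamilyDefinition;

public static List<String> columnNames(ColumnFamilyDefinition cfDef) {
    List<String> names = new ArrayList<String>();
    for (ColumnDefinition col : cfDef.getColumnMetadata()) {
        // duplicate() so decoding does not disturb the original buffer's position
        ByteBuffer name = col.getName().duplicate();
        names.add(Charset.forName("UTF-8").decode(name).toString());
    }
    return names;
}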

Hope this helps anyone else with this problem.

Question:

I am new to CQL. I am using a Docker environment to run Cassandra.

Previously, I created two tables (restaurants and Inspection) and inserted data from CSV, with the following settings:

Since joins are not supported in CQL, I need to re-insert the joined data set (JSON) into a new table (called InspectionrestaurantNY).

Therefore, I tried to create the InspectionrestaurantNY table:

Then, I have a JAR which helps me load the JSON file. But I got an error, and I don't know what table (InspectionrestaurantNY) schema I should create to insert the JSON data.

I ran java -jar JSonFile2Cassandra.jar -host 192.168.99.101 -port 3000 -keyspace restaurantsNY -columnFamily InspectionsRestaurants -file InspectionsRestaurantsNY.json, and it showed the following error:

And my JSON file is stored like this:

What table schema should I build first to insert the JSON data?

How do I solve the Java error?

Thank you so much.


Answer:

It seems like you are using the wrong table name when running the JAR to insert the JSON. The command you shared is:

java -jar JSonFile2Cassandra.jar -host 192.168.99.101 -port 3000 -keyspace restaurantsNY -columnFamily InspectionsRestaurants -file InspectionsRestaurantsNY.json

Shouldn't it be:

java -jar JSonFile2Cassandra.jar -host 192.168.99.101 -port 3000 -keyspace restaurantsNY -columnFamily InspectionsRestaurantsNY -file InspectionsRestaurantsNY.json

i.e., use the correct table name InspectionsRestaurantsNY for the -columnFamily argument in the above command.

Also, it's generally better not to use camel case, as CQL identifier names are case-insensitive. If you really want case-sensitive names, you should enclose them in double quotes; if double quotes are not used, Cassandra converts mixed-case names to lower case. But in the above command I don't think that is the cause of the error; I think it's the wrong column family name. A quick illustration of the quoting rule is shown after the link below.

See here for mixed-case names: https://docs.datastax.com/en/cql/3.3/cql/cql_reference/ucase-lcase_r.html
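A quick illustration of the case-folding rule, run through the Java driver (MyTable is a hypothetical name used only for this example):

// Unquoted identifiers are folded to lower case by CQL:
session.execute("CREATE TABLE MyTable (id int PRIMARY KEY)");     // stored as mytable
// Double-quoted identifiers preserve case (this creates a distinct table):
session.execute("CREATE TABLE \"MyTable\" (id int PRIMARY KEY)"); // stored as MyTable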