Cassandra throwing CodecNotFoundException [bigint <-> java.util.Date]

I have a Java entity with an attribute of type Date, and a Cassandra table that stores that attribute in a bigint column. When I run the code it gives me this error:

com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [bigint <-> java.util.Date]

Can you please explain why Cassandra is throwing this exception and how to fix it?


You are inserting a java.util.Date into a bigint column; that's why you are getting this error.

Use the getTime() method to get the time in milliseconds as a long, which you can insert into the bigint column.

Example :

Date date = new Date(); // or whatever Date value you already have
long timeInMillis = date.getTime();

Use timeInMillis to insert into Cassandra.
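
For example, a minimal sketch of binding that value with the driver (this assumes an already connected Session named session; the keyspace, table, and column names are made up for illustration):

PreparedStatement prepared = session.prepare(
        "INSERT INTO my_keyspace.my_table (id, created_at) VALUES (?, ?)");
// created_at is a bigint column, so bind the long value directly
session.execute(prepared.bind().setInt("id", 1).setLong("created_at", timeInMillis));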

or

you can change the column type from bigint to timestamp; then you can insert a java.util.Date directly, without converting it to milliseconds.
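
A minimal sketch of that variant, assuming the same illustrative table but with created_at changed to a timestamp column:

PreparedStatement prepared = session.prepare(
        "INSERT INTO my_keyspace.my_table (id, created_at) VALUES (?, ?)");
// created_at is now a timestamp column, so a java.util.Date can be bound directly
session.execute(prepared.bind().setInt("id", 1).setTimestamp("created_at", new Date()));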

|-------------------|----------------|
|  CQL3 data type   |   Java type    |
|-------------------|----------------|
|      bigint       |      long      |
|-------------------|----------------|
|     timestamp     | java.util.Date |
|-------------------|----------------|

More on CQL - Java Mapping : https://docs.datastax.com/en/developer/java-driver/3.1/manual/#cql-to-java-type-mapping

As the driver's Javadoc puts it, this exception is thrown when a suitable com.datastax.driver.core.TypeCodec cannot be found by com.datastax.driver.core.CodecRegistry instances.


I believe the issue is that you are attempting to store a java.util.Date object in a CQL bigint. The type that maps to bigint in the Java driver is a long (see the 'CQL to Java type mapping' section of the docs).

Assuming you mean to store the epoch milliseconds in this column you have a few options.

  1. Change the column type to timestamp, which maps to java.util.Date (and is set/accessed via setTimestamp/getTimestamp).
  2. Use setLong in conjunction with Date.getTime() to convert the Date to a long representing epoch milliseconds (a short sketch of this follows the example below).
  3. Create and register a custom codec that maps java.util.Date to bigint, i.e.:
import com.datastax.driver.core.*;
import com.datastax.driver.extras.codecs.MappingCodec;

import java.util.Date;

public class CodecTest {

    static class DateToBigintCodec extends MappingCodec<Date, Long> {

        DateToBigintCodec() {
            // creates a mapping from bigint <-> Date.
            super(TypeCodec.bigint(), Date.class);
        }

        @Override
        protected Date deserialize(Long value) {
            return new Date(value);
        }

        @Override
        protected Long serialize(Date value) {
            return value.getTime();
        }
    }

    public static void main(String args[]) {
        TypeCodec<Date> codec = new DateToBigintCodec();
        Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
        try {
            // register custom codec
            cluster.getConfiguration().getCodecRegistry().register(codec);

            Date date = new Date();
            Session session = cluster.connect();
            // insert Date value into column v, which is a bigint.
            // schema:
            // CREATE TABLE simple.tbl (k int PRIMARY KEY, v bigint)
            PreparedStatement prepared = session.prepare("insert into simple.tbl (k, v) values (?, ?)");
            BoundStatement bound = prepared.bind();
            bound.setInt("k", 0);
            bound.setTimestamp("v", date);
            session.execute(bound);

            // Retrieve column v as a Date.
            Row row = session.execute("select v from simple.tbl").one();
            System.out.println(row.getTimestamp("v"));
        } finally {
            cluster.close();
        }
    }
}
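
For completeness, option 2 needs no custom codec at all; here is a minimal sketch against the same simple.tbl schema and session as above:

// v is a bigint column, so bind the epoch milliseconds as a long
PreparedStatement prepared = session.prepare("insert into simple.tbl (k, v) values (?, ?)");
BoundStatement bound = prepared.bind();
bound.setInt("k", 0);
bound.setLong("v", new Date().getTime());
session.execute(bound);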



I have a tried and tested solution to this problem. The following is code to insert or retrieve timestamps in Cassandra:

import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.Date;

import com.datastax.driver.core.BoundStatement;
import com.datastax.driver.core.PreparedStatement;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

Timestamp timeStamp = Timestamp.valueOf(df.format(new Date()));

// session must already be connected; see the note below
PreparedStatement ps = session.prepare(
        "SELECT id, time from keySpace.tableName where timestamp >= ? order by timestamp allow filtering");

BoundStatement bs = ps.bind(timeStamp);

ResultSet rs = session.execute(bs);

if (rs != null) {
    for (Row row : rs) {
        row.getTimestamp("time");
        row.getString("id");
    }
}

Note: session.execute(bs) needs a connected Session. If you want to know where this session comes from, see How to connect to Cassandra from a Java class: https://stackoverflow.com/a/16871984/9292502
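
In short, a minimal sketch of how such a Session is typically obtained (the contact point and keyspace name here are just placeholders):

Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
Session session = cluster.connect("keySpace");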






