Hot questions for Spring Data for MongoDB



I am trying to execute an aggregate operation using Spring Data MongoDB 3.6-rc4.

Aggregation agg = newAggregation(
    lookup("orders", "orderId", "_id", "order")
);
List<BasicDBObject> results = mongoOperations.aggregate(agg, "transactions", BasicDBObject.class).getMappedResults();

But I get the following error when running the query:

2017-11-24 17:03:41,539 WARN : Command execution of { "aggregate" : "transactions" , "pipeline" : [ { "$lookup" : { "from" : "orders" , "localField" : "orderId" , "foreignField" : "_id" , "as" : "order"}}]} failed: The 'cursor' option is required, except for aggregate with the explain argument
2017-11-24 17:03:41,574 ERROR org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.dao.InvalidDataAccessApiUsageException: Command execution failed:  Error [The 'cursor' option is required, except for aggregate with the explain argument], Command = { "aggregate" : "transactions" , "pipeline" : [ { "$lookup" : { "from" : "orders" , "localField" : "orderId" , "foreignField" : "_id" , "as" : "order"}}]}; nested exception is com.mongodb.MongoCommandException: Command failed with error 9: 'The 'cursor' option is required, except for aggregate with the explain argument' on server localhost:27017. The full response is { "ok" : 0.0, "errmsg" : "The 'cursor' option is required, except for aggregate with the explain argument", "code" : 9, "codeName" : "FailedToParse" }] with root cause
com.mongodb.MongoCommandException: Command failed with error 9: 'The 'cursor' option is required, except for aggregate with the explain argument' on server localhost:27017. The full response is { "ok" : 0.0, "errmsg" : "The 'cursor' option is required, except for aggregate with the explain argument", "code" : 9, "codeName" : "FailedToParse" }
    at com.mongodb.CommandResult.getException( ~[mongo-java-driver-3.5.0.jar:na]
    at com.mongodb.CommandResult.throwOnError( ~[mongo-java-driver-3.5.0.jar:na]
    at ~[spring-data-mongodb-1.10.8.RELEASE.jar:na]
    at ~[spring-data-mongodb-1.10.8.RELEASE.jar:na]
    at ~[spring-data-mongodb-1.10.8.RELEASE.jar:na]

Thanks in advance!!


MongoDB 3.6 changed how the aggregate command works: aggregations now require a cursor. We adapted Spring Data MongoDB 2.1 accordingly, but not previous versions.

Aggregations must be invoked through the collection's aggregate(…) method instead of calling the command directly. This is also the reason why we didn't backport the change. executeCommand(…) is no longer called and we don't want to break compatibility in a bugfix release.

The easiest approach for you may be to override the aggregate(…) method and call the appropriate driver method, DBCollection.aggregate(…), with the mapped aggregation pipeline.
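A hedged sketch of that workaround (the helper class and method names here are illustrative, not Spring internals; it assumes the 3.x mongo-java-driver, where DBCollection.aggregate(pipeline, options) issues a cursor-based aggregate):

```java
import java.util.List;

import com.mongodb.AggregationOptions;
import com.mongodb.Cursor;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;

// Illustrative helper: run an already-mapped pipeline through the
// collection's aggregate(...) method so the driver adds the 'cursor'
// option that MongoDB 3.6 requires.
public class CursorAwareAggregation {

    public static Cursor aggregate(DBCollection collection, List<DBObject> mappedPipeline) {
        AggregationOptions options = AggregationOptions.builder()
                .batchSize(100) // cursor batch size; pick what fits your result sizes
                .build();
        return collection.aggregate(mappedPipeline, options);
    }
}
```

In a MongoTemplate subclass you would call something like this from your overridden aggregate(…), passing the pipeline Spring Data has already mapped.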


I have a very simple Spring Boot application that uses Spring Data MongoDB.

All I want to do is set a JSR-303 validation rule that says the object I'm saving must have a username. I read that JSR-303 support was added to Spring Data MongoDB in version 1.1, so I assumed that when I save an object it gets validated, but this isn't the case.

Does anyone have a simple example setup that shows how this works?

My User pojo looks like

public class User {

    @Id
    private String id;

    @NotNull(message = "User Name is compulsory")
    private String userName;
    private String password;

    public User() {}

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    public String getPassword() {
        return password;
    }

    public void setPassword(String password) {
        this.password = PasswordAuthService.hash(password);
    }
}

I saw somewhere that validation only kicks in if a Validator bean exists in the context, so I tried updating my Application class (which contains all the configuration) to look like

@SpringBootApplication
public class Application {

    @Bean
    public Validator getValidator() {
        LocalValidatorFactoryBean validator = new LocalValidatorFactoryBean();
        return validator;
    }

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}



First make sure that you have a JSR-303 validator on the classpath, for example Hibernate Validator:

<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-validator</artifactId>
</dependency>

If you use Java config, the way to go is to create 2 beans:

@Bean
public ValidatingMongoEventListener validatingMongoEventListener() {
    return new ValidatingMongoEventListener(validator());
}

@Bean
public LocalValidatorFactoryBean validator() {
    return new LocalValidatorFactoryBean();
}
Voilà! Validation is working now.


I have written some code. I want to make the questionId field in the BaseQuestion class auto-generated. Any solution for that? I am not using JPA, so I can't use the @GeneratedValue annotation. So how do I indicate that this field is auto-generated? The code is below.




package model;

import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document(collection = "basequestion")
public class BaseQuestion {

    @Id
    private String id;
    private int questionId;
    private String responseType;
    private boolean required;
    private boolean active;
    private String questionCode;
    private QuestionText questionText;
    private String category;
    private List<Responses> responses;

    public QuestionText getQuestionText() {
        return questionText;
    }

    public void setQuestionText(QuestionText questionText) {
        this.questionText = questionText;
    }

    public List<Responses> getResponses() {
        return responses;
    }

    public void setResponses(List<Responses> responses) {
        this.responses = responses;
    }

    public int getQuestionId() {
        return questionId;
    }

    public void setQuestionId(int questionId) {
        this.questionId = questionId;
    }

    public String getResponseType() {
        return responseType;
    }

    public void setResponseType(String responseType) {
        this.responseType = responseType;
    }

    public boolean getRequired() {
        return required;
    }

    public void setRequired(boolean required) {
        this.required = required;
    }

    public String getQuestionCode() {
        return questionCode;
    }

    public void setQuestionCode(String questionCode) {
        this.questionCode = questionCode;
    }

    public String getCategory() {
        return category;
    }

    public void setCategory(String category) {
        this.category = category;
    }

    public boolean isActive() {
        return active;
    }

    public void setActive(boolean active) {
        this.active = active;
    }
}

package repository;

import org.springframework.data.mongodb.repository.MongoRepository;

import model.BaseQuestion;

public interface AuditProjectRepository extends MongoRepository<BaseQuestion, String> {

    public BaseQuestion findByQuestionId(int questionId);

    public BaseQuestion findByQuestionCode(String questionCode);

    public Long deleteByQuestionId(int questionId);
}

MongoDB comes with a sophisticated ObjectId generation feature, but often you have just jumped ship from a relational database and still want an easy-to-read, easy-to-communicate numeric identifier field that automatically increments every time a new record is inserted.

One neat suggestion from the MongoDB tutorial is to use a counter collection with a 'counter name' as its id and a 'seq' field to store the last used number.

When developing with Spring Data MongoDB, this neat trick can be written as a simple service. Here I use the collection name as the counter name so it's easy to guess / remember.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.stereotype.Service;

import com.model.CustomSequences;

@Service
public class NextSequenceService {

    @Autowired
    private MongoOperations mongo;

    public int getNextSequence(String seqName) {
        CustomSequences counter = mongo.findAndModify(
            Query.query(Criteria.where("_id").is(seqName)),
            new Update().inc("seq", 1),
            FindAndModifyOptions.options().returnNew(true).upsert(true),
            CustomSequences.class);
        return counter.getSeq();
    }
}

CustomSequences is just a simple class representing the collection. Please beware of the int data type: it limits you to a maximum of 2^31 entries.


@Document(collection = "customSequences")
public class CustomSequences {

    @Id
    private String id;
    private int seq;

    // getters and setters
}

Then, when inserting a new entry (with the help of Spring Data MongoDB repository support), just set the id field like this before you save it:

BaseQuestion baseQuestion = new BaseQuestion();
baseQuestion.setQuestionId(nextSequenceService.getNextSequence("basequestion"));
/* Rest all values */
auditProjectRepository.save(baseQuestion);

If you don't like this way, then you need to use MongoDB lifecycle events and generate the value in an onBeforeConvert listener, using the same approach as above.

The approach above is also thread-safe, as findAndModify() is an atomic operation.
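As a plain-Java analogy (not Spring Data API), the guarantee an atomic increment gives you can be seen with AtomicInteger within a single JVM: every concurrent caller receives a distinct value, with no duplicates or lost updates.

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CounterDemo {
    public static void main(String[] args) throws InterruptedException {
        AtomicInteger seq = new AtomicInteger();          // stands in for the 'seq' field
        Set<Integer> seen = ConcurrentHashMap.newKeySet();
        ExecutorService pool = Executors.newFixedThreadPool(8);

        // 800 concurrent "inserts", each asking for the next sequence value
        for (int i = 0; i < 800; i++) {
            pool.submit(() -> seen.add(seq.incrementAndGet()));
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);

        System.out.println(seen.size());                  // prints 800: no duplicates
    }
}
```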


I have a Query with Pageable:

Query query = new Query().with(new PageRequest(page, size));

How can I execute it with MongoTemplate? I don't see a single method returning Page<T>.


It's true that MongoTemplate doesn't have findXXX methods that accept Pageables.

But you can use Spring Data's PageableExecutionUtils for that.

In your example it would look like this:

Pageable pageable = new PageRequest(page, size);
Query query = new Query().with(pageable);
List<XXX> list = mongoTemplate.find(query, XXX.class);
return PageableExecutionUtils.getPage(
                       list,
                       pageable,
                       () -> mongoTemplate.count(query, XXX.class));

Like the original Spring Data repositories, PageableExecutionUtils will issue a count request and wrap the result into a nice Page for you.

Here you can see that Spring is doing the same internally.


I am using Spring Data MongoDB with the JSON-based query methods, and am unsure how to allow optional parameters in a search query.

For instance - say I had the following function

@Query("{ 'name' : { $regex : ?0, $options : 'i' }, 'createdDate' : { $gte : ?1, $lt : ?2 } }")
List<MyItem> getItemsLikeNameByDateRange(String name, Date startDateRange, Date endDateRange);

but I didn't want to apply the name regex match, or the date range restriction, if NULL values were passed to the method.

At the moment it looks like I might have to build the query using the mongoTemplate.

Are there any alternatives - or is using mongoTemplate the best option?



To implement this in Boolean logic, I start from the rule "apply the filter only when the parameter is non-null" and rewrite it into operations that are available in query languages:

:query != null -> field == :query
!(:query != null) || (field == :query)
(:query == null) || (field == :query)

In plain SQL, this is done as

where (null = :query) or (field = :query)
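As an illustration only (plain Java, no Spring Data or MongoDB involved), the null-or-match predicate behaves like this:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class OptionalFilter {

    // (query == null) || (field == query): when query is null the filter
    // matches everything, otherwise it matches only equal values.
    static List<String> filter(List<String> fields, String query) {
        return fields.stream()
                .filter(f -> query == null || f.equals(query))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> data = Arrays.asList("a", "b", "a");
        System.out.println(filter(data, null)); // prints [a, b, a]
        System.out.println(filter(data, "a"));  // prints [a, a]
    }
}
```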

In MongoDB this is done through the $where

{ $where: '?0 == null || this.field == ?0' } 

We can speed this up a little by using regular Mongo operators for the field match and keeping $where only for the null check, at the expense of some readability:

{ $or : [ { $where: '?0 == null' } , { field : ?0 } ] } 

So what you have is

@Query("{ $or : [ { $where: '?0 == null' } , { field : ?0 } ] }")
List<Something> findAll(String query, Pageable pageable);

This can be further expanded to handle arrays for $in / $all clauses:

@Query("{ $or : [ { $where: '?0.length == 0' } , { field : { $in : ?0 } } ] }")
List<Something> findAll(String query, Pageable pageable);


I recently discovered GridFS which I'd like to use for file storage with metadata. I just wondered if it's possible to use a MongoRepository to query GridFS? If yes, can someone give me an example?

I'd also take a solution using Hibernate, if there is some.

The reason is: My metadata contains a lot of different fields and it would be much easier to query a repository than to write some new Query(Criteria.where(...)) for each scenario. And I hopefully could also simply take a Java object and provide it via REST API without the file itself.

EDIT: I'm using

  • Spring 4 Beta
  • Spring Data Mongo 1.3.1
  • Hibernate 4.3 Beta


There is a way to solve this:

@Document(collection = "fs.files")
public class MyGridFsFile {

    @Id
    private ObjectId id;
    public ObjectId getId() { return id; }

    private String filename;
    public String getFilename() { return filename; }

    private long length;
    public long getLength() { return length; }
}


You can write a normal Spring Mongo Repo for that. Now you can at least query the fs.files collection using a Spring Data Repo. But: You cannot access the file contents this way.

For getting the file contents itself, you've got (at least) 2 options:

  1. Use GridFsOperations: GridFSDBFile file = gridOperations.findOne(Query.query(Criteria.where("_id").is(id))); InputStream is = file.getInputStream();

  2. Have a look at the source code of GridFSDBFile. There you can see, how it internally queries the fs.chunks collection and fills the InputStream.

(Option 2 is really low level; Option 1 is a lot easier, and that code is maintained by the MongoDB Java driver devs. Option 1 would be my choice.)

Updating GridFS entries:

  • GridFS is not designed for updating file content!
  • Updating only the metadata field can be useful, though. The rest of the fields are kinda static.

You should be able to simply use your custom MyGridFsFileRepo's update method. I suggest creating a setter only for the metadata field.

Different metadata for different files:

I solved this using an abstract MyGridFsFile class with generic metadata, i.e.:

public abstract class AbstractMyGridFsFile<M extends AbstractMetadata> {

    private M metadata;
    public M getMetadata() { return metadata; }
    void setMetadata(M metadata) { this.metadata = metadata; }
}
And of course each impl has its own AbstractMetadata impl associated. What have I done? AbstractMetadata always has a field called type. This way I can find the right AbstractMyGridFsFile impl. I also have a generic abstract repository, though.

Btw: In the meantime I switched here from using Spring Repo, to use plain access via MongoTemplate, like:

protected List<A> findAll(Collection<ObjectId> ids) {
    List<A> files = mongoTemplate.find(Query.query(Criteria
            .where("_id").in(ids)
            .and("metadata.type").is(type) // this is hardcoded for each repo impl
    ), typeClass); // this is the corresponding impl of AbstractMyGridFsFile
    return files;
}
Hope this helps. I can write more, if you need more information about this. Just tell me.


I have the following class that I want to store in MongoDB using Spring Data

public class Tuple2<T extends Enum<T>> {

private String id;

@DateTimeFormat(iso = DateTimeFormat.ISO.DATE_TIME)
private final Instant timeCreated;


The DateTimeFormat annotation javadoc states:

Declares that a field should be formatted as a date time. Supports formatting by style pattern, ISO date time pattern, or custom format pattern string. Can be applied to java.util.Date, java.util.Calendar, java.lang.Long, Joda-Time value types; and as of Spring 4 and JDK 8, to JSR-310 java.time types too.

I am using Spring 4.1.1 and JDK 8, so I'd expect that it applies to Instant. However, here's what is actually stored:

"timeCreated" : {
    "seconds" : NumberLong(1416757496),
    "nanos" : 503000000
}

If I write and register a custom converter from Instant to Date, as explained in this answer, then it works; however I'd like to avoid that, as I am sure there must be a better way.

After further digging in Spring source code I've found the following class Jsr310DateTimeFormatAnnotationFormatterFactory which looks promising:

Formats fields annotated with the DateTimeFormat annotation using the JSR-310 java.time package in JDK 8.

Its source does not reference Instant, but it does reference OffsetTime and LocalTime. Even so, when I change Instant to OffsetDateTime in my example, it is still stored as a composite object instead of an ISODate.

What is missing?


I think the problem is that you are trying to use Instant as a time. Conceptually it is a point on the timeline, and it does not imply any formatting.

As we know, the Java 8 time API was developed with an eye on Joda-Time (and with the participation of Joda-Time's developers). Here is a comment from Joda-Time's Instant:

An Instant should be used to represent a point in time irrespective of any other factor, such as chronology or time zone.

That's why there are no formatting possibilities for org.joda.time.Instant in JodaDateTimeFormatAnnotationFormatterFactory, which has been in Spring since version 3.0. And it was likewise not implemented in Jsr310DateTimeFormatAnnotationFormatterFactory.

So, you should use a custom converter or consider using a more suitable class.
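For reference, the conversion such a custom converter would perform is a one-liner in plain JDK 8 (the Spring-specific registration via a @WritingConverter and CustomConversions is omitted here; the values mirror the document shown in the question):

```java
import java.time.Instant;
import java.util.Date;

public class InstantConversionDemo {
    public static void main(String[] args) {
        // The Instant from the question's document: seconds 1416757496, nanos 503000000
        Instant timeCreated = Instant.ofEpochSecond(1416757496L, 503_000_000L);

        // What a writing converter would hand to the driver (stored as an ISODate)
        Date asDate = Date.from(timeCreated);
        System.out.println(asDate.getTime());                  // prints 1416757496503

        // What a reading converter would do on the way back
        System.out.println(asDate.toInstant().toEpochMilli()); // prints 1416757496503
    }
}
```

Note that Date only has millisecond precision, so sub-millisecond nanos would be truncated by such a converter.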


I'm facing a problem with the Spring Data MongoDB Criteria API's orOperator.

Here's the query result for an irregular verb (terminal output):

> db.verb.find({'v2':'wrote'});
{ "_id" : ObjectId("5161a8adba8c6390849da453"), "v1" : "write", "v2" : "wrote", "v3" : "written" }

And I query verbs by their v1 or v2 values using Spring Data MongoDB Criteria API:

Criteria criteriaV1 = Criteria.where("v1").is(verb);
Criteria criteriaV2 = Criteria.where("v2").is(verb);
Query query = new Query(criteriaV1.orOperator(criteriaV2));
List<Verb> verbList = mongoTemplate.find(query, Verb.class);

But unfortunately verbList doesn't have any item.


As far as I remember, in order to use orOperator you should do:

Query query = new Query(new Criteria().orOperator(criteriaV1,criteriaV2));


I have the following exception when running Java app for MongoDB:

[localhost:27017] org.mongodb.driver.cluster : Exception in monitor thread while connecting to server localhost:27017 while accessing MongoDB with Java

Call stack is follows:

com.mongodb.MongoSocketOpenException: Exception opening socket
    at ~[mongodb-driver-core-3.0.4.jar:na]
    at ~[mongodb-driver-core-3.0.4.jar:na]
    at com.mongodb.connection.DefaultServerMonitor$ ~[mongodb-driver-core-3.0.4.jar:na]
    at [na:1.8.0_45]
Caused by: Connection refused: connect
    at Method) ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at ~[na:1.8.0_45]
    at com.mongodb.connection.SocketStreamHelper.initialize( ~[mongodb-driver-core-3.0.4.jar:na]
    at ~[mongodb-driver-core-3.0.4.jar:na]
    ... 3 common frames omitted

Neither of these names belongs to my application. Also, I have NO MongoDB server on localhost. I am using a remote host and set it later. The exception occurs BEFORE any of my statements concerning Mongo.


This is probably some Spring-provided bean accessing Mongo. How do I disable it?

My config contains following dependencies:

dependencies {

    //  compile('org.springframework.boot:spring-boot-starter-data-mongodb')

}

i.e. I have removed org.springframework.boot:spring-boot-starter-data-mongodb and was planning to use Mongo myself...


I found a related question: How to disable spring-data-mongodb autoconfiguration in spring-boot


I had to add an exclusion to my main annotated class, i.e. instead of

@SpringBootApplication

I should have

@SpringBootApplication(exclude = {MongoAutoConfiguration.class, MongoDataAutoConfiguration.class})