I have a JPA entity class that contains a blob field, like this:
@Entity
public class Report {

    private Long id;
    private byte[] content;

    @Id
    @Column(name = "report_id")
    @SequenceGenerator(name = "REPORT_ID_GENERATOR", sequenceName = "report_sequence_id", allocationSize = 1)
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "REPORT_ID_GENERATOR")
    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    @Lob
    @Column(name = "content")
    public byte[] getContent() {
        return content;
    }

    public void setContent(byte[] content) {
        this.content = content;
    }
}
One of its records contains some large data (over 3 GB), inserted into the database using a DBMS procedure.
Application users are supposed to be able to download the content of these records, so I have implemented a method that streams the fetched result to clients' browsers.
The problem is that JPQL select queries tend to fetch the entire object from the database before handing it to the application, so whenever I try to access this record through JPA I get an "unable to allocate enough memory" exception.
I have seen some solutions to this problem that use JDBC connections to stream the data from the database, but I could not find any JPA-specific solution.
Does anyone have a clue how I should solve this problem?
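For reference, the JDBC-based solutions I have seen boil down to reading the blob as an InputStream and copying it to the response in fixed-size chunks, so only one small buffer is ever in memory. A rough sketch of that copy loop (the JDBC and servlet wiring is omitted; the class and method names here are illustrative, not from any library):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class BlobStreamer {

    // Chunked copy: only one 8 KB buffer lives in memory at a time,
    // regardless of how large the blob is.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // In the real case, `in` would be resultSet.getBlob("content").getBinaryStream()
        // and `out` the servlet response's output stream; byte arrays stand in here.
        byte[] data = "three gigabytes, in spirit".getBytes();
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied);
    }
}
```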
This is a late answer, but for those still looking for a solution: I found a good article by Thorben Janssen on the Thoughts on Java blog. The drawback is that it's Hibernate-specific, but you seem to be using Hibernate anyway. Basically, the solution is to use java.sql.Blob as the attribute type in your entity:
@Entity
public class Book {

    @Id
    @GeneratedValue
    private Long id;

    private String title;

    @Lob
    private Clob content;

    @Lob
    private Blob cover;

    ...
}
You can then use Hibernate's BlobProxy, which lets you create a Blob from an InputStream without materializing the whole content in memory. For the details, take a look at the article here.
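BlobProxy itself needs a Hibernate runtime, but the java.sql.Blob contract it implements can be exercised with the JDK's own SerialBlob. Here is a hedged sketch of how a download endpoint could consume such a field, streaming it in chunks instead of calling getBytes on the whole thing (the class and method names are illustrative; SerialBlob merely stands in for the Blob Hibernate would return):

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import javax.sql.rowset.serial.SerialBlob;

public class BlobReadDemo {

    // Stream the blob to `out` in 8 KB chunks; this mirrors what a
    // download endpoint would do with the servlet response's output stream.
    public static long stream(Blob blob, OutputStream out) throws Exception {
        try (InputStream in = blob.getBinaryStream()) {
            byte[] buffer = new byte[8192];
            long total = 0;
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
                total += read;
            }
            return total;
        }
    }

    public static void main(String[] args) throws Exception {
        // SerialBlob stands in for the Blob Hibernate would hand back
        // (or one created via BlobProxy on the write side).
        Blob cover = new SerialBlob(new byte[20_000]);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        System.out.println(stream(cover, sink)); // 20000
    }
}
```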