In the context of Mule 3, we sometimes need to create a zip file containing n files, where n keeps changing as per the requirement.
There are two ways in which we can achieve this:
- Create a unique folder per request, dump all the required files inside this folder, and finally zip the folder.
- Create a ByteArrayOutputStream, wrap it in a ZipOutputStream, then create a ZipEntry for every file and write it to the ZipOutputStream; finally the ByteArrayOutputStream holds the zip file content.
Approach 1 :
- There are numerous ways of doing this; one of them is the walk-file-tree feature of Java NIO, where we can zip the entire directory and its sub-directories, which is exactly what we need.
- For an in-depth understanding, one of the good write-ups is https://www.baeldung.com/java-nio2-file-visitor
Sample Code :
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipCompress {

    public static void compress(String dirPath) {
        final Path sourceDir = Paths.get(dirPath);
        String zipFileName = dirPath.concat(".zip");
        // try-with-resources ensures the zip stream is closed even if the walk fails
        try (ZipOutputStream outputStream = new ZipOutputStream(new FileOutputStream(zipFileName))) {
            Files.walkFileTree(sourceDir, new SimpleFileVisitor<Path>() {
                @Override
                public FileVisitResult visitFile(Path file, BasicFileAttributes attributes) {
                    try {
                        // Entry names are relative to the source directory,
                        // so the folder structure is preserved inside the zip
                        Path targetFile = sourceDir.relativize(file);
                        outputStream.putNextEntry(new ZipEntry(targetFile.toString()));
                        byte[] bytes = Files.readAllBytes(file);
                        outputStream.write(bytes, 0, bytes.length);
                        outputStream.closeEntry();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    return FileVisitResult.CONTINUE;
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
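Calling the helper is straightforward; the directory path below is just an illustrative example, and the resulting archive is written next to the folder as /tmp/request-123.zip:

public class ZipCompressDemo {
    public static void main(String[] args) {
        // Zips the per-request folder into /tmp/request-123.zip
        ZipCompress.compress("/tmp/request-123");
    }
}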
- Another easy option is Apache Commons IO's built-in FileUtils to write each file into a specific directory, and then use ZipOutputStream to finally zip it.
Sample Code :
// the second argument is a java.io.File for the destination file, e.g. new File(requestDir, fileName)
FileUtils.copyInputStreamToFile(inputStream, destinationFile);
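Putting the pieces together, a rough per-request sketch could look like the following; requestId, fileNames and fetchFileStream(...) are hypothetical placeholders for whatever your Mule flow actually provides, and ZipCompress is the walk-file-tree helper from the first sample:

import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
import org.apache.commons.io.FileUtils;

public class FolderBasedZipper {

    public static String stageAndZip(String requestId, List<String> fileNames) throws IOException {
        // Unique folder per request so parallel requests do not collide
        File requestDir = new File("/tmp/zip-work/" + requestId);
        requestDir.mkdirs();

        for (String fileName : fileNames) {
            try (InputStream in = fetchFileStream(fileName)) {
                // Commons IO creates the destination file (and parent directories) if needed
                FileUtils.copyInputStreamToFile(in, new File(requestDir, fileName));
            }
        }

        // Re-use the walk-file-tree compressor from the earlier sample
        ZipCompress.compress(requestDir.getAbsolutePath());
        return requestDir.getAbsolutePath() + ".zip";
    }

    private static InputStream fetchFileStream(String fileName) {
        // placeholder: replace with your object store / DB retrieval
        throw new UnsupportedOperationException("implement for your store");
    }
}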
- The common thing in both variants is using ZipOutputStream to finally zip the entire directory.
- Writing and reading files involves I/O operations, which will be a little slower, depending on how many files we need to read / write.
Approach 2 :
- Suppose, as in the scenario described in Approach 1, we are getting files from an object store (S3, Apache MinIO, a DB ..) one file at a time. Instead of writing everything to the file system, zipping the files in memory would be a lot faster.
- From a memory-consumption point of view, the files should definitely not be huge. In one of the scenarios I worked on, we needed to zip PDF files whose sizes were hardly in the tens of KB, so even if at some point we were getting thousands of PDFs for a particular client, we would not even reach 100 MB (load-testing result). In this scenario performance was more important, and with the different systems interacting (as in our case) we decided to go with in-memory zipping of the files.
Code Sample :
// In the below example, JavaPojoObject is a simple POJO class holding relevant details
// about the filename, storeLocation etc. I find it convenient to work with a list of
// these, which can easily be converted to JSON and back to Java or any other format.
try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
    try (ZipOutputStream zos = new ZipOutputStream(baos)) {
        for (JavaPojoObject obj : javaPojoObjectList) {
            String filename = obj.getFileName();
            // placeholder: fetch the file content from your object store
            byte[] fileContent = yourImplementationToGetFileFromYourObjectStore();
            ZipEntry individualEntry = new ZipEntry(filename);
            individualEntry.setSize(fileContent.length);
            zos.putNextEntry(individualEntry);
            zos.write(fileContent);
            zos.closeEntry();
        }
    }
    // baos now has the entire zip file content; do with it what you need ..
}
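In a Mule 3 flow, one convenient option is to build the zip inside a custom Java component and return the bytes so they become the outgoing payload. This is only a sketch, assuming a component implementing Mule 3's Callable interface; buildZip() is a placeholder for the loop shown above:

import java.io.ByteArrayOutputStream;
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class InMemoryZipComponent implements Callable {

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        // buildZip() stands in for the ByteArrayOutputStream / ZipOutputStream loop above
        try (ByteArrayOutputStream baos = buildZip()) {
            // Returning byte[] makes the zip content the new Mule payload
            return baos.toByteArray();
        }
    }

    private ByteArrayOutputStream buildZip() {
        // placeholder: run the in-memory zipping loop from the code sample here
        return new ByteArrayOutputStream();
    }
}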
