
javafx - getPublicStorage("Pictures") lists no files

Calling

File picturesDir = Services.get(StorageService.class)
        .flatMap(s -> s.getPublicStorage("Pictures"))
        .orElseThrow(() -> new RuntimeException("Error retrieving public storage"));
for (File pic : picturesDir.listFiles()) {
    System.out.println("file " + pic.getName());
}

lists no files. I would expect it to list all the files from my image gallery on the iPhone.

Calling s.getPublicStorage(""), however, does list two folders: gluon and Pictures.

How can I access it properly?


1 Reply


As discussed previously in this question, there is no Swing or AWT on mobile, so once you have a JavaFX Image you can't use SwingFXUtils to convert it.

The solution proposed in that question works on Android, where you can easily get the File. On iOS, by contrast, the picture lives in the gallery, and the current Charm Down PicturesService doesn't access the gallery.

Instead of modifying that service (with native code like this one), the idea is to get a byte array from the JavaFX Image that you can send to a web service. It comes down to two steps:

  • Get the image pixels as a byte array.
  • Encode the byte array into a string.

If you check the iOS implementation for PicturesService::takePhoto, the image from the iOS native layer is sent to the JavaFX layer via Base64 encoding and decoding.

Solution 1

This code snippet works for me:

// Take a photo and add image to imageView
Button button = new Button("Take Photo");
button.setOnAction(e ->
    Services.get(PicturesService.class)
        .ifPresent(s -> s.takePhoto(false).ifPresent(imageView::setImage)));

// Encode image
imageView.imageProperty().addListener((obs, ov, image) -> {
    if (image != null) {
        // 1. image to byte array
        PixelReader pixelReader = image.getPixelReader();
        int width = (int) image.getWidth(); 
        int height = (int) image.getHeight(); 
        byte[] buffer = new byte[width * height * 4]; 
        pixelReader.getPixels(0, 0, width, height, PixelFormat.getByteBgraInstance(), buffer, 0, width * 4); 

        // 2. Encode to String
        String encoded = Base64.getEncoder().encodeToString(buffer);

        // 3. send string...
    }
});

You can check that the process works by reversing these steps and creating an image from the encoded string:

byte[] imageBytes = Base64.getDecoder().decode(encoded.getBytes(StandardCharsets.UTF_8));

WritablePixelFormat<ByteBuffer> wf = PixelFormat.getByteBgraInstance();
WritableImage writableImage = new WritableImage(width, height);
PixelWriter pixelWriter = writableImage.getPixelWriter();
pixelWriter.setPixels(0, 0, width, height, wf, imageBytes, 0, width * 4);

imageView.setImage(writableImage);
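
Note that the receiving side needs the image dimensions to rebuild the pixels, so width and height have to travel along with the encoded string. A minimal sketch, assuming a hypothetical "width;height;data" payload format (any format you control works the same way):

// send width and height along with the Base64 string
// (the ";"-separated payload is just a placeholder format)
String payload = width + ";" + height + ";" + encoded;

// on the receiving side, split the payload and decode the pixels again
String[] parts = payload.split(";", 3);
int w = Integer.parseInt(parts[0]);
int h = Integer.parseInt(parts[1]);
byte[] pixels = Base64.getDecoder().decode(parts[2]);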

Solution 2

If you try to transform the byte array from PixelReader::getPixels directly into a BufferedImage, that won't work: the byte layout is not what BufferedImage expects.

So you need to produce data in a format that the buffered image will be able to process.

Having a look at the SwingFXUtils implementation, it uses an int array instead.

So this is another possibility:

// Take a photo and add image to imageView
Button button = new Button("Take Photo");
button.setOnAction(e ->
    Services.get(PicturesService.class)
        .ifPresent(s -> s.takePhoto(false).ifPresent(imageView::setImage)));

// Encode image
imageView.imageProperty().addListener((obs, ov, image) -> {
    if (image != null) {
        // 1. image to int array
        PixelReader pixelReader = image.getPixelReader();
        int width = (int) image.getWidth(); 
        int height = (int) image.getHeight(); 
        int[] data = new int[width * height]; 
        pixelReader.getPixels(0, 0, width, height, PixelFormat.getIntArgbPreInstance(), data, 0, width); 

        // 2. int array to byte array
        ByteBuffer byteBuffer = ByteBuffer.allocate(data.length * 4);
        IntBuffer intBuffer = byteBuffer.asIntBuffer();
        intBuffer.put(data);

        // 3. Encode to String
        String encoded = Base64.getEncoder().encodeToString(byteBuffer.array());

        // 4. send string...
    }
});

Now you will have to decode the string, get the int array and create the buffered image:

// 1. Decode string
byte[] imageBytes = Base64.getDecoder().decode(encoded.getBytes(StandardCharsets.UTF_8));

// 2. get int array
ByteBuffer byteBuffer2 = ByteBuffer.wrap(imageBytes);
IntBuffer intBuffer2 = byteBuffer2.asIntBuffer();
int[] imageData = new int[intBuffer2.limit()];
intBuffer2.get(imageData);

// 3. create buffered image
BufferedImage bufferedImage = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
bufferedImage.setRGB(0, 0, width, height, imageData, 0, width);
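
From here you can persist the image with the standard ImageIO API, for instance as a PNG file. This is just a minimal sketch for the desktop/server side, and the file name is a placeholder:

// write the reconstructed BufferedImage to disk
// requires javax.imageio.ImageIO, java.io.File and java.io.IOException
try {
    ImageIO.write(bufferedImage, "png", new File("photo.png"));
} catch (IOException ex) {
    ex.printStackTrace();
}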
