
In the process of learning golang, I'm trying to write a web app with multiple image upload functionality.

I'm using Azure Blob Storage to store images, but I am having trouble streaming the images from the multipart request to Blob Storage.

Here's the handler I've written so far:

func (imgc *ImageController) UploadInstanceImageHandler(w http.ResponseWriter, r *http.Request, p httprouter.Params) {
    reader, err := r.MultipartReader()
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    for {
        part, partErr := reader.NextPart()

        // No more parts to process
        if partErr == io.EOF {
            break
        }

        // If part.FileName() is empty, skip this iteration.
        if part.FileName() == "" {
            continue
        }

        // Check file type
        if part.Header["Content-Type"][0] != "image/jpeg" {
            fmt.Printf("\nNot image/jpeg!")
            break
        }

        var read uint64
        fileName := uuid.NewV4().String() + ".jpg"
        buffer := make([]byte, 100000000)

        // Get size
        for {
            cBytes, err := part.Read(buffer)
            if err == io.EOF {
                fmt.Printf("\nLast buffer read!")
                break
            }
            read = read + uint64(cBytes)
        }

        stream := bytes.NewReader(buffer[0:read])
        err = imgc.blobClient.CreateBlockBlobFromReader(imgc.imageContainer, fileName, read, stream, nil)
        if err != nil {
            fmt.Println(err)
            break
        }
    }

    w.WriteHeader(http.StatusOK)
}

In the course of my research, I've read about using r.FormFile and ParseMultipartForm, but decided to try to learn how to use MultipartReader instead.

I was able to upload an image to the golang backend and save the file to my machine using MultipartReader.

At the moment, I'm able to upload files to Azure but they end up being corrupted. The file sizes seem on point but clearly something is not working.

Am I misunderstanding how to create an io.Reader for CreateBlockBlobFromReader?

Any help is much appreciated!

2 Answers


As @Mark said, you can use ioutil.ReadAll to read the content into a byte array, with code like the below.

import (
    "bytes"
    "io/ioutil"
)

partBytes, _ := ioutil.ReadAll(part)
size := uint64(len(partBytes))
blob := bytes.NewReader(partBytes)
err := blobClient.CreateBlockBlobFromReader(container, fileName, size, blob, nil)

According to the godoc for CreateBlockBlobFromReader:

The API rejects requests with size > 64 MiB (but this limit is not checked by the SDK). To write a larger blob, use CreateBlockBlob, PutBlock, and PutBlockList.

So if the size is larger than 64 MiB, the code should be like below.

import (
    "encoding/base64"
    "fmt"
)

const BLOB_LENGTH_LIMITS uint64 = 64 * 1024 * 1024

partBytes, _ := ioutil.ReadAll(part)
size := uint64(len(partBytes))
if size <= BLOB_LENGTH_LIMITS {
    blob := bytes.NewReader(partBytes)
    err := blobClient.CreateBlockBlobFromReader(container, fileName, size, blob, nil)
} else {
    // Create an empty blob
    blobClient.CreateBlockBlob(container, fileName)

    // Upload each chunk as a block, then commit the block list
    length := size / BLOB_LENGTH_LIMITS
    if size%BLOB_LENGTH_LIMITS != 0 {
        length = length + 1
    }
    blocks := make([]storage.Block, length)
    for i := uint64(0); i < length; i++ {
        start := i * BLOB_LENGTH_LIMITS
        end := (i + 1) * BLOB_LENGTH_LIMITS
        if end > size {
            end = size
        }
        chunk := partBytes[start:end]

        // Block IDs must be base64-encoded and equal-length; encode a
        // zero-padded index rather than the chunk data itself.
        blockID := base64.StdEncoding.EncodeToString([]byte(fmt.Sprintf("%08d", i)))
        blocks[i] = storage.Block{ID: blockID, Status: storage.BlockStatusUncommitted}
        err = blobClient.PutBlock(container, fileName, blockID, chunk)
        if err != nil {
            // handle the error
        }
    }
    err = blobClient.PutBlockList(container, fileName, blocks)
    if err != nil {
        // handle the error
    }
}

Hope it helps.


2 Comments

Thanks for answering @Peter, do you know why CreateBlockBlobFromReader requires the size up front, alongside the reader?
@Mark The Azure Storage SDK for Golang, like the SDKs for other languages, wraps the Storage REST APIs. The Put Blob and Put Block operations require a Content-Length header in the REST request.

A Reader can return both io.EOF and a valid count of final bytes read; it looks like those final bytes (cBytes) are not added to the read total. Also be careful: if part.Read(buffer) returns an error other than io.EOF, the read loop will never exit. Consider ioutil.ReadAll instead.

CreateBlockBlobFromReader takes a Reader, and part is a Reader, so you may be able to pass the part in directly.

You may also want to consider that Azure's block size limits might be smaller than the image; see Azure blobs.

2 Comments

Hi Mark thanks for answering. I was trying to see if I could pass in part directly to CreateBlockBlobFromReader, but I get an error from Azure saying that the body length is 0 and therefore there is a mismatch with the Content-Length (what I pass in as 'read'). Do you know why that is? I'll try passing in part directly again.
Andrew see Peter's helpful answer above.
