
Hi, I thought QMovie could take a QBuffer? This is my code:

a = QByteArray(img)
b = QBuffer(a)
self.movie = QMovie(b, 'GIF')
  • I don't know what your underlying problem is, and the usual approach is to use a .qrc resource file, but my answer addresses the question as asked, so I think it is correct. If it is, don't forget to mark it as accepted; if you are not sure how, check the tour, that's the best way to say thanks. :) Commented Aug 14, 2018 at 13:21
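For reference, here is a minimal sketch of the .qrc approach mentioned in the comment above, assuming a resources.qrc file that lists congratulations.gif and has been compiled with pyrcc5 resources.qrc -o resources_rc.py (the file names are illustrative, not from the question):

from PyQt5 import QtGui
import resources_rc  # importing the compiled resource module registers the resources with Qt

# load the GIF through the Qt resource system instead of a QBuffer
movie = QtGui.QMovie(":/congratulations.gif")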

1 Answer


You want to use the second constructor:

QMovie::QMovie(QIODevice *device, const QByteArray &format = QByteArray(), QObject *parent = nullptr)

As you can see, the second argument is expected to be a QByteArray, which in PyQt5 can be passed as bytes. The following example shows how to do it:

import sys
from PyQt5 import QtCore, QtGui, QtWidgets

if __name__ == '__main__':
    app = QtWidgets.QApplication(sys.argv)

    # load the GIF data from a file into a QByteArray
    path = "congratulations.gif"
    file = QtCore.QFile(path)
    if not file.open(QtCore.QIODevice.ReadOnly):
        sys.exit(-1)
    ba = file.readAll()

    # wrap the data in a QBuffer and open it for reading
    buf = QtCore.QBuffer(ba)
    if not buf.open(QtCore.QIODevice.ReadOnly):
        sys.exit(-1)

    # pass the open QIODevice and the format as bytes
    movie = QtGui.QMovie(buf, b"gif")

    w = QtWidgets.QLabel()
    w.setMovie(movie)
    movie.start()
    w.show()
    sys.exit(app.exec_())
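Applied to the snippet from the question, the same idea looks like the sketch below. The names img and self.label are assumptions (img holding the raw bytes of a GIF, and an existing QLabel on the widget); the important points are opening the buffer, passing the format as bytes rather than a str, and keeping references to the QByteArray and QBuffer on self so they are not garbage-collected while the movie plays:

from PyQt5 import QtCore, QtGui

# inside a QWidget subclass; img is assumed to hold the raw bytes of a GIF
self.ba = QtCore.QByteArray(img)             # keep a reference for the movie's lifetime
self.buf = QtCore.QBuffer(self.ba)
self.buf.open(QtCore.QIODevice.ReadOnly)     # open the device before handing it to QMovie
self.movie = QtGui.QMovie(self.buf, b"gif")  # format as bytes, not 'GIF'
self.label.setMovie(self.movie)              # self.label is an existing QLabel (assumed)
self.movie.start()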