
I am having big problems trying to push my code to my bitbucket server. My push is stuck on:

    Enumerating objects: 536, done.
    Counting objects: 100% (536/536), done.
    Delta compression using up to 12 threads
    Compressing objects: 100% (209/209), done.
    Writing objects:  82% (242/295), 5.05 MiB | 236.00 KiB/s

The "Writing objects" has gotten over 2GB and then crashed with error:

I have tried git config --global http.postBuffer 524288000. I have also read over a few articles and worked through guides such as https://confluence.atlassian.com/stashkb/git-push-fails-fatal-the-remote-end-hung-up-unexpectedly-282988530.html, along with various Stack Overflow questions, all to no avail.

I ran the command git diff --stat --cached origin/ and it returned a list of the changed files (originally shown as a screenshot).

I have been working on this for days and really don't know what to do, as I can't push my code to get it onto my staging server.


3 Answers



As described in the Atlassian support documentation, your repository has a hard cap of 2 GB, and the server prevents you from pushing once you are above this limit.

A repository is mainly meant to store source code, so why is your repo that big?
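A quick way to check where a local clone stands against that cap is to ask Git how big its object database is, for example:

    git count-objects -vH

The size and size-pack figures give a rough idea of how much data Git is carrying; if the packed size is anywhere near 2 GB, the push will run into the server-side limit no matter how small the latest change is.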


4 Comments

The code changes should not be as large as it seems, which is why I'm confused about it. What do you suggest I do to check this?
Are there any files other than your source code files that may have unintentionally found their way into your repo? Like a database file, or any kind of data in the same folder as your source code or one of its folders?
*or one of its subfolders
I added a new image to show you. I found there was one such file and removed it, but it's still not working; I've left the push running for over an hour.

You must have accidentally committed some huge file (e.g., a bootleg copy of The Fifth Element as a 4.7 GB DVD image) and then removed that file from subsequent commits.

Using git diff is no help here—at least, not without a lot of care. Imagine you had three phases in your life: when you were 12, you were 40 kg, then somehow you ballooned to 200 kg during your college years, after which you lost most of that and are back to a more sensible 75 kg. Now, compare a photo of yourself when you were 12, to a photo of yourself today. What do you see? git diff effectively works by picking two snapshots—two photos of your source files—and comparing just those two photos. It completely skips over all the intermediate snapshots.

Git, however, doesn't skip all the intermediate snapshots: each of those is a commit. If you have a commit with a big file in it, the commit remains in the history, which is nothing more or less than the set of commits in the repository, even if you have subsequently slimmed down all the new commits.

You need to find the commits that contain the large files, and discard them and all subsequent commits in favor of new and improved commits that lack the large files. See How to remove/delete a large file from commit history in Git repository? and related questions.
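If it isn't obvious which commits those are, one generic way to hunt for the largest blobs anywhere in the history (a sketch using standard plumbing commands, not tailored to this particular repository) is:

    git rev-list --objects --all \
      | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
      | awk '$1 == "blob" { print $3, $4 }' \
      | sort -rn \
      | head -20

That prints the twenty biggest blobs along with the paths they were committed under; git log --all --oneline -- <path> then shows which commits touched a given offender.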

10 Comments

I see, this actually makes sense! What I found when I did the git diff was that there was a big file in there. I then deleted it from the repo in hopes it would fix things and reran git diff, where it no longer showed up. I also modified my .gitignore to help in case it was some image I wasn't seeing.
I have tried running git filter-branch --force --index-filter 'git rm --cached --ignore-unmatch magento/magento/pub/media.zip' --prune-empty --tag-name-filter cat -- --all. This seemed to have caused something of a rollback; however, it required me to pull and push the code, which created heaps of conflicts. I'm still having the same issue.
Even after running the clean-up and rerunning git diff, I can't seem to see what the file is or how I can clear it. I'm hoping that I can push all the code and see if there are any code snippets I can pull from all the commits to recreate a fully working version.
The "however, it required me to pull" part is your mistake here. After filtering a repository you have a new repository that is not compatible with the old one. You must discard all old clones and use only new clones of the new repository. Using git pull means run git fetch, then git merge, i.e., you're telling Git that you'd like to combine all your new and improved commits with your original not-cleaned-up commits.
In general, this may require using git push --force, since sites like GitHub don't really let you delete-and-then-re-create a repository. This is somewhat dangerous: it throws away the old commits in favor of the new ones. But that's what you want, provided the new ones are correct and the old ones are now old-and-lousy. :-)
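Concretely, the usual sequence after a filter run looks something like this (the remote name, and the URL, are placeholders, not values from this question):

    # publish the rewritten history, overwriting what the server has
    git push --force origin --all
    git push --force origin --tags

    # on every other machine: throw away the old clone and re-clone,
    # instead of pulling, so the pre-filter commits never come back
    git clone <repository-url>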

Had the same issue with Fork + Bitbucket. I don't know what the cause was, but here are some insights:

  • I added a commit with 3 images, 0.7 MB each.

  • I tried increasing the post buffer even though I've been working with much larger files before.

  • I checked my repo size - far below the limit.

  • Tried restarting the Fork app and then the entire system.

  • Tried removing the branch from the origin and pushing it again.

What actually solved my case:

  • Created an alternative branch just before the problematic commit.

  • Pushed it to the origin with no issues.

  • Cherry-picked the problematic commit.

  • Pushed it to the origin with no issues.

What's more interesting, I also went back to my original branch and tried pushing once more. It went through with no issues. I suppose there was some caching issue somewhere.
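For anyone who wants to try the same workaround, it boils down to roughly these commands (the branch name and the commit references are placeholders):

    # branch off from the commit just before the one that refuses to push
    git checkout -b temp-branch <commit-before-problem>
    git push origin temp-branch

    # bring the problematic commit over on its own and push again
    git cherry-pick <problematic-commit>
    git push origin temp-branch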

