70

What's the simplest one-liner to get the last commit date for a bunch of files in a Git repository (i.e., for each file, the date of the last commit that changed that file)?

The context for this is that I'm using other Bash commands to find a long list of files matching some criteria, and I'd like to just pipe this list into a Git command to get the last commit date for each file.

2
  • Were you ever able to create a git alias or whatever to dump out commit dates for a bunch of files? I have a parent folder with files and nested folders/files, etc. I want to see the last time anything in that folder changed. Commented Apr 27, 2015 at 14:01
  • 3
    @terry for i in target/*; do git log -1 --format=%ci $i; done | sort | tail -1 Commented May 2, 2018 at 15:44

6 Answers 6

98

The following command will be helpful:

git log -1 --format=%cd filename.txt 

This will print the latest change date for one file. The -1 shows one log entry (the most recent), and --format=%cd shows the commit date. See the documentation for git-log for a full description of the options.

You should be able to easily extend this for a group of files.
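One way to extend it (a minimal sketch; the `*.txt` pattern is just an illustration) is to loop over the files and print each file's name next to its last commit date:

```shell
# For each file matching a pattern, print its last commit date and its name.
# The "--" separates options from paths so git never mistakes a file for a flag.
for f in *.txt; do
  echo "$(git log -1 --format=%cd -- "$f") $f"
done
```

Quoting `"$f"` keeps file names with spaces intact within the loop body.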


5 Comments

This is helpful. Is there a better way to attach each file's name to the date than something like the following: for x in $(some-command); do echo $x $(git log -1 --format=%cd $x); done?
I don't think so, since all the % formatting codes refer to a commit rather than an individual file. echo might be the most straightforward way there.
Worth pointing out that %ci outputs ISO (ish) formatted dates; so you can sort them. Also %ai outputs the author date, rather than the committer date.
git ls-tree -r --name-only HEAD | xargs -IF git --no-pager log -1 --format='%cI F' F works for my use case.
@TerryBrown I would add a -- just before the final F, but it works fine, thanks!
12

Slightly extending Greg's answer, git log can take multiple paths in its argument. It will then show only the commits which included those paths. Therefore, to get the latest commit for a group of files:

git log -1 --format=%cd -- fileA.txt fileB.txt fileC.txt 

I'm pretty rubbish at shell scripting, so I'm not exactly sure how to construct that command via piping, but maybe that'd be a good topic for another question.
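For the piping part, one hedged sketch (where `some-command` stands in for whatever produces the file list, one name per line) is a `while read` loop, so each file gets its own `git log` call and its own output line:

```shell
# Read file names (one per line) from stdin and print each file's
# last commit date followed by its name.
some-command | while IFS= read -r f; do
  printf '%s %s\n' "$(git log -1 --format=%cd -- "$f")" "$f"
done
```

`IFS= read -r` preserves leading whitespace and backslashes in the names.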

3 Comments

Yeah, but that returns the latest commit amongst all the files' commits, instead of the latest commit for each file.
This works but sadly not inside a TravisCI environment. When TravisCI clones the repo all file dates are the date and time of the clone (today/now) and not the date and time of what is in the repo. Very frustrating.
I logged an issue with TravisCI about this here if anyone else is interested > github.com/travis-ci/travis-ci/issues/8539
10

Use git ls-files to list the files Git tracks, and then git log to format the output. But since git log will group several files together when they share the same commit, I prefer to have it process one file at a time and then sort the result.

The resulting one-liner:

for f in $(git ls-files); do git --no-pager log --color -1 --date=short --pretty=format:'%C(cyan)%ai%Creset' -- "$f"; echo " $f"; done | sort -r 

If you want to add it to your .bashrc:

function gls() { for f in $(git ls-files); do git --no-pager log --color -1 --date=short --pretty=format:'%C(cyan)%ai%Creset' -- "$f"; echo " $f"; done | sort -r; } 

Then running gls will output something like:

2019-09-30 11:42:40 -0400 a.c
2019-08-20 11:59:56 -0400 b.conf
2019-08-19 16:18:00 -0400 c.c
2019-08-19 16:17:51 -0400 d.pc

The result is in time descending order.

1 Comment

It does not work if some filepaths have spaces in them :D
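A space-safe variant (a sketch, not from the answer) avoids the word-splitting in `for f in $(git ls-files)` by having `git ls-files -z` emit NUL-delimited paths and `read -d ''` consume them:

```shell
# Space-safe version: NUL-delimited paths survive word splitting,
# so file names containing spaces are handled correctly.
git ls-files -z | while IFS= read -r -d '' f; do
  printf '%s %s\n' "$(git --no-pager log -1 --format=%ai -- "$f")" "$f"
done | sort -r
```

Note that `read -d ''` is a bashism, so this needs bash rather than plain sh.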
5

To get the last commit date in long format (a Unix epoch timestamp) for any file in Git, use the following command.

  • Command: git log -1 --format=%ct filename.txt
  • Result: 1605094689

Note:

  1. You can specify any file (with its extension) in the Git project.
  2. You can visit the git-log documentation to get a more detailed description of the options.
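The epoch timestamp can be turned back into a readable date with the `date` utility (a sketch; `filename.txt` is the example file from above):

```shell
# %ct gives the committer date as seconds since the Unix epoch.
ts=$(git log -1 --format=%ct -- filename.txt)
# GNU date accepts -d @<epoch>; on BSD/macOS use `date -r <epoch>` instead.
date -d "@$ts"
```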

Comments

2

Here's a one-liner using find (broken into several lines for readability, but thanks to the trailing backslashes, copy–paste should work):

find <dirs...> -name '<pattern>' <any other predicate to get what you want> \
  -exec git log -1 --format="AAA%ai NNN" '{}' \; \
  -exec echo '{}' XXX \; \
  | tr \\n N | sed -e 's/AAA/\n/g' | sed -e 's/NNNN/ /g' -e 's/XXX.*//g' 

The overly complex newline mangling with tr and sed is just there to get date and filename on one line, and to ignore untracked files. You have to make sure that none of your files contain those silly markers AAA XXX NNNN.

Comments

1

With PowerShell (tested on pwsh 7.2), you can get all files in a directory (optionally add option -Recurse) and then run git log for each file. To associate each file with the output from git, we bundle the two into a psobject. We can then sort these by property.

dir "path to my dir" -File | % { New-Object psobject -Property @{Commit = (git log -1 --format='%ci %s' $_); FileName = $_.Name} } | sort Commit -desc | ft 

Output looks like this:

FileName Commit
-------- ------
file1    2023-11-30 17:22:52 -0500 commit message1
file2    2023-09-27 08:41:04 -0400 commit message2

Comments
