I have some CSV files that are larger than GitHub's file size limit of 100.00 MB, so I have been trying to use the Git Large File Storage (LFS) extension.
From the Git LFS site: "Large file versioning: Version large files—even those as large as a couple GB in size—with Git."
I have applied the following in the folders of concern:
git lfs track "*.csv"
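For reference, my understanding of the expected workflow is that the tracking rule, and the .gitattributes file it writes, have to be in place before the CSV files themselves are staged; something like this (filenames taken from the listing below):

$ git lfs install                           # one-time setup of the LFS filters
$ git lfs track "*.csv"                     # writes the rule into .gitattributes
$ git add .gitattributes                    # the rule itself must be committed
$ git add Raw-count-data-major-roads1.csv   # files staged after tracking become LFS pointers
$ git commit -m "Track CSV files with LFS"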
However, when I push:
remote: error: File Time-Delay-ftn/Raw-count-data-minor-roads1.csv is 445.93 MB; this exceeds GitHub's file size limit of 100.00 MB
remote: error: File Time-Delay-ftn/Raw-count-data-major-roads.csv is 295.42 MB; this exceeds GitHub's file size limit of 100.00 MB
When I look at the folder in question:
-rw-r----- 1 user staff 42B 23 Oct 12:34 .gitattributes
-rw-r--r-- 1 user staff 1.3K 19 Oct 14:32 DfT_raw_major_manipulation.py
-rw-r--r-- 1 user staff 1.2K 16 Oct 15:08 DfT_raw_minor_manipulation.py
drwxr-xr-x 21 user staff 714B 22 Oct 11:35 Driving/
-rwxr-xr-x@ 1 user staff 295M 19 Oct 14:47 Raw-count-data-major-roads1.csv*
-rwxr-xr-x@ 1 user staff 446M 16 Oct 14:52 Raw-count-data-minor-roads1.csv*
When I open the .gitattributes file in vim, you can see the LFS setup:
*.csv filter=lfs diff=lfs merge=lfs -text
What am I doing wrong?
UPDATE
When I query
git lfs ls-files
I get nothing returned. This indicates that, despite the *.csv filter being successfully added to the .gitattributes file, the CSV files are not being picked up by LFS.
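If I understand correctly, a file that LFS has picked up is stored in the repository as a small text pointer rather than the data itself, so this can also be checked by inspecting the committed blob directly (path taken from the push error above; the oid and size shown are placeholders):

$ git show HEAD:Time-Delay-ftn/Raw-count-data-minor-roads1.csv | head -3
version https://git-lfs.github.com/spec/v1
oid sha256:<hash>
size <bytes>

If that prints raw CSV data instead of a pointer like this, the files were committed before the tracking rule took effect.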
Simply adding Git LFS configuration to an existing repository will not retroactively convert large files that have already been committed. Those files remain in your history, and GitHub will refuse your pushes.
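You can confirm that the big blobs are still reachable from earlier commits with something like:

$ git log --all --oneline -- '*.csv'

Every commit listed there still carries the full CSV content, and that is what GitHub's server-side check rejects on push.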
You need to rewrite your history to introduce Git LFS to your existing commits. I recommend the BFG Repo-Cleaner tool, which recently added LFS support.
You should be able to convert the historical usage of your CSV files by running:
$ java -jar ~/bfg-1.12.5.jar --convert-to-git-lfs '*.csv' --no-blob-protection
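After BFG has rewritten the commits, the usual BFG clean-up applies: expire the old reflog entries, garbage-collect the now-unreferenced blobs, and force-push the rewritten history (the force is required because every rewritten commit has a new ID). At that point git lfs ls-files should list your CSV files:

$ git reflog expire --expire=now --all && git gc --prune=now --aggressive
$ git push --force
$ git lfs ls-files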