BSERV-10055

As a Bitbucket admin, I need to restrict and enforce file size commit limits at a global level


    Details

    • UIS:
      55
    • Feedback Policy:
      We collect Bitbucket feedback from various sources, and we evaluate what we've collected when planning our product roadmap. To understand how this piece of feedback will be reviewed, see our Implementation of New Features Policy.

      Description

An issue arose on our server the other day where a single commit from an intern caused resources to become constrained and eventually exhausted. Git command processes exhausted the CPU, so hosting tickets were queued and eventually dropped.

We have the add-on https://marketplace.atlassian.com/plugins/org.christiangalsterer.stash-filehooks-plugin/server/overview, which enforces this on a per-repository basis. However, at our current scale there is no way to enforce this kind of limit globally, similar to the way GitHub does it (https://help.github.com/articles/working-with-large-files/). We have Git LFS enabled and would rather have large files stored via Git LFS instead. In short, we want to allow large files in Git, just in the right way.
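For illustration only, here is a minimal sketch of the kind of check being asked for, written as a Python pre-receive hook. The 10 MB limit, the script itself, and the idea of deploying it globally are assumptions made for this example, not an existing Bitbucket Server feature; the file-hooks plugin linked above performs an equivalent check per repository, and this request is about enforcing the same check once, at the admin level.

```python
#!/usr/bin/env python3
"""Illustrative pre-receive hook: reject pushes that introduce blobs above a size limit.

A sketch only -- the 10 MB limit and this script are assumptions for the example,
not anything Bitbucket Server ships with.
"""
import subprocess
import sys

MAX_BLOB_SIZE = 10 * 1024 * 1024  # assumed limit: 10 MB
ZERO_SHA = "0" * 40


def new_objects(old, new):
    """List object ids reachable from `new` that are not already in the repository."""
    if new == ZERO_SHA:  # ref deletion: nothing new to check
        return []
    rev_range = [new, "--not", "--all"] if old == ZERO_SHA else [f"{old}..{new}"]
    out = subprocess.run(
        ["git", "rev-list", "--objects"] + rev_range,
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.split()[0] for line in out.splitlines() if line.strip()]


def oversized_blobs(object_ids):
    """Yield (sha, size) for blobs larger than MAX_BLOB_SIZE."""
    if not object_ids:
        return
    out = subprocess.run(
        ["git", "cat-file", "--batch-check=%(objectname) %(objecttype) %(objectsize)"],
        input="\n".join(object_ids) + "\n",
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        sha, otype, size = line.split()
        if otype == "blob" and int(size) > MAX_BLOB_SIZE:
            yield sha, int(size)


def main():
    failed = False
    # git feeds one "old-sha new-sha ref-name" line per updated ref on stdin
    for update in sys.stdin:
        parts = update.split()
        if len(parts) != 3:
            continue
        old, new, ref = parts
        for sha, size in oversized_blobs(new_objects(old, new)):
            print(f"rejected {ref}: blob {sha[:10]} is {size} bytes "
                  f"(limit {MAX_BLOB_SIZE}); push it via Git LFS instead",
                  file=sys.stderr)
            failed = True
    sys.exit(1 if failed else 0)


if __name__ == "__main__":
    main()
```

Dropped into a single repository's hooks/pre-receive, a script like this rejects any push that adds a blob over the limit and points the committer at Git LFS; what this request asks for is the ability to apply the same kind of check across all repositories from the admin console instead of configuring it repository by repository.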


              People

Assignee:
Unassigned
Reporter:
markus.kobold Mark Kobold
Votes:
19
Watchers:
16

