Cause#
Today I wanted to use Alist to back up some experimental results stored on the server, but once the upload finished, Alist reported a 413 error. 413 Content Too Large means the request body exceeds the size the server is willing or able to process. This puzzled me, because I didn't remember setting any upload limit. Searching online for 413 fixes, I found that in most cases the culprit is an nginx setting (typically `client_max_body_size`). However, after carefully checking my nginx configuration, I found nothing of the sort. So I turned to the Alist community, where I did find some proposed solutions, but none of them worked for me.
Still, the search wasn't a total loss. One user mentioned that only files larger than 100MB trigger the 413 error, and my own tests confirmed this. I then combed through every setting that might involve the number 100, but found nothing. At that moment I suddenly remembered something: I use Alist to back up my website's data automatically every day, and that data is well over 100MB. If the website backup could be uploaded, there was no reason my own files couldn't be. Comparing the two cases, the biggest difference was that the backup job accesses Alist directly by IP address and port, while I was uploading my files through a subdomain. Could the problem lie there? I quickly opened the Cloudflare dashboard and, sure enough, found exactly such a setting under the network options.
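One quick way to tell whether a 413 is coming from Cloudflare rather than your own nginx is to inspect the response headers: Cloudflare-proxied responses carry a `CF-RAY` header and typically report `Server: cloudflare`, while a 413 raised by your origin would show `nginx` instead. A minimal sketch of that check (the function name and sample header values here are hypothetical, not part of Alist):

```python
def is_cloudflare_413(status: int, headers: dict) -> bool:
    """Return True if a 413 response looks like it was issued by Cloudflare.

    Cloudflare-proxied responses include a CF-RAY header and usually
    report 'cloudflare' in the Server header; a 413 from nginx itself
    would carry an nginx Server header instead.
    """
    h = {k.lower(): v.lower() for k, v in headers.items()}
    return status == 413 and ("cf-ray" in h or "cloudflare" in h.get("server", ""))

# A 413 with Cloudflare headers points at the proxy, not the origin:
print(is_cloudflare_413(413, {"Server": "cloudflare", "CF-RAY": "8a1b2c3d4e5f-SJC"}))  # → True
# The same status from nginx itself fails the check:
print(is_cloudflare_413(413, {"Server": "nginx/1.24.0"}))  # → False
```

Had I run this kind of check first, the `CF-RAY` header alone would have pointed straight at Cloudflare.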
Finally, I had found the source of the 100MB limit: on Cloudflare's free plan, the maximum upload (request body) size is capped at 100MB, and any larger upload through the proxy is rejected with a 413.
Solution#
Now that I knew Cloudflare was causing the issue, all I had to do was bypass it and access Alist directly via the server's IP address and port.
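Concretely, that just means pointing the upload client at the origin instead of the proxied subdomain. A sketch, assuming a hypothetical server IP and Alist's default listening port of 5244 (substitute your own values):

```shell
# Hypothetical origin address; replace with your server's real IP.
ORIGIN_IP="203.0.113.10"
# 5244 is Alist's default listening port; change it if you customized yours.
ALIST_PORT="5244"

# Uploads through the subdomain pass through Cloudflare and hit the 100MB cap;
# going straight to IP:port skips the proxy entirely.
DIRECT_URL="http://${ORIGIN_IP}:${ALIST_PORT}"
echo "Upload via ${DIRECT_URL} instead of the Cloudflare-proxied subdomain"

# Optional sanity check: the direct endpoint should NOT answer as Cloudflare.
# curl -sI "${DIRECT_URL}" | grep -i '^server:'
```

One caveat: going around Cloudflare also means giving up its TLS termination and DDoS protection for that traffic, so it is best reserved for large transfers like these backups.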