Compare commits


2 Commits

Author SHA1 Message Date
BraydonKains 3237684408 fixed typos 2 months ago
BraydonKains f204f22bf6 add Talks page 2 months ago

@@ -65,11 +65,11 @@ Yeah, sort of. I am being pretty "doom and gloom" on purpose to frame this cruci
I should preface this section by clarifying that I'm relatively new to this problem at scale, and there are experts far smarter than myself working to solve and educate on these problems. I'd still like to close this article off with tips that I have for developers who are worried about how to protect themselves further in the future.
I imagine there are a number of you reading this who are upset that I am appearing to suggest every line of open source code you pull down be audited. We all know that's really not feasible at any scale larger than "demo". This is why there are so many tools, such as [sonarqube](https://www.sonarqube.org/), [snyk](https://snyk.io/), and every project's most diligent contributor [dependabot](https://github.com/features/security) (I am not affiliated with any, just a few I'm familiar with) built with features to track and audit dependencies you've brought in that may contain vulnerabilities. However, these tools don't necessary help if you've accidentally pulled in a bad dependency during development.
I imagine there are a number of you reading this who are upset that I am appearing to suggest every line of open source code you pull down be audited. We all know that's really not feasible at any scale larger than "demo". This is why there are so many tools, such as [sonarqube](https://www.sonarqube.org/), [snyk](https://snyk.io/), and every project's most diligent contributor [dependabot](https://github.com/features/security) (I am not affiliated with any, just a few I'm familiar with) built with features to track and audit dependencies you've brought in that may contain vulnerabilities. However, these tools don't necessarily help if you've accidentally pulled in a bad dependency during development.
When a package is published on npm, save for select circumstances, the version published is there in perpetuity unless npm decides to take it down. Even though `faker@6.6.6`, which essentially deletes all of the package's code, is published on npm, its publication does not remove the history of earlier `faker` releases. Code can only be [unpublished from the registry](https://docs.npmjs.com/unpublishing-packages-from-the-registry) if the package has no dependents, which `faker` had a number of. In this case, and in the case of `colors@1.4.44-liberty-2`, npm provides the tools to protect against these releases if you are a direct dependent.
If you are a newer developer, I recommend understanding [semantic versioning](https://semver.org/) as fully as you can; it is one of the greatest defenses against much of what I've mentioned in this article. The most common practice is to use the `^` caret prefix on most of your dependencies, because this is what npm does by default when installing a new dependency. It means that updates to a new major version will not be installed, but the latest minor and patch releases within the current major version will be used. There is also the `~` tilde prefix, which additionally disallows updates to the minor version and only accepts new patch releases. Providing no prefix pins a dependency at a particular version. If you aren't already, I highly recommend using more discretion when deciding which prefix to use on the new and existing dependencies you bring in.
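To make the prefixes concrete, here is a small hypothetical `package.json` `dependencies` block (the package names and versions are made up purely for illustration):
```json
{
  "dependencies": {
    "some-widget-lib": "^1.2.3",
    "some-logging-lib": "~1.4.0",
    "some-pinned-lib": "1.0.5"
  }
}
```
Here `^1.2.3` accepts any `1.x.y` release at or above `1.2.3`, `~1.4.0` accepts only `1.4.x` patch releases at or above `1.4.0`, and the bare `1.0.5` accepts exactly that version.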
An important caveat here is that even people who were more reserved by only allowing patch releases of `colors`, which should suggest only bringing in bug/vulnerability fixes, still got screwed here by unexpectedly allowing a *very* breaking change. However, this defense is still good against typical, benign cases.
An important caveat here is that even people who were more reserved and only allowed patch releases of `colors`, which should suggest only bringing in bug/vulnerability fixes, still got screwed by unexpectedly pulling in a *very* breaking change. However, this defense is still good against typical benign cases.
The issue with increased discretion is that it usually means more manual work when it's time to update. Working in the Node.js ecosystem means implicitly accepting that everything moves incredibly fast, and it's a danger to your application's continued health to let things fall too far out of date. While far from a perfect solution, one of my favourite ways to combat this is [`npm-check-updates`](https://www.npmjs.com/package/npm-check-updates). It provides an optional interactive environment for selecting the updates you feel confident are safe. It is a nice convenience in a process that haunts Node.js developers everywhere.
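As a rough sketch of that workflow (double-check the flags against the tool's current documentation before relying on them):
```sh
# Report which dependencies have releases newer than what package.json currently allows.
npx npm-check-updates

# Interactively choose which of those updates get written back into package.json.
npx npm-check-updates --interactive
```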
@@ -77,4 +77,4 @@ The sad truth is that there probably is no true way to stop this from affecting
# Conclusion
I think what happened with `colors` and `faker` is a fascinating case study into how many of us have become complacent with npm's hidden safety concerns. I love open source software, and I believe we can all do out part to ensure we use it safely. I hope this article provided a new perspective to the situation, and whether you agree or disagree feel free to reach out and discuss! I am interested to hear about your experiences.
I think what happened with `colors` and `faker` is a fascinating case study in how many of us have become complacent with npm's hidden safety concerns. I love open source software, and I believe we can all do our part to ensure we use it safely. I hope this article provided a new perspective on the situation, and whether you agree or disagree, feel free to reach out and discuss! I am interested to hear about your experiences.

@@ -4,7 +4,7 @@ date = "2021-03-13"
author = "Braydon Kains"
+++
NOTE (Feb 15, 2025): I think this post kinda sucks and I largely disagree with a majority of it now. I've decided to keep it here as posterity, but my modern sensibilities no longer line up with what I wrote here.
NOTE (Feb 15, 2025): I think this post kinda sucks and I largely disagree with a majority of it now. I've decided to keep it here for posterity, but my modern sensibilities no longer line up with what I wrote here.
---

@@ -193,7 +193,7 @@ That explains how we're getting program counter `0x0`!
### The Solution
While I spent a considerable amount of time experimenting and digging through the `go tool linker` and `cgo` source code to try to understand what was going on (and I did learn a lot), I ultimately found the problem with a good old-fashioned `git bisect`, which landed me at commit [1f29f39](https://github.com/golang/go/commit/1f29f39795e736238200840c368c4e0c6edbfbae).
The message of that commit: `cmd/link: don't export all symbols for ELF external linking`
The problematic code change was from this:
```go
// Force global symbols to be exported for dlopen, etc.
@@ -214,9 +214,9 @@ if ctxt.IsELF {
}
}
```
What does this mean? The code used to always pass the `-rdynamic` flag to `gcc`, which passes `--export-dynamic` to `ld` under the hood. The change for the code changed to only pass `-rdynamic` to `gcc` if the particular linker flag is not supported. The justification for this is in [this issue](https://github.com/golang/go/issues/53579) (TL;DR because this is unnecessary in most cases it wastes space on a majority of binaries). While it's hard to know exactly when the `--export-dynamic-symbol` flag was added to `ld`, but it seems like the only plausible reason that this issue only occurs on an `ld` version that is high enough.
What does this mean? The code used to always pass the `-rdynamic` flag to `gcc`, which passes `--export-dynamic` to `ld` under the hood. The code was changed to only pass `-rdynamic` to `gcc` if the newer `--export-dynamic-symbol` linker flag is not supported. The justification for this is in [this issue](https://github.com/golang/go/issues/53579) (TL;DR exporting every symbol is unnecessary in most cases and wastes space in a majority of binaries). While it's hard to know exactly when the `--export-dynamic-symbol` flag was added to `ld`, its presence seems like the only plausible explanation for why this issue only occurs on a sufficiently new `ld` version.
Since `-rdynamic` is now not always being passed in the CGO build process, the change I ended up on was to modify the binding generation in `go-nvml` to [always pass the `--export-dynamic` linker flag](https://github.com/NVIDIA/go-nvml/pull/79). This doesn't break if the `-rdynamic` flag is passed, but ensures that we still have the required `ld` flag being passed in newer versions of Go.
Since `-rdynamic` is no longer always passed during the CGO build process, the change I landed on was to modify the binding generation in `go-nvml` to [always pass the `--export-dynamic` linker flag](https://github.com/NVIDIA/go-nvml/pull/79). This doesn't break anything if the `-rdynamic` flag is also passed, but it ensures that we still have the required `ld` flag in place on newer versions of Go and `ld`.
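As a rough sketch of the mechanism (this is not the actual `go-nvml` patch, and the package name below is hypothetical), a cgo package can forward the flag to `ld` itself through a `#cgo LDFLAGS` directive; anything in `-Wl,...` is handed by `gcc` straight to the linker:
```go
// Package dlsyms is a hypothetical illustration of forcing the dynamic
// symbol table to be populated even when the Go toolchain no longer
// passes -rdynamic on its own.
package dlsyms

/*
// -Wl,--export-dynamic tells gcc to pass --export-dynamic to ld, which keeps
// the binary's global symbols visible to libraries loaded with dlopen that
// need to call back into it.
#cgo LDFLAGS: -Wl,--export-dynamic

#include <stdlib.h>
*/
import "C"
```
With external linking in play, this should keep the needed symbols exported regardless of whether the Go linker decides to add `-rdynamic`.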
## Conclusion

@@ -10,8 +10,9 @@ If you pull up a software dev job posting and check the requirements, there is a
# My Bachelor's Degree
During my undergrad, I hated pretty much everything about school. I knew I loved Computer Science, and I was utterly committed to completing my degree, but I barely made it. The system really felt like it was a carefully designed torture chamber made just for me to pay thousands of dollars to suffer in.
I loved so many of the concepts and subjects I was learning in Computer Science and Math. However, particularly for Math, the shift in priorities coming to University were a shell shock. The goal of these classes didn't feel like learning anymore; they felt like a game to achieve the best mark. It was a game I sucked at. I don't think there's any words to effectively describe how bad I was at exams. I never properly learned how to cope with my intense distractibility and struggle to focus throughout school, and my memory for concepts I didn't deeply understand was incredibly fragile. My success in any courses, even Computer Science ones, hinged almost entirely on what percentage of the mark was derived from exams. Even worse were the courses that required you to pass the final exam to pass the course (this is the worst of the "torture chamber designed for me"). I failed 2 classes, both through final exams alone: Object-Oriented Programming (which I had done extensively even then, but croaked writing Java on paper), and Probability (which terminated my then-burgeoning interest in Data Science).
I wouldn't think so much about the good old torture chamber now, considering I'm years removed from receiving my degree, but I am constantly upset on reminder of what university could have been for me. My passion has shifted from just Software Development to deep Computer Science. My fiancée loves to poke fun at me for reading basically only CS textbooks, and my spare time is often spent learning about increasingly deep computer science. While I was in school, all I could feel was the stress-rage-hybrid of looming exams, and the intense desire to be free and get what I really wanted; a job as a software developer. I'm so grateful to have achieved that goal, but I still can't help but think about how different my life would have been if I hadn't had to go through something I was so bad at to get there.
I loved so many of the concepts and subjects I was learning in Computer Science and Math. However, particularly for Math, the shift in priorities coming to University was a shell shock. The goal of these classes didn't feel like learning anymore; they felt like a game to achieve the best mark. It was a game I sucked at. I don't think there are any words to effectively describe how bad I was at exams. I never properly learned how to cope with my intense distractibility and struggle to focus throughout school, and my memory for concepts I didn't deeply understand was incredibly fragile. My success in any courses, even Computer Science ones, hinged almost entirely on what percentage of the mark was derived from exams. Even worse were the courses that required you to pass the final exam to pass the course (this is the worst of the "torture chamber designed for me"). I failed 2 classes, both through final exams alone: Object-Oriented Programming (which I had done extensively even then, but choked writing Java on paper), and Probability (which terminated my then-burgeoning interest in Data Science).
I wouldn't think so much about the good old torture chamber now, considering I'm years removed from receiving my degree, but I am constantly upset when reminded of what university could have been for me. My passion has shifted from just Software Development to deep Computer Science. My wife loves to poke fun at me for reading basically only CS textbooks, and my spare time is often spent learning about increasingly deep computer science concepts. While I was in school, all I could feel was the stress-rage hybrid of looming exams and the intense desire to be free and get what I really wanted: a job as a software developer. I'm so grateful to have achieved that goal, but I still can't help but think about how different my life would have been if I hadn't had to go through something I was so bad at to get there.
Believe it or not, this article is not just for me to complain about how much I hated university and exams (although it was cathartic to write, and I'm leaving it in). Despite how much I hated it, I really can't blame University for being... University. The validity of the post-secondary system isn't really the dialogue I'm going for (at least today). The real point here is that University was simply not for me.

@@ -0,0 +1,82 @@
+++
title = "Talks"
menu = "main"
+++
This is a collection of public recorded talks I've done.
# Prepared Talks
## Deep Dive: How Fluent Bit Collects File Logs
https://www.youtube.com/watch?v=KrlvWBCGagI
This is my talk for Observability Day North America, a co-located event with KubeCon NA 2024. It was a lightning talk, but it ended up being a really dense talk and probably could have been full-sized. To compensate, I talked really fast!
T-shirt: Iron Maiden
## Tuning OTel Collector Performance Through Profiling
https://www.youtube.com/watch?v=qMxxjB4meXo
This was a talk for OpenTelemetry Community Day 2024. It goes through my experience profiling parts of the OpenTelemetry Collector to find performance improvements.
Retractions: One of the solutions I presented in this talk for getting the Parent Process ID on Windows had a flawed premise: it ignored the fact that the increase in WMI memory usage offset the gains made in the Collector. So it ended up not being that big of a win, and we're still working to find an alternate method for getting the Parent Process ID.
T-shirt: Brook from One Piece Wanted Poster
## How Much Overhead: How to Evaluate Observability Agent Performance
https://www.youtube.com/watch?v=BIaftvtFPHg
This is my talk for Observability Day 2023, a co-located event with KubeCon NA 2023. It was inspired by situations at work where people would ask things like "which agent has less overhead?" without fully qualifying their goals. I wanted to break the problem down into more actionable pieces.
T-shirt: Meshuggah Catch-33
## Learning To Fly: How to Find Bottlenecks in your Agents
https://www.youtube.com/watch?v=jf7t1CpoKlg&t=176s
This was a remote talk I did for the [Is It Observable](https://www.youtube.com/@isitobservable) YouTube channel (awesome channel, highly recommend subscribing). This was perhaps the hardest I have ever prepared for a talk, because it came with an in-depth [reproducible demo](https://github.com/braydonk/learning-to-fly-lightning-talk) that ran in a Dockerfile and included code to graph OpenTelemetry Metrics directly in the CLI. It was a lot of fun to prepare, and I think it's one of my best talks (if you can get around the fact that my mic sounded TERRIBLE).
T-shirt: Zoro from One Piece
Background friend: A plush of the character Acrid from my favourite video game, Risk of Rain 2
# Tutorials
## 5 Levels of Go Error Handling
https://www.youtube.com/watch?v=y5utZCeHys0&t=1s
This was my one attempt at "content creation". It's a relatively beginner-focused tutorial about Go error handling and how to do some more advanced things. The video was picked up by the algorithm this past summer and started getting a lot more attention. I'm not completely cutting myself off from making more videos in the future, but I did not have as much fun as I thought I would making this video, so I'm not sure if I'll make more. I think this tutorial is pretty good for what it is, though, and I'll keep it around anyway!
T-shirt: It's obscured!
Background friend: Acrid from Risk of Rain 2 again
# Interviews
## KubeCon NA 2024 with Is It Observable
https://www.youtube.com/watch?v=qf0OjAEzprs&t=365s
This was an interview with the [Is It Observable](https://www.youtube.com/@isitobservable) YouTube channel. I talked a bit about the talk I was giving the next day at Observability Day, as well as some general best practices for managing performance of agents collecting logs.
T-shirt: Coheed and Cambria, Vaxis II tour shirt
## Humans of OTel - KubeCon NA 2024
https://www.youtube.com/watch?v=TIMgKXCeiyQ
I was featured in the Humans of OTel series of interviews at KubeCon NA 2024, along with a lot of amazing peers from the OpenTelemetry Community!
T-shirt: Video filmed too high to tell!
## KubeCon NA 2023 with Is It Observable
https://www.youtube.com/watch?v=5arixRhAIbs&t=161s
An interview with [Is It Observable](https://www.youtube.com/@isitobservable) from KubeCon NA 2023. This was my first time doing something like this so I was definitely more nervous, but it was great practice!
T-shirt: Meshuggah Catch 33