Linux might not be the biggest tool in my toolbox, but as a seasoned swinger, I find Makefiles add the missing spark to “make” things happen.
Yes, I’ll host the source code on GitHub. I could consider mirroring it on Sourcehut if there’s enough interest, but I prefer the PR and Issues workflow on GitHub for collaboration. Plus, more people tend to have GitHub accounts than GitLab or Sourcehut, which makes it easier for contributors.
I get the concern about Microsoft, and while I’m not a fan of the company, GitHub has advantages that are hard to beat, especially for community reach. As for OpenAI potentially using the code, personally I don’t mind if my own code gets used for AI training.
I’ll be using an MIT license, in case you're curious. Everyone is free to mirror it anywhere.
An existing FOSS time-tracking tool I like is Timewarrior (CLI)
It's Excalidraw (dark mode)
Python
Totally understand your perspective, and I’m not here to push back against it. You’ve got a valid point.
I’ll just add that there are already commercial tools that do similar things to what I’m building. It’s interesting to consider how perceptions might shift if a tool were released by a company rather than a solo developer. Sometimes the context influences how a tool is interpreted, even if the underlying functionality remains the same. For what it’s worth, I have no commercial intent behind this.
Exactly! My tool is designed to work with existing time-tracking tools by processing their output. You can think of it as a post-processor that helps clean up and format the data.
Since there are already plenty of time-tracking tools out there (both CLI and GUI), I wanted something that could act as a flexible add-on for them.
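To make that concrete, the rough flow would look something like this. The tool isn't released yet, so `timeclean` below is just a placeholder name for it; `timew export` is Timewarrior's real export command:

```bash
# Sketch of the intended pipeline (placeholder tool name "timeclean"):
# export raw intervals from an existing tracker, then post-process them
# into a cleaned-up report for a client or supervisor.
timew export :week > raw.json     # Timewarrior's JSON export for the current week
timeclean raw.json > report.json  # hypothetical invocation of the post-processor
```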
Hey, thanks for the comment. I get that it might be used for something shady, but that’s not the intention. The primary goal is to clean up raw time-tracking data into a format that’s easy to present to clients or supervisors, especially in contexts where small gaps or irregularities shouldn’t show up.
I imagine most professionals aren’t expected to account for every single minute of their workday, for example when switching tasks or taking short breaks. It’s more about reporting general productivity or the overall progression of tasks, not trying to inflate hours.
Anyone aiming for 'time fraud' could probably find easier methods. My focus is to make life easier for people who already track their work but want cleaner, more digestible reports.
Appreciate the feedback though, helps me make sure the use case is clear! :)
It's almost done (it would take one or two weeks to clean it up for FOSS release). It's a CLI tool. It works great for my use case, but I'm wondering if there's any interest in a tool like this.
Say you have a simple time-tracking tool that tracks what you do daily. The only problem is that there are gaps and whatnot, which might not look nice if you need to send it to someone else. This tool fixes pretty much all of that.
The main format is JSON with a "description" and either a "duration" or a "start"/"end" pair. It supports the Timewarrior format (a CLI time-tracking tool) out of the box.
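To give an idea of the shape of the data, an input file in the main format could look something like this. The field names come from the description above, but the timestamp and duration values shown are just my illustration, not the final spec:

```bash
# Illustrative input in the main format: each entry has a "description"
# plus either a "duration" or a "start"/"end" pair.
# The value formats here are assumptions, not the final spec.
cat > entries.json << 'EOF'
[
  { "description": "Fix login bug", "duration": "01:30:00" },
  { "description": "Team sync", "start": "2024-01-15T09:00:00Z", "end": "2024-01-15T09:45:00Z" }
]
EOF
```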
Whether self-hosted or in the cloud, I assume many of you keep a server around for personal things. And I'm curious about the cool stuff you've got running on your personal servers.
What services do you host? Any unique stuff? Do you interact with it through SSH, Termux, a web interface?
In a revelation that has sent shockwaves through the tech and math communities, Microsoft has announced plans to copyright the constant pi…
And what's your workflow when working with lots of files in projects with fish?
Hey,
As an avid CLI user, I always aimed to master non-interactive tools to perform most of my work, given that they are easy to use, create, extend, and connect.
However, I found myself dealing with software projects with many files (mostly under the yoke of corporate oppression; an ordeal which I endure to sustain myself, as most of those reading me do, and therefore I will not go further into this topic) and started to hit the limits of non-interactive tools to find and edit files. Indeed, I could go faster if I followed the temptation of monstrous IDEs, as I did in my innocent past.
I did not despair, as naturally I heard of the usefulness of interactive fuzzy finders such as fzf. After spending an afternoon evaluating the tool, I concluded that it indeed increases the complexity of my workflow. Still, this complexity is managed in a sensible way that follows the UNIX tradition.
I now ask you two general questions:
- Did you reach similar conclusions and decide to use interactive fuzzy finders to handle working on software projects with many files?
- If you use fzf or similar tools, what can you tell me about your workflow? Any other third-party tools? Do you integrate it into your scripts? Any advice that you can give me out of a long time of experience using the tool that is not easily conveyed by the documentation?
I also ask this very specific question:
- The one part of fzf which I found missing was a way to interact with the results of grep, and to automatically place the selected file(s) in the prompt or an editor. For that, I created the following two commands. Do you have a similar workflow when you want to bring the speed of fuzzy finding to grep?
```bash
#!/usr/bin/env bash

# gf: grep + fzf
# basically a wrapper for 'grep <ARGS> | fzf | cut -f 1 -d:'

# print usage on -h/--help
if [[ "$1" == "-h" || "$1" == "--help" ]]; then
  echo "Usage: gf <grep-args>"
  echo
  echo "~~~ that feel when no 'gf' ~~~"
  echo
  echo "- Basically a wrapper for 'grep <ARGS> | fzf | cut -f 1 -d:'"
  echo "- Opens fzf with grep results, and prints the selected filename(s)"
  echo "- Note: As this is meant to search files, it already adds the -r flag"
  echo
  echo "Example:"
  echo " $ nvim \`gf foobar\`"
  echo " $ gf foobar | xargs nvim"
  exit 0
fi

# run grep with arguments, pipe to fzf, and print the filename(s) selected
custom_grep () { grep -E --color=always --binary-files=without-match --recursive "$@"; }
remove_color () { sed -E 's/\x1b\[[0-9;]*[mK]//g'; }
custom_fzf () { fzf --ansi --height ~98%; }

grep_output=$(custom_grep "$@")
if [[ "$?" -ne 0 ]]; then
  exit 1
else
  echo "$grep_output" | custom_fzf | remove_color | cut -f 1 -d:
fi
```
```bash
#!/usr/bin/env bash

# ge: grep + fzf + editor
# basically a wrapper for 'grep <ARGS> | fzf | cut -f 1 -d: | $EDITOR'

# print usage on -h/--help
if [[ "$1" == "-h" || "$1" == "--help" ]]; then
  echo "Usage: ge <grep-args>"
  echo
  echo "- Basically a wrapper for 'grep <ARGS> | fzf | cut -f 1 -d: | \$EDITOR'"
  echo "- Opens fzf with grep results, and edits the selected file(s)"
  echo "- Note: As this is meant to search files, it already adds the -r flag"
  echo "- Note: Internally, it uses the 'gf' command"
  echo
  echo "Example:"
  echo " $ ge foobar"
  exit 0
fi

# takes output from 'gf' and opens it in $EDITOR
grep_fzf_output=$(gf "$@")
if [[ -n "$grep_fzf_output" ]]; then
  $EDITOR "$grep_fzf_output"
fi
```
Have a wonderful day, you CLI cowboys.
Unlike a password manager that just logs you in, Beachpatrol can run any automation task, like checking your email, downloading files, or filling out forms. You have to create Playwright scripts for these tasks and run them from a shell command. There is an example script already in the commands folder, which you can run with the command beachmsg smoke-test. The sky is the limit, basically.
Cool project! I'll check it out.
Regarding userscripting, from the F.A.Q.:
Why use an external automation tool (Playwright) instead of a browser extension?
While Beachpatrol allows you to control the browser both from the OS and from a browser extension, our priority was the OS. Therefore, something like Playwright was the natural choice.
Furthermore, while controlling the browser from an extension is possible, Manifest v3 removed the ability to execute third-party strings of code. Popular automation extensions like Greasemonkey and Tampermonkey could also be affected by Manifest v3. The alternative is to embed the code into the extension, but that would require re-bundling the extension after every change. Other tricks do exist to make this approach work, and there is some hope for future Manifest v3 solutions, but this path is certainly tricky.
It is more likely that Selenium and related tools will continue to work in the foreseeable future given the business demand for traditional browser testing.
A CLI tool to replace and automate your everyday web browser. - sebastiancarlos/beachpatrol
SWABAI (Wrapper for the Sway/i3/Yabai tiling window managers) - sebastiancarlos/swabai
Makes sense and you're probably right, but I'll tell you why I didn't do it that way:
- I just did what came first to me
- I like the idea of the API defining the project structure
- When adding a new package manager, if that ever happens, I would like to see all other implementations of the same functionality in the same file, for help and inspiration
Tbh these scripts are for my personal use, written in the way that makes sense for me. I only open sourced it as a joke and as an example of how reinventing your own wheel is sometimes not that hard, and comes with the benefit of doing just what you need it to do.
Actually I was thinking of adding a sysget fallback, as I might need to do some debian/fedora hacking soon.
PM-JESUS: "Your own, package-manager, Jesus" 🎶 (Package Manager front-end) - sebastiancarlos/pm-jesus
YAS-QWIN (Yet Another SQL-Query Writing Interface) - sebastiancarlos/yas-qwin
Will do, bossman
It should be pretty soon. I've got it working already, but I need to test it more and figure out how Firefox profiles work with Playwright.
If you want, you can just clone it and replace "chromium" with "firefox". It should just work, and it shouldn't take too long to figure out the rest.
"Currently only Chromium is supported. Other Chromium-based browsers and Firefox support to be added soon."
A CLI tool meant to replace and automate your everyday web browser. - sebastiancarlos/beachpatrol
In a stunning turn of events, Microsoft has developed a device that experts are calling the “death ray.”
Sway-Talisman: Terminal Application Launcher in Scratchpad, Minimalist And Native - sebastiancarlos/sway-talisman
Hey, I really appreciate that. I'm glad you find it useful.
I got you, boss man. Enjoy the raw speed
Sway-MÜSLI: Sway - Minimal Ültrafast Status LIne - sebastiancarlos/sway-musli
Point taken, but Big Tech systematically does equally bad things while disguising them behind DevRel, so I think it's justified to poke fun at that.
Just curious, what would be a correct translation?
The joke is that it's hard to tell if this is a joke because the lines between good intentions, corporate jargon, and feasibility have been blurred beyond recognition both here and in the real world.
It's also funny that after all these years, i18n is still a mess. Moreover, even though translations are standard in GUIs and documentation, for some reason everyone is okay with defaulting to English for the oldest form of computer interaction.
Also, the joke is whatever you want it to be. Follow your dreams.
```bash
#!/usr/bin/env bash

# runasm - Assemble, link, and run multiple assembly files, then delete them.

if [[ $# -eq 0 ]]; then
  echo "Usage: runasm <file.s> [<file.s> ...]"
  echo " - Assemble, link, and run multiple assembly files, then delete them."
  echo " - Name of executable is the name of the first file without extension."
  exit 1
fi

object_files=()
executable_file=${1%.*}

for assembly_file in "$@"; do # Avengers, assemble!
  object_file="${assembly_file%.*}.o"
  as "${assembly_file}" -o "${object_file}"
  if [[ $? -ne 0 ]]; then
    exit 1
  fi
  object_files+=("${object_file}")
done

# Link
ld "${object_files[@]}" -o "${executable_file}"
if [[ $? -ne 0 ]]; then
  exit 1
fi

# Run, remove created files, and return exit code
./"${executable_file}"
exit_code=$?
rm "${object_files[@]}" "${executable_file}" > /dev/null 2>&1
exit "${exit_code}"
```