Software Engineering Documentation Best Practice?
  • Does anyone have any good sources or suggestions on how I could look to try and begin to improve documentation within my team?

Documentation in software projects, more often than not, is a huge waste of time and resources.

    If you expect your docs to go too much into detail, they will quickly become obsolete and dissociated from the actual project. You will need to waste a lot of work keeping them in sync with the project, with little to no benefit at all.

    If you expect your docs to stick with high-level descriptions and overviews, they quickly lose relevance and become useless after you onboard to a project.

If you expect your docs to document use cases, you're doing it wrong. That's the job of automated test suites.

The hard truth is that the only people who think they benefit from documentation are junior devs just starting out in their careers. Their need for docs is a proxy for the challenges they face reading the source code and understanding how the technology is used and how things work and are expected to work. Once they get through onboarding, documentation quickly vanishes from their concerns.

Nowadays software is self-documenting through a combination of three tools: the software projects themselves, version control systems, and ticketing systems. A PR shows you what code changes were involved in implementing a feature or fixing a bug, the commit log touching a component tells you how that component can and does change, and tickets show you the motivation and context for the changes. Automated test suites track the conditions the software must meet and which the development team feels must be ensured for the software to work. The higher you are in the testing pyramid, the closer you are to documenting use cases.

    If you care about improving your team's ability to document their work, you focus on ticketing, commit etiquette, automated tests, and writing clean code.

  • Practical C++17: Loop Unrolling with Lambdas and Fold Expressions
    www.cppstories.com Practical C++17: Loop Unrolling with Lambdas and Fold Expressions

    In this blog post, we’ll delve into the unroll<N>() template function for template unrolling, understand its mechanics, and see how it can improve your code. We’ll look at lambdas, fold expressions, and integer sequences. Let’s get started! A little background   In a recent article Vector math libra...

    0
    Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    Large Text Compression Benchmark
    1
    The empire of C++ strikes back with Safe C++ blueprint
  • The only (arguably*) baseless claim in that quote is this part:

You do understand you're making that claim on the post discussing the proposal of Safe C++?

And to underline the absurdity of your claim, would you argue that it's impossible to write a "hello, world" program in C++ that's not memory-safe? From that point onward, what would it take to make it violate any memory constraints? Are those things avoidable? Think about it for a second.

  • The HTTP QUERY Method
  • Custom methods won't have the benefit of being dealt with as if they shared specific semantics, such as being treated as safe methods or idempotent, but ultimately that's just an expected trait that anyone can work with.

In the end, specifying a new standard HTTP method like QUERY extends some very specific assurances regarding semantics, such as whether frameworks should enforce CSRF tokens based on whether QUERY has the semantics of a safe method or not.
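As a sketch of what that buys framework authors (the helper names here are hypothetical, and this assumes QUERY is specified as a safe method, as the draft does): generic middleware can extend the usual safe-method list and decide CSRF enforcement without knowing anything else about the endpoint.

```cpp
#include <string>
#include <unordered_set>

// Hypothetical middleware helpers: with QUERY standardized as safe, a
// framework can treat it like GET or HEAD when deciding whether a
// request can change state and therefore needs CSRF protection.
bool is_safe_method(const std::string& method) {
    static const std::unordered_set<std::string> safe{
        "GET", "HEAD", "OPTIONS", "TRACE", "QUERY"};
    return safe.count(method) > 0;
}

bool requires_csrf_token(const std::string& method) {
    return !is_safe_method(method);  // only state-changing methods need protection
}
```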

  • The empire of C++ strikes back with Safe C++ blueprint
If you could reliably write memory safe code in C++, why do devs put memory safety issues into their code bases then?

That's a question you can ask the guys promoting the adoption of languages marketed on memory safety arguments. I mean, even Rust has its fair share of CVEs whose root cause is unsafe memory management.

  • The empire of C++ strikes back with Safe C++ blueprint
  • The problem with C++ is it still allows a lot of unsafe ways of working with memory that previous projects used and people still use now.

    Why do you think this is a problem? We have a tool that gives everyone the freedom to manage resources the way it suits their own needs. It even went as far as explicitly supporting garbage collectors right up to C++23. Some frameworks adopted and enforced their own memory management systems, such as Qt.

    Tell me, exactly why do you think this is a problem?

  • The empire of C++ strikes back with Safe C++ blueprint
  • From the article.

    Josh Aas, co-founder and executive director of the Internet Security Research Group (ISRG), which oversees a memory safety initiative called Prossimo, last year told The Register that while it's theoretically possible to write memory-safe C++, that's not happening in real-world scenarios because C++ was not designed from the ground up for memory safety.

That baseless claim doesn't pass the smell test. Just because a feature was not rolled out in the mid-90s, does that mean it's not available today? Utter nonsense.

If your paycheck is highly dependent on pushing a specific tool, of course you have a vested interest in diving head-first into a denial pool.

    But cargo cult mentality is here to stay.

  • Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    Dissecting the GZIP format (2011)
    0
    The HTTP QUERY Method
  • However, we’re still implementing IPv6, so how long until we could actually use this?

    We can already use custom verbs as we please: we only need to have clients and servers agree on a contract.

    What we don't have is the benefit of high-level "batteries included" web frameworks doing the work for us.

  • Using Conan as a CMake Dependency Provider
    dominikberner.ch Using Conan as a CMake Dependency Provider

    With the addition of dependency providers in CMake 3.24 using Conan to manage dependencies becomes easier and more integrated. This post shows a step-by-step guide on how to use Conan as a CMake dependency provider.

    0
    Always support compressed response in an API service
    ashishb.net Always support compressed response in an API service

    If you run any web service always enable support for serving compressed responses. It will save egress bandwidth costs for you. And, more importantly, for your users. Over time, the servers as well as client devices have become more powerful, so, compressing/decompressing data on the fly is cheap.

    0
    The HTTP QUERY Method
    www.ietf.org The HTTP QUERY Method

    This specification defines a new HTTP method, QUERY, as a safe, idempotent request method that can carry request content.

    6
    B-Trees: More Than I Thought I'd Want to Know
  • So that’s where I would say, as long as performance doesn’t matter it’s better to default to B-Tree maps than to hash maps, because the chance of avoiding bugs is more valuable than immeasurable performance benefits (...)

    I don't quite follow. What leads you to believe that a B-Tree map implementation would have a lower chance of having a bug when you can simply pick any standard and readily available hash map implementation?

Also, you fail to provide any concrete reasoning for B-tree maps. It's not performance on any of the dictionary operations, and bugs aren't it either. What's the selling point that you are seeing?

  • B-Trees: More Than I Thought I'd Want to Know
  • the reason I tend to recommend B-Tree maps over hash maps for ordinary programming is consistent iteration order.

Hash maps tend to be used to take advantage of constant-time lookup and insertion, not iteration. Hash maps aren't really suited for that use case.

Programming languages tend to provide two standard dictionary containers: a hash map implementation suited for lookups and insertions, and a tree-based map that supports ordering elements by key.

  • Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    B-Trees: More Than I Thought I'd Want to Know
    5
    Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    What is GZIP Compression and is it Lossless?
    bunny.net What is GZIP Compression and is it Lossless?

    GZIP Compression is an extremely popular technique of lossless compression for photos, videos, & web pages. It is used by a large number of websites.

    0
    RFC 7493: The I-JSON Message Format
  • Yeah, the quality on Lemmy is nowhere (...)

    Go ahead and contribute things that you find interesting instead of wasting your time whining about what others might like.

So far, all you're contributing is whiny shitposting. You can find plenty of that on Reddit too.

  • RFC 7493: The I-JSON Message Format
It’s from 2015, so it's probably what you are doing anyway

No, you are probably not using this at all. The problem with JSON is that these details are all handled in an implementation-defined way, and most implementations just fail or round silently.

Just give it a try: send a JSON payload with, say, a huge integer down the wire, and see if that triggers a parsing error. For starters, in .NET both Newtonsoft and System.Text.Json set a limit of 64 bits.

    https://learn.microsoft.com/en-us/dotnet/api/system.text.json.jsonserializeroptions.maxdepth

  • RFC 7493: The I-JSON Message Format
  • Why restrict to 54-bit signed integers?

Because number is a double, and IEEE 754 gives double-precision numbers a 53-bit significand, plus the sign.

    Meaning, it's the highest integer precision that a double-precision object can express.

    I suppose that makes sense for maximum compatibility, but feels gross if we’re already identifying value types.

It's not about compatibility. It's because JSON only has a number type, which covers both floating-point values and integers, and number is implemented as a double-precision value. If you have to express integers with a double-precision type, once you go beyond 53 bits you will start to lose precision, which goes completely against the notion of an integer.

  • Nagle's algorithm - Wikipedia
The only thing that TCP_NODELAY does is disable packet batching/merging through Nagle's algorithm. Supposedly that batching increases throughput by reducing the volume of redundant information required to send small data payloads in individual packets, at the cost of higher latency. It's a tradeoff between latency and throughput. I don't see any reason for transfer rates to improve with TCP_NODELAY; quite the opposite. In fact, the very few benchmarks I saw showed exactly that: TCP_NODELAY causing a drop in the transfer rate.

    There are also articles on the cargo cult behind TCP_NODELAY.

    But feel free to show your data.

  • Safe C++
  • It’s very hard for “Safe C++” to exist when integer overflow is UB.

    You could simply state you did not read the article and decided to comment out of ignorance.

    If you spent one minute skimming through the article, you would have stumbled upon the section on undefined behavior. Instead, you opted to post ignorant drivel.

  • Safe C++
  • I wouldn’t call bad readability a loaded gun really.

Bad readability is a problem caused by the developer, not the language. Anyone can crank out unreadable symbol soup in any language, if that's what they want or can deliver.

    Blaming the programming language for the programmer's incompetence is very telling, so telling there's even a saying: A bad workman always blames his tools.

  • Safe C++
  • Well, auto looks just like var in that regard.

It really isn't, neither in C# nor in Java. They are just syntactic sugar to avoid redundant type specifications, i.e. things like Foo foo = new Foo();. Who gets confused by that?

    Why do you think IDEs are able to tell which type a variable is?

Even C# goes a step further and allows developers to omit the constructor type with its target-typed new expressions. No one is whining about dynamic types just because the language lets you instantiate an object with Foo foo = new();.

  • Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    Nagle's algorithm - Wikipedia
    7
    RFC 7493: The I-JSON Message Format
    datatracker.ietf.org RFC 7493: The I-JSON Message Format

    I-JSON (short for "Internet JSON") is a restricted profile of JSON designed to maximize interoperability and increase confidence that software can process it successfully with predictable results.

    17
    Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    An introduction to Conflict-Free Replicated Data Types · Part 1: Preliminaries
    0
    Why Copilot is Making Programmers Worse at Programming
I think I could have stated my opinion better. I think LLMs' total value remains to be seen. They allow totally incompetent developers to occasionally pass as below-average developers.

    This is a baseless assertion from your end, and a purely personal one.

    My anecdotal evidence is that the best software engineers I know use these tools extensively to get rid of churn and drudge work, and they apply it anywhere and everywhere they can.

  • Data Structures and Algorithms @programming.dev lysdexic @programming.dev
    Conflict Resolution: Using Last-Write-Wins vs. CRDTs (2018)
    dzone.com Conflict Resolution: Using Last-Write-Wins vs. CRDTs - DZone

    Learn about two common techniques for resolving conflicts in your database: last-write-wins (LWW) and conflict-free replicated data types (CRDTs).

    0
    My Software Bookshelf
    olano.dev My Software Bookshelf

    The easiest way to make a to-read pile grow is to read a book from it.

    0
    ExpressJS v5.0.0 released
    github.com Release 5.0.0 · expressjs/express

    What's Changed 4.19.2 Staging by @wesleytodd in #5561 remove duplicate location test for data uri by @wesleytodd in #5562 feat: document beta releases expectations by @marco-ippolito in #5565 Cut ...

    2
    lysdexic @programming.dev
    Posts 402
    Comments 604