TIL

I know some of these, but most are new to me:

  • ESC-A ESC-a executes the current line, but retains it in the buffer (accept-and-hold)
  • ESC-B ESC-b to move backward one word (backward-word)
  • ESC-C ESC-c to capitalize the current word (capitalize-word)
  • ESC-D ESC-d to kill/erase the current word (kill-word)
  • ESC-E ESC-e unused
  • ESC-F ESC-f to move forward one word (forward-word)
  • ESC-G ESC-g recalls the top line off the buffer stack (get-line)
  • ESC-H ESC-h runs man on the current command (run-help)
  • ESC-I ESC-i unused
  • ESC-J ESC-j unused
  • ESC-K ESC-k unused
  • ESC-L ESC-l to lowercase the current word (down-case-word)
  • ESC-M ESC-m unused
  • ESC-N ESC-n (history-search-forward)
  • ESC-O ESC-o unused
  • ESC-P ESC-p (history-search-backward)
  • ESC-Q ESC-q (and ^Q) clears the current buffer and pushes it onto the buffer stack (push-line)
    after executing a command, the old buffer will be loaded in the editing buffer
  • ESC-R ESC-r unused
  • ESC-S ESC-s ESC-$ fixes spelling in the current word (spell-word)
  • ESC-T ESC-t swap the cursor word with the one before it (transpose-words)
  • ESC-U ESC-u to uppercase the current word (up-case-word)
  • ESC-V ESC-v unused
  • ESC-W ESC-w copy the area from cursor to the mark to the kill buffer (copy-region-as-kill)
  • ESC-x prompts you to write a zle command to execute (execute-named-cmd)
  • ESC-y removes the just-yanked text and yanks the next entry in the kill ring (yank-pop)
  • ESC-Z ESC-z reruns the last function ran via execute-named-cmd (execute-last-named-cmd)
  • ESC-' to quote the current line (quote-line)
    puts a ' at the beginning and end of the current command, and escapes all existing ' characters within the command
  • ESC-" escape everything from the cursor to the mark (quote-region)
    similar to above, but instead of quoting the whole command, it goes from the cursor to the current mark (set via set-mark-command, ^@)
  • ESC-? runs which on the current command (which-command)

Source:

  • [1] `man zshall`
zsh

Just learned this little trick at NeovimConf 2024: instead of executing a command with :!, we can use :%! to fill the current buffer with the command's stdout. Note that this replaces the current buffer's contents; it won't append the output or similar.

Source:

By passing -g to alias, we can define a global alias, which is substituted anywhere on a line (regular command aliases are only substituted at the beginning of the line):

$ alias world='WORLD' # standard command alias
$ world     
zsh: command not found: WORLD # looks for the alias, usual behaviour
~/.dotfiles main [127] $ echo world
world # because world is not the first word, its alias is not triggered here.
$ alias -g world='WORLD' # make it a global alias now
~/.dotfiles main $ echo world
WORLD # boom! `world` is substituted despite not being the first word in the command!

Source:

zsh

A command in the form...

  • =(...) is replaced with the path of a temporary file containing its output. The shell deletes the file when the command has run.
  • <(...) is replaced by the path of a named pipe (FIFO) connected to its output.
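As a quick sketch of why this is handy (this example uses <(...), which also works in bash; =(...) is zsh-only): commands that expect file paths, like diff, can read directly from other commands' output.

```shell
# diff wants two file paths; <(...) expands to the path of a pipe
# carrying each printf's output, so no temporary files are needed
diff <(printf 'a\nb\n') <(printf 'a\nc\n') || echo "outputs differ"
```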

Source:

zsh

Spotted while browsing the sabre/xml repository: git has a built-in command, git archive, to archive a given repository with various options.

That by itself doesn't seem so interesting, however:

  • it can archive the repo from any tree/commit
  • it can retrieve archives of remote repositories (via --remote=<repo>)
  • it respects .gitattributes files, including export-ignore to exclude some files/paths from the export
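A hedged sketch of the basics, in a throwaway repo (assumes git and tar are available; all file names here are made up for the example):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo 'hello' > greeting.txt
git add greeting.txt
git -c user.email=t@example.com -c user.name=t commit -q -m 'add greeting'
# Archive HEAD into a tarball; any export-ignore'd paths would be skipped here
git archive --format=tar -o repo.tar HEAD
tar -tf repo.tar   # lists greeting.txt
```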

Source:

git

Imagine using a VPS to host multiple websites, all using docker.

As only one application can bind to the VPS's port 80 at any given time, one way to solve this is to use nginxproxy/nginx-proxy:

services:
  proxy:
    image: nginxproxy/nginx-proxy:....
    ports:
      - "80:80"
    [...]

  website1:
    environment: 
      - VIRTUAL_HOST=domain1.com
    [...]

  website2:
    environment: 
      - VIRTUAL_HOST=sub.domain2.com
    [...]

Drawbacks:

  • This compose file now has configurations for several independent websites that have nothing in common besides the proxy
  • mistakenly running docker compose down shuts down the whole cluster
  • similarly, running docker compose up can be expensive, as we need to launch the whole cluster
  • this file doesn't scale beyond a few (read: 3-4) websites

To address this, we can take advantage of Docker networking, in two steps:

  1. define a www network used by the proxy service:
# compose.yml just for proxy
services:
  proxy:
    image: nginxproxy/nginx-proxy:....
    ports:
      - "80:80"
    networks: # network(s) the service will be added to
      - www
    [...]

networks:
  www: # define new network here
    name: www # without this property, docker will name the network `<folder-name-hosting-this-compose-file>_www`

# no more services here, besides things like `nginxproxy/acme-companion`
  2. For each website, create its standalone compose file, where the website service joins the network defined by the proxy:
# compose.yml just for website1
services:
  website1:
    environment: 
      - VIRTUAL_HOST=domain1.com
    networks: # network(s) the service will be added to
      - www
    [...]

networks:
  www:
    external: true # use the external network called `www`

This gives each website its own standalone compose file, while each service can still be discovered by proxy, addressing all the drawbacks mentioned above.
As an added bonus, proxy can and will discover new services even after it has launched, meaning we don't need to worry about starting websites before the proxy or vice versa (although, if the proxy is not running, our websites won't be reachable).

Source: Networking in Compose

When you reference a directory in the form ~ref, the shell assumes that you want the directory displayed in this form; thus simply typing echo ~ref or cd ~ref causes the prompt to be shortened:

/usr/princeton/common/src/news/nntp/inews % inews=$PWD
/usr/princeton/common/src/news/nntp/inews % : ~inews
~inews % echo "wow!"
wow!
~inews % unset inews
/usr/princeton/common/src/news/nntp/inews %

This is so handy that the authors of An Introduction to the Z Shell made it a one-liner:

% namedir() { eval $1=\$PWD ; : ~$1 }

Source: An Introduction to the Z Shell

zsh

Like true, : does nothing and returns success; however, : is the more portable of the two, as : is a special POSIX built-in (read: required), whereas true is a regular built-in (read: optional).

Nowadays both : and true (from now on I'll just use :) are used in very clever ways:

  • Persisting environments (this works with : but not with true, and not in zsh):
$ unset x; ( x=hi :; echo "$x" )
hi
$ unset x; ( x=hi true; echo "$x" )
 # hi not printed
  • Command logging:
% set -x # makes the shell print each command before running it
% : Logging message here
% example_command
  • Cron job titles

This usage is similar to above, clever!

45 10 * * * : Backup for database ; /opt/backup.sh
  • Skipping blocks of code:
: << 'SKIP'

your code block here

SKIP

Reference: What is the purpose of the : (colon) GNU Bash builtin?

You can use ls pattern(<qualifiers>) to list files matching the specified qualifiers:

% ls *(@) # match only symbolic links
% ls *(x) # match all files executable by the owner, equivalent to ls *(*)
% ls *(X) # note the capital X: matches world-executable files
% ls *(R) # match all world-readable files
% ls *(W) # match all world-writable files
% ls *(W^@) # world-writable, excluding symbolic links
% ls *(U) # match all files owned by us
% ls *(.) # match plain files

And more!

Source: An Introduction to the Z Shell

zsh

unbuffer disables the output buffering that CLI tools apply when their output is redirected away from an interactive terminal.

A cool aspect of this is that it lets you see the actual output, including color codes, of any command:

$ ls --color=auto
folder1 folder2 file1 file2 # this output should have the colors based on your LS_COLORS and similar
$ unbuffer ls --color=auto > out
$ cat out
folder1 folder2 file1 file2 # still colored output
$ less out
^[[0m^[[01;34mfolder1^[[0m  ^[[01;34mfolder2^[[0m file1 file2 # raw output, including color codes!

Source:

Part of the kernel crate, a ScopeGuard is a wrapper around a closure which runs that closure when the ScopeGuard instance goes out of scope (i.e. is dropped).

It's useful, for example, when a function needs more complicated cleanup than just freeing memory or unlocking locks.
The user can call disarm on the ScopeGuard instance to prevent it from firing:

// Create an anonymous function to perform cleanup
let guard = ScopeGuard::new(|| {
  ... // Cleanup code
});

// --snip--

if some_condition() {
  return; // guard goes out of scope, runs cleanup code
}

// --snip--

// When the function exits normally, the guard can be disarmed
guard.disarm();
return;

It's similar to defer in other languages, with the addition of being able to skip the defer closure execution if needed.

There's also a standalone scopeguard crate which accomplishes a similar behavior.

Source:

Stumbled upon these by mistake in normal mode:

  • # moves the cursor to the previous occurrence of the currently hovered word
  • * does the same for the next occurrence

These are part of nvim's search commands; for more, see :h *, :h #, or :h search-commands.

These are as handy as [til58].

These are default shortcuts that let you move between window spaces in Darwin.

As a bonus, you can use one of the many utilities like yabai and skhd to add extra shortcuts like ⌃⌥← and ⌃⌥→ to also move the currently focused window along with you to the previous/next space.

…let's see if adding them here helps me remember them.

In Rust you don't have to define all struct fields explicitly, for example:

struct MyBox<T>(T);

impl<T> MyBox<T> {
    fn new(x: T) -> MyBox<T> {
        MyBox(x)
    }
}

MyBox contains an instance of generic type T, but how can I refer to it? We haven't declared an explicit property for it! The trick is to pay attention to that (T) in the struct definition, which essentially means we're working with a tuple. This kind of definition is known in Rust as a tuple struct.

In such cases, Rust allows 0-based indexing to access fields within the struct, just like with tuples, so the T instance is available as self.0.

A more traditional way to declare this struct, with an explicitly named field, would be:

struct MyBox<T> {
    x: T,
}

impl<T> MyBox<T> {
    fn new(x: T) -> MyBox<T> {
        MyBox{ x }
    }
}

Three tildes (~~~) is an alternative to three backticks (```) for (extended) markdown's fenced code blocks.

Bonus: depending on the editor, you can also use four-tilde/backtick delimiters if you need to show three tildes/backticks inside the code block.

This command lets you debug whether a certain file/path is ignored and, with the -v flag, it also shows you the matching exclude pattern (Docs).

Originally posted in OOZOU/til
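The entry above doesn't name the command, but it matches git check-ignore; a minimal sketch under that assumption (repo layout made up for the example):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
printf '*.log\n' > .gitignore
# -v also prints the source file and line of the pattern that matched
git check-ignore -v build/debug.log
```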

As they're value types, it never crossed my mind that they could go to the heap, but thinking about it, it just makes sense. The compiler rule is quite simple: all data stored on the stack must have a known, fixed size.

Therefore, data with an unknown size at compile time or a size that might change must be stored on the heap instead.

In the case of arrays, for example, on the stack we put:

  • A reference to the actual storage location (which will be on the heap)
  • The array capacity and count properties
  • Some metadata for the array type

This has been annoying me forever...my new maps for vim-floaterm:

local opts = { noremap = true, silent = true }
-- Show term when in normal mode
vim.keymap.set('n', '<C-t>', ':FloatermToggle<CR>', opts)
-- Dismiss term when in term mode
vim.keymap.set('t', '<C-t>', '<C-\\><C-n>:FloatermToggle<CR>', opts)
seq

Just learned about this tiny utility while reading the yes man page: seq lets you output sequences of numbers between two values, and you can customize things like the increment and the separator between values. (Note: counting down, as in the first example below, relies on BSD seq defaulting to a negative increment when the first value is larger; GNU seq needs an explicit seq 5 -1 1.)

$ seq 5 1
5
4
3
2
1
$ seq 1 0.5 4
1
1.5
2
2.5
3
3.5
4
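For example, the separator can be customized with -s:

```shell
# -s sets the separator between values (default: newline)
seq -s, 1 5   # prints 1,2,3,4,5
```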

Found it after installing rust via rustup-init:

$ rustup-init
[...]
Rust is installed now. Great!

To get started you may need to restart your current shell.
This would reload your PATH environment variable to include
Cargo's bin directory ($HOME/.cargo/bin).

To configure your current shell, you need to source
the corresponding env file under $HOME/.cargo.

This is usually done by running one of the following (note the leading DOT):
. "$HOME/.cargo/env"            # For sh/bash/zsh/ash/dash/pdksh
source "$HOME/.cargo/env.fish"  # For fish

Not only can we use . instead of source: . is actually the more portable of the two, as it's part of the POSIX standard (source is a bash/zsh extension).
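A minimal sketch of the dot command loading a file into the current shell (file contents made up for the example):

```shell
# Write a tiny env file, then source it with the POSIX dot command;
# the variable becomes visible in the current shell
tmp=$(mktemp)
echo 'GREETING=hello' > "$tmp"
. "$tmp"
echo "$GREETING"   # prints hello
```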

Reference
Bash docs

Discovered while reading through lapce's codebase: the MAKEFILE_LIST variable collects the paths of all makefiles parsed so far in the current make run (including the current one).

This is used by many people to autogenerate a help target in makefiles; here's the one used in lapce:

help: ## Print this help message
	@grep -E '^[a-zA-Z._-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'

Reference

In order to apply a Tailwind style to all elements without an explicit class=".." attribute, declare an extension to the base layer in tailwind's input file:

@tailwind base;
@tailwind components;
@tailwind utilities;

@layer base {
  a {
    @apply text-teal-500 underline font-medium;
  }
}

Reference
