May 23, 2024
This is for me, so I am not going to dance around the instructions here.
Docker Compose
services:
  nginx:
    container_name: nginx
    tty: true
    restart: always
    environment:
      TZ: Australia/Hobart
    image: uozi/nginx-ui
    volumes:
      - ./data/certbot/conf:/etc/letsencrypt
      - ./data/certbot/www:/var/www/certbot
      - /etc/ssl/certs/:/etc/ssl/certs/
      - /etc/ssl/private/:/etc/ssl/private/
      - ./nginx:/etc/nginx
      - ./nginx-ui:/etc/nginx-ui
    ports:
      - "80:80"
      - "443:443"
Nginx Template
map $http_upgrade $connection_upgrade {
    default upgrade;
    '' close;
}

server {
    listen 443 ssl;
    server_name pihole.local;

    ssl_certificate /etc/ssl/certs/selfsigned.crt;
    ssl_certificate_key /etc/ssl/private/selfsigned.key;

    location / {
        proxy_pass http://pihole:80;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

server {
    listen 80;
    server_name pihole.local;
    return 301 https://$host$request_uri;
}
Now that we have that crap out of the way, make sure it's launched and working. Everything will be unsecured at this point; we'll fix that once it's up.
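To launch and keep an eye on it, assuming the compose file sits in the current directory (older installs use docker-compose rather than docker compose):
docker compose up -d
docker compose ps
docker compose logs -f nginx
Once it's up, on to the certs.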
sudo openssl genrsa -out /etc/ssl/private/rootCA.key 2048
sudo openssl req -x509 -new -nodes -key /etc/ssl/private/rootCA.key -sha256 -days 1024 -out /etc/ssl/certs/rootCA.crt -subj "/C=US/ST=California/L=San Francisco/O=MyOrganization/OU=IT Department/CN=MyRootCA"
sudo nano /etc/ssl/private/openssl-san.cnf
The contents for the new conf file:
[ req ]
distinguished_name = req_distinguished_name
req_extensions = v3_req
prompt = no
[ req_distinguished_name ]
CN = your_primary_domain.local
[ v3_req ]
keyUsage = digitalSignature, keyEncipherment
extendedKeyUsage = serverAuth
subjectAltName = @alt_names
[ alt_names ]
DNS.1 = your_primary_domain.local
DNS.2 = pihole.local
DNS.3 = anotherdomain.local
The primary domain can be the one hosting nginx, so nginx.local.
You cannot alter this later, so try to anticipate all your domains. It's not the end of the world if you can't; just run these instructions again.
sudo openssl genrsa -out /etc/ssl/private/selfsigned.key 2048
sudo openssl req -new -key /etc/ssl/private/selfsigned.key -out /etc/ssl/private/selfsigned.csr -config /etc/ssl/private/openssl-san.cnf
sudo openssl x509 -req -in /etc/ssl/private/selfsigned.csr -CA /etc/ssl/certs/rootCA.crt -CAkey /etc/ssl/private/rootCA.key -CAcreateserial -out /etc/ssl/certs/selfsigned.crt -days 500 -sha256 -extfile /etc/ssl/private/openssl-san.cnf -extensions v3_req
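A quick sanity check that the SANs actually made it into the issued cert:
sudo openssl x509 -in /etc/ssl/certs/selfsigned.crt -noout -text | grep -A1 "Subject Alternative Name"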
That's pretty much it on the server end. You just need to add a new nginx config for each service, like the example below.
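For instance, a config for a hypothetical jellyfin.local service is a near copy of the pihole one; the server_name has to be listed in [ alt_names ], and the upstream container name and port here are placeholders:
server {
    listen 443 ssl;
    server_name jellyfin.local;
    ssl_certificate /etc/ssl/certs/selfsigned.crt;
    ssl_certificate_key /etc/ssl/private/selfsigned.key;
    location / {
        proxy_pass http://jellyfin:8096;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
Plus a matching port 80 redirect block with the new server_name.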
Now, I can't yet speak for Chrome, but to get Firefox to stop being weird:
- Download rootCA.crt.
- Go into Settings > Privacy & Security > View Certificates... > Authorities > Import.
- Import the file and tick the "Trust this CA to identify websites" box.
That should make Firefox play nice with all the domains in the cert.
That's it! Get on with it.
Apr 17, 2024
I'm not going to mess you around here: this was oddly painful, full of conflicting information, and, given that I know nothing about the implementation details of SSH, quite irritating.
Follow these steps to enable two-factor authentication (2FA) over SSH using a public key and Google Authenticator on Ubuntu 22.04.4 LTS:
- Update your package lists:
sudo apt-get update
- Install the Google Authenticator package:
sudo apt-get install libpam-google-authenticator
- Set up Google Authenticator by running it as the user you will log in with:
google-authenticator
- Answer the prompts as follows:
  - Do you want authentication tokens to be time-based (y/n)? y
  - Do you want me to update your "~/.google_authenticator" file (y/n)? y
  - Do you want to disallow multiple uses of the same authentication token? This restricts you to one login about every 30s, but it increases your chances to notice or even prevent man-in-the-middle attacks (y/n)? y
  - By default, tokens are good for 30 seconds and in order to compensate for possible time-skew between the client and the server, we allow an extra token before and after the current time. If you experience problems with poor time synchronization, you can increase the window from its default size of 1:30min to about 4min. Do you want to do so (y/n)? n
  - If the computer that you are logging into isn't hardened against brute-force login attempts, you can enable rate-limiting for the authentication module. Do you want to enable rate-limiting (y/n)? y
- Scan the QR code into your authenticator app.
- Edit the PAM SSHD configuration:
sudo nano /etc/pam.d/sshd
- Add these lines at the bottom of the file:
auth required pam_google_authenticator.so nullok
auth required pam_permit.so
- The nullok option allows users to log in without 2FA until they configure their OATH-TOTP token. Remove it once all users are set up.
Configure SSH for challenge-response authentication:
sudo nano /etc/ssh/sshd_config
- Set ChallengeResponseAuthentication to yes (on 22.04 this is a deprecated alias for KbdInteractiveAuthentication; more on that below). Update it if present, uncomment it, or add the line.
Restart the SSH service to apply changes:
sudo systemctl restart sshd.service
Test your configuration in a separate terminal window. If you already use a public key, there should be no noticeable change.
Update SSHD to require 2FA:
sudo nano /etc/ssh/sshd_config
- Add or update the following line to require a public key plus either a password or keyboard-interactive authentication:
AuthenticationMethods publickey,password publickey,keyboard-interactive
Enable keyboard-interactive authentication. On 22.04 LTS that means:
KbdInteractiveAuthentication yes
On older versions of Linux it is:
ChallengeResponseAuthentication yes
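For reference, once all of the above is applied the relevant part of /etc/ssh/sshd_config should look roughly like this (a sketch; anything not shown stays at the Ubuntu defaults, and UsePAM yes is already the default there):
UsePAM yes
PubkeyAuthentication yes
# 22.04 LTS name; older releases use ChallengeResponseAuthentication yes
KbdInteractiveAuthentication yes
# Require a public key, then either a password or a verification code
AuthenticationMethods publickey,password publickey,keyboard-interactive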
Further secure PAM by editing its SSHD file:
sudo nano /etc/pam.d/sshd
- Comment out this line to prevent fallback to password authentication.
#@include common-auth
Restart the SSH service once more to finalize all settings:
sudo systemctl restart sshd.service
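Before closing your existing session, it's worth dumping the effective configuration to confirm the settings took; sshd -T prints every option in its canonical lower-case form:
sudo sshd -T | grep -E 'authenticationmethods|kbdinteractiveauthentication|usepam'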
These steps will enable you to securely access your server using two-factor authentication while maintaining the flexibility of public key authentication.
Feb 19, 2024
In the early 1990s, a groundbreaking video game called Doom not only redefined the landscape of gaming but also sparked a vision for the future of technology. While Doom itself did not utilize GPUs — since the technology was still in its infancy — it set a precedent that better graphics could lead to more immersive experiences and, consequently, greater sales. This realization fueled a desire among gamers and developers alike for more visually complex games, setting the stage for successors like Quake to fully harness the power of GPU hardware.
This push for enhanced graphics in gaming indirectly catalyzed the development of GPUs, which are now instrumental in powering the algorithms behind modern artificial intelligence (AI) and large language models (LLMs). The technological advancements spurred by the gaming industry's quest for better graphics have thus laid the groundwork for the computational capabilities essential to today's AI research and applications.
However, as we embark on this journey into AI's potential, it is crucial to proceed with both optimism and caution. While AI promises to revolutionize our approach to global challenges such as climate change and healthcare, it also raises ethical concerns and safety issues. The complexity and potential biases within AI systems, as well as their impact on society, necessitate a balanced and informed approach to their development and deployment.
Despite these challenges, the story of Doom and its indirect contribution to the rise of GPUs and AI highlights an essential truth about innovation: it often occurs in unpredictable ways. Just as the gaming industry's demand for better graphics unexpectedly contributed to the AI revolution, today's AI technologies could provide us with unprecedented tools to address the pressing issues of our time.
The implications of technological cross-pollination are vast. LLMs and AI applications are already making significant strides in fields such as materials science and scientific research, showcasing the transformative potential of AI.
In light of Doom's legacy, the message is clear: while embracing the possibilities AI offers, we must navigate this new terrain with care and responsibility. Demystifying AI and promoting a culture of ethical innovation can help us leverage these technologies to their fullest potential without falling prey to their risks.
Doom's enduring impact extends beyond the realm of gaming, indirectly influencing the development of technologies that underpin the current AI landscape. As we explore AI's role in shaping our future, our decisions today will determine whether we can harness this potential to overcome our greatest challenges or whether we will be overwhelmed by the very technologies we hoped would save us. The creativity and exploratory spirit that Doom inspired could very well be the key to our collective salvation.