A colleague wanted to learn fuzzing and picked SNMP as the protocol to target. The plan was to fuzz the oldest version of Net-SNMP we could find, so they would be more likely to find a crash. We checked out a release from 2000, but couldn't get the configure script to work, so we jumped to Net-SNMP v5.6.1, which is still a very old version from 2011. It didn't compile immediately, but I felt it could be coaxed into compiling.
I managed to get this version to compile, but it wasn't straightforward. The first thing we need to do is build a version of OpenSSL from around 2011, as the API has changed enough since then to break the build, and it was also necessary to disable some Net-SNMP features we didn't need. I should note that although I got this to compile and I can run "snmpget -v", I have not tested this build in any meaningful way, so it may not work.
The first thing to do is set a variable to hold the path where we will install everything; this is used as the --prefix argument passed to configure.
PREFIX_DIR=$HOME/snmp/_output
OpenSSL
Let's grab a copy of the OpenSSL source code and check out a version from around 2011.
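Something along these lines should do it; the exact tag is an assumption on my part, I've used OpenSSL_1_0_0e here as an example of a 2011-era release:

git clone https://github.com/openssl/openssl.git
cd openssl
git checkout OpenSSL_1_0_0e   # any 1.0.0-era tag from 2011 should do
./config --prefix=${PREFIX_DIR} shared
make
make install
cd ..

Note the lack of -j here; these old OpenSSL builds don't cope well with parallel make.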
Net-SNMP
This time we have a more complicated configure setup. The first thing to note is that we use "--with-openssl" to point configure at the OpenSSL we just built. The next thing to note is that we disable the Perl and Python modules, because they will not build against the current versions of those packages either.
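An invocation along these lines should work; the --with-sys-* values and the log/persistent paths are placeholders to keep configure from prompting interactively, and the exact set of --without flags depends on what you need:

# the answers below are placeholders that avoid configure's interactive prompts
./configure --prefix=${PREFIX_DIR} \
    --with-openssl=${PREFIX_DIR} \
    --disable-embedded-perl \
    --without-perl-modules \
    --without-python-modules \
    --with-default-snmp-version=2 \
    --with-sys-contact=root@localhost \
    --with-sys-location=unknown \
    --with-logfile=/var/log/snmpd.log \
    --with-persistent-directory=/var/net-snmp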
Next we can compile using make; this time we can use a parallel build. We set LDFLAGS here to force the linker to choose the OpenSSL we built rather than the system version.
make -j$(nproc) LDFLAGS="-L${PREFIX_DIR}/lib"
make install
I came across this TikTok video recently, in which the presenter shows how to make an "Object Oriented" Hello World program. It made me laugh so much that I thought I would try my own hello world program.
This is my solution, which uses the Python ast module to inspect the script looking for functions to use in the hello world message.
#!/usr/bin/env python3
import ast
def Hello():
    ...

def World():
    ...

class Visitor(ast.NodeVisitor):
    def __init__(self):
        self.functions: list[str] = []

    def visit_FunctionDef(self, node: ast.AST):
        if '_' not in node.name and node.name[0].isupper():
            self.functions.append(node.name)
        self.generic_visit(node)

def _say_hi():
    with open(__file__, 'r') as fd:
        code = fd.read()
    node = ast.parse(code)
    visitor = Visitor()
    visitor.visit(node)
    print(' '.join(visitor.functions))

if __name__ == '__main__':
    _say_hi()
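If you save this as hello.py (the filename is arbitrary) and run it, it parses its own source and prints the two qualifying function names in source order:

$ python3 hello.py
Hello World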
GNU Emacs is a free and open-source text editor known for its extensibility and the ability to customize almost every aspect of its functionality through Emacs Lisp code. I first started using it many years ago, around 2007 I think. After a hiatus of a few years, I have got back into using it as my daily driver.
There's a new major release of GNU Emacs in development, and everyone is raving about it. It has a lot of interesting new features, such as Eglot and tree-sitter support. In this blog post I will download the source code from Git and compile it to try out the new features.
If you want to follow this blog post and get similar results, I compiled the following Git commit: 63cdbd986bb8f841717e2d813df6f75b6b02cf8b. You can check out this commit with Git, but that is optional; you can just download the Emacs 29 release when it comes out and it should work the same. Just skip the "autogen" step, as the release tarball should already include the "configure" script.
We need to ensure we have all the dependencies needed to compile Emacs. I'm using Ubuntu 22.04 and installed the following packages using APT.
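Something like the following covers the features enabled in the configure summary further down; treat it as a starting point, as configure will tell you if anything is still missing:

# reconstructed package list - adjust to the features you want
sudo apt install build-essential autoconf texinfo libgtk-3-dev \
    libgnutls28-dev libxml2-dev libjansson-dev libsqlite3-dev \
    libgccjit-11-dev libtree-sitter-dev libgif-dev libtiff-dev \
    libjpeg-dev libpng-dev librsvg2-dev liblcms2-dev \
    libacl1-dev libsystemd-dev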
GNU Emacs builds from source using GNU Autotools. If you don't know it, GNU Autotools is a set of tools used to build, install, and manage software packages on Unix-like systems. It consists of three main components: Autoconf, Automake, and Libtool. Autoconf creates portable configure scripts that set up a package's build system, Automake generates the Makefiles used to build the package, and Libtool creates portable libraries that can be used across environments. Together these tools automate building and installing software, making it easier to compile and install a package on a wide range of systems.
As I'm compiling source pulled from Git, there is no configure script yet, so the next step is to run autogen.sh to create it.
$ ./autogen.sh
Next we run the configure script. This checks the build environment to ensure all the necessary dependencies are present, and then creates a Makefile we can run. I have used "--prefix" here to control where Emacs gets installed.
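The prefix and feature flags here are assumptions on my part; the prefix matches the ~/.local/bin/emacs path used to launch it later:

# --prefix and the feature flags below are assumptions, adjust to taste
$ ./configure --prefix=$HOME/.local --with-native-compilation --with-tree-sitter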
If the build environment is OK (the compiler is there, all the dependencies are there, etc.), then this is the summary of features you should see. If something is wrong, you will get errors from configure.
Configured for 'x86_64-pc-linux-gnu'.
Where should the build process find the source code? .
What compiler should emacs be built with? gcc -g3 -O2
Should Emacs use the GNU version of malloc? no
(The GNU allocators don't work with this system configuration.)
Should Emacs use a relocating allocator for buffers? no
Should Emacs use mmap(2) for buffer allocation? no
What window system should Emacs use? x11
What toolkit should Emacs use? GTK3
Where do we find X Windows header files? Standard dirs
Where do we find X Windows libraries? Standard dirs
Does Emacs use -lXaw3d? no
Does Emacs use -lXpm? yes
Does Emacs use -ljpeg? yes
Does Emacs use -ltiff? yes
Does Emacs use a gif library? yes -lgif
Does Emacs use a png library? yes -lpng16 -lz
Does Emacs use -lrsvg-2? yes
Does Emacs use -lwebp? no
Does Emacs use -lsqlite3? yes
Does Emacs use cairo? yes
Does Emacs use -llcms2? yes
Does Emacs use imagemagick? no
Does Emacs use native APIs for images? no
Does Emacs support sound? yes
Does Emacs use -lgpm? no
Does Emacs use -ldbus? yes
Does Emacs use -lgconf? no
Does Emacs use GSettings? yes
Does Emacs use a file notification library? yes -lglibc (inotify)
Does Emacs use access control lists? yes -lacl
Does Emacs use -lselinux? yes
Does Emacs use -lgnutls? yes
Does Emacs use -lxml2? yes
Does Emacs use -lfreetype? yes
Does Emacs use HarfBuzz? yes
Does Emacs use -lm17n-flt? no
Does Emacs use -lotf? no
Does Emacs use -lxft? no
Does Emacs use -lsystemd? yes
Does Emacs use -ljansson? yes
Does Emacs use -ltree-sitter? yes
Does Emacs use the GMP library? yes
Does Emacs directly use zlib? yes
Does Emacs have dynamic modules support? yes
Does Emacs use toolkit scroll bars? yes
Does Emacs support Xwidgets? no
Does Emacs have threading support in lisp? yes
Does Emacs support the portable dumper? yes
Does Emacs support legacy unexec dumping? no
Which dumping strategy does Emacs use? pdumper
Does Emacs have native lisp compiler? yes
Does Emacs use version 2 of the X Input Extension? yes
Does Emacs generate a smaller-size Japanese dictionary? no
Now the build environment is ready to build Emacs. Let’s run make to do the build. We pass “-j” to parallelise the build if multiple cores are available.
$ make -j$(nproc)
This step takes a long time, be patient. Once it’s finished we can run “make install” to install Emacs.
$ make install
Awesome, now we have GNU Emacs compiled and installed, so let's try it out.
$ ~/.local/bin/emacs --init-directory=/tmp
This was our first opportunity to try out a new feature, the "--init-directory" argument. It controls where Emacs looks for ".emacs.d"; by setting it to "/tmp" we prevent Emacs 29 from loading my existing configuration.
TLDR: use utf8mb4 as the character set for tables because utf8 is broken in MySQL.
Recently I was attempting to load the Unihan character database into a MySQL database using Django, but I found that I was getting encoding errors. To cut a long story short, it turns out that in MySQL the character set called utf8 is not actually UTF-8!
The long version of the story is that when creating the database I had used the default "utf8" encoding, thinking that this would enable full use of Unicode. Unfortunately this is not the case: in MySQL, "utf8" is an alias for "utf8mb3", which stores at most three bytes per character, so any character that needs four bytes in UTF-8 (such as emoji and the rarer CJK characters) cannot be stored.
The solution to this problem is to use the “utf8mb4” encoding instead.
CREATE DATABASE blog CHARACTER SET utf8mb4;
But this is not enough; you also need to tell Django to use utf8mb4 when connecting to MySQL. To do this, add the following to the database OPTIONS in your Django settings:
'OPTIONS': {'charset': 'utf8mb4'},
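In context, the DATABASES entry in settings.py ends up looking something like this (the user, password, and host values are placeholders):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'blog',
        'USER': 'blog_user',        # placeholder
        'PASSWORD': 'change-me',    # placeholder
        'HOST': 'localhost',
        'OPTIONS': {'charset': 'utf8mb4'},
    }
}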
One more problem came up. I had set the "hanzi" field to be unique, but part way through loading the data the script returned a "duplicate entry" error for the hanzi field (this was for the 𠀁 character). This is down to the collation setting, which defines the rules MySQL uses when comparing characters; with the default case-insensitive collations, some distinct characters compare as equal.
The collation I needed is utf8mb4_bin, which compares characters by their raw bytes, so only identical characters count as duplicates.
I did not want to change the collation setting for the whole database, as this could break other things. So I decided to just change that column. This means I needed to create a custom migration in Django. The first step is to create an empty migration.
python3 manage.py makemigrations --empty zhongwen
Then add the following code to the list of operations to run for that migration.
migrations.RunSQL(
    'ALTER TABLE `zhongwen_hanzi` CHANGE `hanzi` `hanzi` VARCHAR(1) CHARACTER SET utf8mb4 COLLATE utf8mb4_bin NOT NULL;'
)
Then we can run the migration, and it will change the hanzi field to use utf8mb4_bin for the collation.
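Running the migration is the usual Django command, using the app name from the makemigrations step above:

python3 manage.py migrate zhongwen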
As crazy as it sounds, it turns out it is possible to solve Project Euler problem 1 using Windows batch files! This is possible because modern versions of cmd support delayed expansion of variables, and arithmetic expressions using "SET /A".
Delayed expansion means that variables are expanded at runtime rather than at parse time. Normally %VARIABLE% is expanded once, when the line (or the surrounding parenthesised block) is parsed, using whatever value it held at that point. With delayed expansion we can write !VARIABLE! instead, which is expanded each time the line is executed; inside a for loop that can be many times.
@echo off
setlocal enabledelayedexpansion
set /a sum = 0
FOR /L %%L IN (1, 1, 999) DO (
    set /a mod = %%L %% 3
    IF NOT !mod! EQU 0 set /a mod = %%L %% 5
    IF !mod! EQU 0 set /a sum = sum + %%L
)
echo %sum%
endlocal
I recently needed to download a large number of files from a Linux server using SCP, which would take nearly 12 hours. When I checked the next day, the transfer had stopped, waiting to ask whether I wanted to overwrite a file. This was because two files had the same name, differing only in case.
I remembered there was a registry setting that could make the NTFS filesystem case sensitive[0], but I'd rather not do that on this workstation. I did a Google search anyway and found there is a newer way to do this on a per-folder basis instead of globally[1].
There is a command called "fsutil.exe" that can adjust a lot of filesystem settings; one of them is "SetCaseSensitiveInfo", which controls whether a folder is case sensitive.
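For a single folder the command looks like this (the path is a placeholder for wherever the files are being downloaded to):

fsutil.exe file SetCaseSensitiveInfo C:\path\to\downloads enable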
When I restarted the download, it still didn't work, because that command only applies to the specified folder and to new folders created inside it afterwards; it does not apply to existing sub-folders.
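A PowerShell loop along these lines applies the setting to every existing sub-folder as well (again, the top-level path is a placeholder):

# enable the flag on every directory under the (placeholder) download path
Get-ChildItem -Path C:\path\to\downloads -Recurse -Directory |
    ForEach-Object { fsutil.exe file SetCaseSensitiveInfo $_.FullName enable }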
In my previous post, I described how to build an old version of GNU Make for Windows. While working on that I wanted to test different versions of Visual Studio to see whether the build succeeded with each. Quickly switching between versions of Visual Studio was awkward, so I created a batch file to make it a lot easier.
The script takes a single argument specifying which version of Visual Studio you want to set up, and then calls the appropriate vcvars32.bat/vsvars32.bat file for that version.
@echo off
if "%1" == "vs4" goto vs4
if "%1" == "vs6" goto vs6
if "%1" == "vs2003" goto vs2003
goto argerror
:vs4
pushd C:\msdev\bin
call vcvars32 x86
popd
goto done
:vs6
pushd C:\Program Files\Microsoft Visual Studio\VC98\Bin
call vcvars32
popd
goto done
:vs2003
pushd "C:\Program Files\Microsoft Visual Studio .NET 2003\Common7\Tools\"
call vsvars32.bat
popd
goto done
:argerror
echo no Visual Studio version specified!
echo usage: setupenv [version]
echo where version is one of the following: vs4, vs6, vs2003
:done
I needed to build GNU Make 3.80 for Windows. It turns out this is not straightforward, and I needed to patch the build script to get it to build correctly. GNU Make 3.80 is a very old version of make; its release dates back to 2002.
The first issue is that it needs an old version of Visual C++; trying to build with VS 2019 produces a lot of warnings about deprecated flags. The README.W32 file mentions MSVC 5.x and MSVC 6.x, so I opted for MSVC 6, which I happened to have a copy of in a Windows 2000 VM.
The first build error we need to resolve is a linker error caused by a missing library.
Looking through the build output, we can see that the compiler cannot find "config.h", which prevents the "subproc.lib" library from being built, and that missing library is what causes the linker error.
C:\BUILD\xxx\make-3.80\w32\subproc>cl.exe /nologo /MT /W3 /GX /Z7 /YX /Od /I .. /I . /I ../include /I ../.. /D WIN32 /D
WINDOWS32 /D _DEBUG /D _WINDOWS /FR.\WinDebug/ /Fp.\WinDebug/subproc.pch /Fo.\WinDebug/ /c sub_proc.c
sub_proc.c
sub_proc.c(9) : fatal error C1083: Cannot open include file: 'config.h': No such file or directory
The reason "config.h" does not exist is that the build script never creates it. The line in the build script that should create it doesn't get run, because there is a stray "+" at the beginning of the line that stops it from executing.
set make=gnumake
+if not exist config.h copy config.h.W32 config.h
cd w32\subproc
Removing the plus at the beginning of the line allows "subproc.lib" to be compiled and linked, but we still get linker errors.
C:\BUILD\xxx\make-3.80>echo WinRel\pathstuff.obj 1>>link.rel
C:\BUILD\xxx\make-3.80>echo off
"Linking WinRel/gnumake.exe"
function.obj : error LNK2001: unresolved external symbol _hash_init
variable.obj : error LNK2001: unresolved external symbol _hash_init
file.obj : error LNK2001: unresolved external symbol _hash_init
dir.obj : error LNK2001: unresolved external symbol _hash_init
read.obj : error LNK2001: unresolved external symbol _hash_init
variable.obj : error LNK2001: unresolved external symbol _hash_insert_at
file.obj : error LNK2001: unresolved external symbol _hash_insert_at
dir.obj : error LNK2001: unresolved external symbol _hash_insert_at
read.obj : error LNK2001: unresolved external symbol _hash_insert_at
variable.obj : error LNK2001: unresolved external symbol _hash_deleted_item
file.obj : error LNK2001: unresolved external symbol _hash_deleted_item
dir.obj : error LNK2001: unresolved external symbol _hash_deleted_item
read.obj : error LNK2001: unresolved external symbol _hash_deleted_item
variable.obj : error LNK2001: unresolved external symbol _hash_find_slot
file.obj : error LNK2001: unresolved external symbol _hash_find_slot
dir.obj : error LNK2001: unresolved external symbol _hash_find_slot
read.obj : error LNK2001: unresolved external symbol _hash_find_slot
variable.obj : error LNK2001: unresolved external symbol _hash_find_item
file.obj : error LNK2001: unresolved external symbol _hash_find_item
dir.obj : error LNK2001: unresolved external symbol _hash_find_item
function.obj : error LNK2001: unresolved external symbol _hash_find_item
variable.obj : error LNK2001: unresolved external symbol _hash_free
read.obj : error LNK2001: unresolved external symbol _hash_free
function.obj : error LNK2001: unresolved external symbol _hash_free
variable.obj : error LNK2001: unresolved external symbol _hash_map
file.obj : error LNK2001: unresolved external symbol _hash_map
variable.obj : error LNK2001: unresolved external symbol _hash_delete
file.obj : error LNK2001: unresolved external symbol _hash_delete
variable.obj : error LNK2001: unresolved external symbol _hash_print_stats
file.obj : error LNK2001: unresolved external symbol _hash_print_stats
variable.obj : error LNK2001: unresolved external symbol _hash_map_arg
file.obj : error LNK2001: unresolved external symbol _hash_dump
dir.obj : error LNK2001: unresolved external symbol _hash_insert
function.obj : error LNK2001: unresolved external symbol _hash_insert
function.obj : error LNK2001: unresolved external symbol _hash_load
.\WinRel/gnumake.exe : fatal error LNK1120: 13 unresolved externals
"WinRel build failed"
C:\BUILD\xxx\make-3.80>
There are a lot of unresolved symbols. I searched for "hash_insert_at" and found its definition in "hash.c". Looking through the build script, it turns out that this file is not included in the build at all. I added two lines to "build_w32.bat" just after "implicit.c" is compiled: one to compile "hash.c" and one to add "hash.obj" to the list of objects to link.
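They follow the same pattern as the other compile steps in the script; something along these lines, with the compiler flags copied from the neighbouring release-build cl.exe lines rather than from this sketch:

rem flags below are illustrative - copy them from the adjacent cl.exe lines in build_w32.bat
cl.exe /nologo /MT /W3 /GX /YX /O2 /I . /I ./w32/include /D WIN32 /D WINDOWS32 /D NDEBUG /FR.\WinRel/ /Fp.\WinRel\make.pch /Fo.\WinRel/ /c hash.c
echo WinRel\hash.obj >>link.rel

With those lines in place the release build links successfully.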
C:\BUILD\build\make-3.80>echo off
"Linking WinRel/gnumake.exe"
LINK : warning LNK4089: all references to "ADVAPI32.dll" discarded by /OPT:REF
"WinRel build succeeded!"
C:\BUILD\build\make-3.80>
I’m using the Enlighter plugin for WordPress to syntax highlight code snippets in posts. When using the TwentyTwenty theme the code snippets are left-aligned instead of centered in the post. This is a known compatibility issue for this plugin, and there is a fix detailed on Github.
.enlighter-default{
margin: 0 auto 1.25em auto;
}
I had already created a child theme based on TwentyTwenty, so it was easy to add the above CSS to its stylesheet, and voilà, the code snippets are correctly aligned now.