Hashnode Markdown Bulk Import Is Troublesome

Original post is here: eklausmeier.goip.de

1. I have written more than 300 blog posts, see sitemap. I wanted to import them into Hashnode. As all these posts were already in Markdown format, it was obvious to use the "bulk import" feature in Hashnode, which reads a zip-file containing the individual Markdown files.

I had checked beforehand that this bulk import works, though only with a zip-file containing a single Markdown file. I had also noticed that deleting a previous post makes its slug unusable forever. I have written about this here: Deletion Troublesome in Hashnode.com. This behaviour occurs on import and on direct entry; it is therefore not an import-specific problem.

So I knew I could import my posts only once. Therefore I concatenated multiple posts into one file and checked the resulting large post in Hashnode. I did this multiple times. With this procedure I figured out that MathJax support in Hashnode is quite "special". Also, many European characters, for example the German umlauts, are not directly supported in Hashnode; they have to be replaced by their HTML entity equivalents.

The MathJax specialities and shortcomings are the following; a small example is given after the list:

  1. Underscore in displaymath needs to be escaped with backslash.
  2. Double backslashes in displaymath need to be escaped with four backslashes and a newline.
  3. Negative whitespace (kerning) \! needs to be escaped with backslash.
  4. Star in displaymath needs to be escaped with backslash.
  5. When displaymath has lines starting with a minus or plus sign, the sign needs to be escaped with a backslash.
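
To illustrate the above rules on a small, hypothetical two-line formula: what I would normally write as

$$
a_1 \! b \\
-c_2 * d
$$

has to be entered in Hashnode as

$$
a\_1 \\! b \\\\
\-c\_2 \* d
$$

This corresponds to the substitutions applied in saaze2hashnode below.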

The slugs of my original blog posts contain slashes, i.e., directories; they had to be mapped to a schema without slashes. See the sketch below.
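
A minimal sketch of this mapping (the file name below is hypothetical; the real logic is part of saaze2hashnode further down):

#!/bin/perl -W
use strict;
my $year = "2021";                    # taken from the "date:" field of the front matter
my $fn   = "05-10-some-post";         # hypothetical file name, ".md" already stripped
my $hashnode = "blog-" . $year . "-" . $fn;   # Hashnode slug, no slashes: blog-2021-05-10-some-post
my $goIP     = "blog/" . $year . "/" . $fn;   # original URL path, with slashes: blog/2021/05-10-some-post
print "$hashnode\n$goIP\n";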

2. All these changes I incorporated into a Perl script, saaze2hashnode, see below. Then I thought importing would be easy. That turned out to be wrong: Hashnode is not able to import more than roughly 10 or 20 files at a time, so I had to create multiple zip-files; a sketch of such a batching step is given below, before the script. Also, each individual post has to be confirmed; if you have more than 300 posts, you have to confirm more than 300 posts! Time and again the import came back with errors. From my roughly 300 posts, 4 didn't make it, even after trying multiple times. I got the impression that posts with references to images took longer than posts which only contained text. It might be an issue with the total size of all HTML elements breaking some internal Hashnode boundary. The error I got does not give any hint what the exact cause of the problem is. Note that all my posts only contain references to images; there is no "direct" image, i.e., I did not upload any image to Hashnode.
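
Since the importer could not cope with one big zip-file, the converted posts had to be packed into several small zip-files. A minimal sketch of such a batching step, assuming the converted Markdown files lie in /tmp/hashnode with an .md extension and that batches of 10 are small enough (both are assumptions for this sketch):

#!/bin/perl -W
# Pack the converted Markdown files into zip-files with at most 10 posts each.
# Assumes the "zip" command line tool is installed.
use strict;

my @files = sort glob("/tmp/hashnode/*.md");	# hypothetical location of the converted posts
my $batch = 0;
while (@files) {
	my @chunk = splice(@files, 0, 10);	# take the next (at most) 10 files
	++$batch;
	system("zip", "-j", "/tmp/hashnode/batch$batch.zip", @chunk) == 0
		or die "zip failed for batch $batch";
}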

The Perl script saaze2hashnode is:

#!/bin/perl -W
# Convert Saaze Markdown to Hashnode Markdown
#   1. Insert reference to eklausmeier.goip.de at top of post
#   2. Change all href's (anchor and images) to eklausmeier.goip.de
#   3. YouTube and Twitter tags are converted to Hashnode embedly's format
#   4. Change LaTeX single dollar to \\( and \\)
#   5. Slug is blog-year-month-day-text.md
#
# Usage:
#   ( let i=0; for f in `find . -name \*.md | sort`; do let i=i+1; saaze2hashnode $f | tee -a /tmp/hashnode/all1 > /tmp/hashnode/c$i; done )

use strict;

my ($n3dash,$nslug,$codeBlock,$displayMath) = (0,0,0,0);
my ($year,$hashnode,$goIP,$title) = ("","","","");

while (<>) {
	if (/^---\s*/) {	# front matter delimiter
		++$n3dash;
	} elsif ($n3dash == 1) {	# inside front matter: pick up year and title, emit slug
		if ( /^date:\s+"(\d\d\d\d)/ ) {
			$year = $1;
		} elsif (/^title:\s+("|)(.+)\s*$/) {
			$title = $2;
			$title =~ s/"$//;
		} elsif ($nslug == 0  &&  length($year) == 4) {
			my $fn = $ARGV;
			$fn =~ s/\.md$//;
			$fn = substr($fn,1 + rindex($ARGV,"/"));
			$hashnode = "blog-" . $year . "-" . $fn;
			$goIP = "blog/" . $year . "/" . $fn;
			printf("slug: %s\n",$hashnode);
			$nslug = 1;
		}
	} elsif ($n3dash == 2  &&  $nslug == 1) {	# directly after front matter: insert back-reference to original post
		printf("\nThis post was automatically copied from [%s](https://eklausmeier.goip.de/%s) on [eklausmeier.goip.de](https://eklausmeier.goip.de/blog).\n\n",$title,$goIP);
		$nslug = 2;
	}

	if (/^```/) {
		s/^```(\w+)\s+.+/```$1/;	# strip anything behind programming language
		$codeBlock = 1 - $codeBlock;
	}
	s/\[more_WP_Tag\]//;

	if ($codeBlock == 0) {	# do not touch the content of code blocks
		s/\[youtube\]\s*(.+?)\s*\[\/youtube\]/%\[https:\/\/www.youtube.com\/watch?v=$1\]/g;
		s/\(\.\.\/\.\.\/\.\.\/(img|pdf)/\(https:\/\/eklausmeier.goip.de\/$1/g;
		s/\.\.\/\.\.\/2(\d\d\d)\//https:\/\/eklausmeier.goip.de\/blog\/2$1\//g;

		# replace characters not directly supported by Hashnode with HTML entities
		s/Ä/&Auml;/g;
		s/Ö/&Ouml;/g;
		s/Ü/&Uuml;/g;
		s/ä/&auml;/g;
		s/ö/&ouml;/g;
		s/ü/&uuml;/g;
		s/ß/&szlig;/g;
		s/ı/&#305;/g;
		s/–/&ndash;/g;
		s/—/&ndash;/g;
		s/´/&rsquo;/g;
		s/’/&rsquo;/g;
		s/‘/&rsquo;/g;
		s/“/&ldquo;/g;
		s/”/&rdquo;/g;
		# s/°C/&#8451;/g;
		s/°/&deg;/g;
		s/§/&sect;/g;
		s/á/a&#769;/g;
		s/é/e&#769;/g;

		if (/^\$\$/) {
			$displayMath = 1 - $displayMath;	# flip flag
		} else {	# inline math
			s/\$(.+?)\$/\\\\\($1\\\\\)/g;	# replace $xyz$ with \\(xyz\\)
		}
		if ($displayMath == 1) {	# inside display math: apply Hashnode-specific escaping
			if (/^\$\$(.+)\$\$/) { s/\$\$(.+)\$\$/\$\$\n$1\n\$\$/; $displayMath = 0; }
			s/_/\\_/g;	# Hashnode needs escaping of underscore
			s/\\\\(\s*)/\\\\\\\\\n /g;	# replace \\ with \\\\{newline}{single space}
			s/\\!/\\\\!/g;	# escaping \!
			s/\\\{/\\\\\{/g;	# escaping {
			s/\\}/\\\\}/g;	# escaping }
			s/\*/\\*/g;	# Hashnode needs escaping of star
			s/^(\s*)\-/$1\\-/;	# prevent Hashnode from taking minus as enumeration
			s/^(\s*)\+/$1\\+/;	# prevent Hashnode from taking plus as enumeration
		}
	}

	print;
}

3. Those posts which didn't make it through the importer I tried to enter manually, i.e., I ran the saaze2hashnode script and pasted its output into the browser window. Even then, Hashnode had trouble publishing them:

That message was shown for more than ten hours. After that I closed the window.

Obviously, the whole experience of importing and posting blog entries into Hashnode is not good.