Is there any way to have transactions automatically numbered in hledger?
If I understand correctly, Swedish law says verifications should be numbered chronologically. I do it manually, but maybe there is a way to have hledger do it?
With print -O csv you can see there’s an internal numbering, though it’s not stable. There’s no easy way to make those numbers a permanent part of the data. You’d have to write a script, I think.
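A minimal sketch of such a script in Python: it appends a sequential number as a tag comment on each transaction's first line. The ver tag name and the "date line starts with a digit" heuristic are my own choices, not an hledger feature:

```python
import re

def number_transactions(journal_text, tag="ver"):
    """Append a sequential '; tag: N' comment to each transaction's
    first line. In hledger journal syntax a transaction starts with a
    line beginning with a digit (its date); postings are indented."""
    out = []
    n = 0
    for line in journal_text.splitlines():
        if re.match(r"\d", line):  # a date line starts a new transaction
            n += 1
            line = f"{line}  ; {tag}: {n}"
        out.append(line)
    return "\n".join(out)

sample = """\
2024-01-02 Egen ins
    assets:bank        1000.00 SEK
    equity:deposits

2024-01-05 Groceries
    expenses:food        50.00 SEK
    assets:bank
"""
print(number_transactions(sample))
```

Caveat: directives like include or account also start at column 0 but begin with a letter, so the digit heuristic skips them; year-only dates and other edge cases would need more care.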
I saw this comment and I was curious. I found this explanation about verification numbering: Verifications - verksamt.se.
Edit: I apologize for some errors below. I have since read about the SIE file format in SIE_filformat_ver_4B_ENGLISH.doc. I now see that in this format, #VER (verification) items correspond to PTA/hledger transactions, and #TRANS (transaction) items are #VER subitems that correspond to PTA/hledger postings. Ignore my next two paragraphs.
It looks like verifications are what might be referred to as source documents - the invoices, statements, etc that are the source of the data abstracted into the journal. The verification numbers are actually assigned to the source document and then that verification number is recorded in the transaction as a reference back to the source document (presumably for audit purposes).
I suggest that you could use tags for this purpose, as the verification numbers are generated before the transaction in the journal is created.
I also read that there are some requirements forbidding software that could alter transactions after they are entered, specifically saying you can't use Excel. Since this would seem to imply that a journal file can only grow by appending, it may be a limitation on plain text accounting in Sweden.
Interesting.
Here is an item in a SIE file, increasing owner's equity:
#VER B 1 20240102 "Egen ins" 20240109
{
#TRANS 1930 {} 1000.00 20240105 "Egen ins"
#TRANS 2018 {} -1000.00 20240105 "Egen ins"
}
(where 1930 is the bank account and 2018 an equity account)
So besides a number, there should be a date for the transaction and another date for when it was entered.
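In hledger, the same information could be sketched with a transaction code carrying the verification series and number, and a secondary date for the registration date. The account names below are my own invention, not a standard mapping:

```
2024-01-02=2024-01-09 (B1) Egen ins
    assets:bank:1930          1000.00 SEK
    equity:deposits:2018     -1000.00 SEK
```

Here 2024-01-02 is the primary (verification) date, 2024-01-09 the secondary (registration) date, and (B1) the series and number.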
I also read that there are some requirements forbidding software that could alter transactions after they are entered, specifically saying you can't use Excel.
I have also come across this. I think it seems silly, as I am sure you can alter verifications after they have been entered even in "allowed" bookkeeping programs; it just takes more know-how and work.
Edit: for example, it is OK to use pen and paper. And I gather it is possible to export to SIE, edit the SIE, then start over and import it into the program. So as I interpret it, it is the "chain" that should make sense: it should be traceable and not editable. Instead of modifying an old transaction, you first enter a new one that reverses the old one, then the correct one.
I think it should be possible to come up with a solution using hashes. Maybe as easy as including a hash of the previous transaction as a comment?
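A minimal sketch of that idea, assuming transactions are handled as text blocks: each block is prefixed with a comment holding the SHA-256 of the previous commented block, so altering any earlier transaction breaks every later hash. The chain comment name is my own choice:

```python
import hashlib

def add_chain_comments(transactions):
    """Prefix each transaction text block with a '; chain: <sha256>'
    comment covering the previous (already commented) block, forming a
    tamper-evident chain: editing an early transaction invalidates all
    later hashes."""
    out = []
    prev_block = ""
    for txn in transactions:
        digest = hashlib.sha256(prev_block.encode("utf-8")).hexdigest()
        block = f"; chain: {digest}\n{txn}"
        out.append(block)
        prev_block = block
    return out

txns = [
    "2024-01-02 Egen ins\n    assets:bank    1000.00 SEK\n    equity\n",
    "2024-01-05 Rent\n    expenses:rent   500.00 SEK\n    assets:bank\n",
]
for block in add_chain_comments(txns):
    print(block)
```

Verifying the chain is the same computation in reverse: recompute each digest and compare it to the stored comment.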
This domain is the reason why there are journal data auditing features in Tackler.
The journal data can be read directly from git, and the commit id (i.e. cryptographic proof of the journal data used) is included in the reports.
Transactions have their own unique ids, so when the commit id is combined with the transaction id hash, there is a chain of proof showing which transactions were used to produce the reports, and proof of the pristine version of the audit data that was used.
The Tackler repository has a Journal Audit example which demonstrates these capabilities:
All transactions up to May 2016:
tackler --config audit.toml --input.git.ref s1e3_2016-04
Result:
Git Storage
reference : s1e3_2016-04
directory : txns
extension : txn
commit : d5e6d6959676682d563081dd240b45c8e6a5a282
author : tackler <accounting@example.com>
date : 2016-04-30 18:41:00 +0000
subject : set-1e3: 2016-04-30
Txn Set Checksum
SHA-256 : 9115c315aabb1be4ec105496a9eab8523746380ca1ab0bfebf7a0166b65bf51e
set size : 331
**********************************************************************************
Account Selector Checksum
SHA-256 : df4714ff3f5bd031a8256df95863a8085f740b436f1b47febb45c9eb9aaa9e9e
selector : 'a:.*'
Balance Report
--------------
-1350.00 a:ay2016:am01
-1175.00 a:ay2016:am02
-1352.00 a:ay2016:am03
-1267.00 a:ay2016:am04
======================
-5144.00
##################################################################################
Same thing for all transactions up to June 2016:
tackler --config audit.toml --input.git.ref s1e3_2016-05
Git Storage
reference : s1e3_2016-05
directory : txns
extension : txn
commit : 64187b6fe259ab40fefe91661b87a6608a83aa24
author : tackler <accounting@example.com>
date : 2016-05-31 21:18:50 +0000
subject : set-1e3: 2016-05-31
Txn Set Checksum
SHA-256 : 14243586cb94c812bee349632c0a66823c293bcc5201d694605c98bc405f7155
set size : 416
**********************************************************************************
Account Selector Checksum
SHA-256 : df4714ff3f5bd031a8256df95863a8085f740b436f1b47febb45c9eb9aaa9e9e
selector : 'a:.*'
Balance Report
--------------
-1350.00 a:ay2016:am01
-1175.00 a:ay2016:am02
-1352.00 a:ay2016:am03
-1267.00 a:ay2016:am04
-1362.00 a:ay2016:am05
======================
-6506.00
##################################################################################
You can run the same reports yourself. By comparing the commit and the Txn Set Checksum, we can be sure that we are using the same set of unaltered journal data and the same set of transactions.
P.S. In the original version of this post, there was a reference to a typo with tag names of the audit repository. Those are now fixed, and the post is edited accordingly.
Interesting. I had never heard of Tackler before, but it seems quite advanced, with a lot of features. And maybe a learning curve? I need some time to grasp the difference between file and git storage.
With "git storage", Tackler handles the data via git?
Thanks!
The learning curve should be pretty flat with the basic features, hopefully!
There is the tackler new command, which will produce a ready-to-go, usable example journal, so testing is literally just running:
tackler new journal
tackler --config journal/conf/tackler.toml
Feel free to add, modify or delete transaction files under the journal/txns directory.
The default template journal created by the new command is already configured so that you can use it directly with git; see Tackler's Git Primer if you would like to try it out.
The git storage acts as a virtual filesystem: when it's activated, the journal is read directly from git. The repository can even be "bare"; there is no need for a working copy with purely git-based storage.
However, it's possible to flip back and forth between filesystem and git storage with a CLI argument, once you have run the few git initialization commands outlined in the primer doc (those are just the normal git init; git add; git commit steps).
tackler --config journal/conf/tackler.toml --input.storage git
tackler --config journal/conf/tackler.toml --input.storage fs
This is a really handy feature for running trial balances from the working copy (file system) and official reports directly from git storage (either on your personal computer, or on a server after the changes have been pushed to the main repository).
This is a really handy feature for running trial balances from the working copy (file system) and official reports directly from git storage
Ok, then the penny dropped - I can see how that is useful.
I like how all the pta programs seem to complement each other. I take the opportunity to cast my vote for Interoperability!
On another note, I see PDF reports are mentioned. I guess they could be used for generating invoices; that would be a useful feature.
Great way to start!
When I added my own journal (and edited accounts.toml), it wouldn't show anything until I capitalized the first letter of the account names. Are capitalized accounts a requirement?
Hi, that's good feedback, thanks!
Capitalizing isn't needed; the names can be pretty much anything.
There is report.accounts, which selects the accounts to be used for reports, and the regex is case sensitive.
It's these lines in the config:
accounts = [ "Assets(:.*)?", "Expenses(:.*)?" ]
...
register = { title = "Register Report", accounts = [ "Welcome(:.*)?", ]}
Try changing it to suit your Chart of Accounts, or you can try the --accounts ".*" command line option, which will list everything.
There is also an accounts setting for each report type (balance, register, balance-group).
There is report.accounts, which selects the accounts to be used for reports, and the regex is case sensitive.
Ok!
Maybe a good idea to have the "starter template" show all accounts?
I have been looking at Typst and how to make an invoice. I have an idea to read static info like name, address, etc. from a TOML file, and the amount, description, etc. from JSON. How do you go about exporting that? Does it make sense, and is it possible, to only export the last tx (for the invoice)?
I have an idea to read static info like name, address, etc. from a TOML file, and the amount, description, etc. from JSON.
That sounds like a good plan, at least for a prototype, and the production setup could probably look similar. The static info is what people like to customize the most, along with the template itself (visual layout and information content), so it would be good to have a solid story for making that easy for users. The triplet of template, basic info as TOML, and actual report data as JSON sounds good.
How do you go about exporting that?
The question above is related to exporting JSON, isn't it?
In the template config there is a setting for formats; add "json" there, like this:
formats = [ "txt", "json" ]
After that, when you do file-based reports, it will write json alongside txt:
tackler --config journal/conf/tackler.toml \
--output.dir . \
--output.prefix reports
Balance Report (TEXT): ./reports.bal.txt
Balance Report (JSON): ./reports.bal.json
Register Report (TEXT): ./reports.reg.txt
Register Report (JSON): ./reports.reg.json
If you don't need text-based reports on file, then it can be just "json".
Generating the Invoice
Does it make sense and is possible to only export the last tx (for the invoice)?
There are a couple of options here:
First of all, Tackler operates on a stream of transactions, and that stream can be filtered as you like. So it's possible to generate a balance for only one week, month, etc. You could use that to generate the content for the invoice, and maybe even translate e.g. time-and-materials into monetary value with a pricedb.
The second option is to generate "the invoice" transaction, and report only that. To report only that single txn, or a few related txns, for the invoice, one option could be to tag the transactions and filter based on the tags, so that only the txns tagged "Invoice-May-2025" are used.
There is that Trimix Filling Station example, which uses billing transactions, but for the simplicity of the example, it doesn't use any tags, etc.
With transaction filters you can do really odd stuff relatively easily. The filter is just plain JSON, and it can be generated by any means. There is an option to use base64 ASCII armor, so it's easy to use complex filter definitions with shell scripts. In the links above there are examples of time-based filters; some of them use natural language as part of the description (courtesy of date).
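As a sketch of producing such an armored filter from a script: the TxnFilterTxnDescription key below is my reading of Tackler's txn filter schema, and should be verified against the txn filter reference before use:

```python
import base64
import json

# Select transactions whose description mentions the invoice tag.
# The key name TxnFilterTxnDescription is an assumption based on my
# reading of Tackler's txn filter docs -- check the reference.
flt = {
    "txnFilter": {
        "TxnFilterTxnDescription": {"regex": ".*Invoice-May-2025.*"}
    }
}

armored = base64.b64encode(json.dumps(flt).encode("ascii")).decode("ascii")
print(armored)
```

The resulting string would then be passed to Tackler's filter argument from a shell script; the exact CLI flag and any required prefix are in Tackler's docs.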
Epilogue
It would be really nice and appreciated if you were interested in working on the PDF templates. It's also something that would benefit other tools (e.g. hledger), as most of them have JSON output.
But this is getting really off topic for the original post, so maybe we should have our own thread for that. There is a Matrix channel for Tackler, and there is also a Zulip chat, which is a really great way to coordinate and discuss actual technical plans and implementation. Let me know if you would like to join Zulip, especially if you would like to work on the PDF reporting.
Ok, let's continue elsewhere. How do I join Zulip?
I sent a PM invitation to you.