Make block size limit part of the BlockEncoder API #223
There is some overlap with #222; however, this is much smaller in scope and makes no attempt to reduce computational overhead.
I think I'm fine with this, but I'm a little dubious that this is the right way to achieve the stated goal, because it requires allocating the maximum-size byte array for each encode, which may end up being very wasteful if your encode regularly gives you only a fraction of that size. At least now our codecs (mostly?) only request allocation of what they actually need (or close to it; there's some trickery involved, but it's not a very large optimistic allocation). The change would also be fairly invasive if you want to go changing codecs. dag-pb would probably be the easiest to change to start with, but it's still pretty invasive (off the top of my head). So if you want to experiment with the API and have time, then go ahead and see how it works out!
This does not match my experience. We find ourselves allocating fixed-size buffers (frames of sorts) that we intend to pack with some data and send off, then repeat until we're done. With the current APIs we have two choices:
With the proposed API we no longer have to choose between the two; instead we can just allocate a frame and encode blocks directly into it. That does mean that for each block we'll create a new Uint8Array view into the buffer at the current offset with the max block length, but those views will be short-lived and a lot cheaper.
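The packing pattern described above can be sketched roughly as follows. Note that `encodeInto` here is a stand-in for the proposed API, not something js-multiformats provides today, and the sizes are illustrative assumptions:

```typescript
// Hypothetical sketch: pack encoded blocks into one preallocated frame.
// `encodeInto` and the size constants are assumptions for illustration.

const MAX_BLOCK_SIZE = 1024 // assumed per-block size limit

// Stand-in encoder that writes into a caller-provided buffer and
// returns the number of bytes written, erroring past the limit.
const encodeInto = (data: string, output: Uint8Array): number => {
  const bytes = new TextEncoder().encode(data)
  if (bytes.byteLength > output.byteLength) {
    throw new RangeError('block exceeds size limit')
  }
  output.set(bytes)
  return bytes.byteLength
}

const frame = new Uint8Array(4096) // one fixed-size frame
let offset = 0
for (const block of ['hello', 'world']) {
  // Short-lived view into the frame, capped at the block size limit.
  const view = frame.subarray(offset, offset + MAX_BLOCK_SIZE)
  offset += encodeInto(block, view)
}
// `offset` is now the total number of bytes written into the frame.
```

Each `subarray` call creates only a view over the same backing buffer, so no per-block allocation of block-sized memory occurs.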
Oh, right, so you're allocating a large chunk and then wanting to present a slice to the encoder to fill; I was imagining allocating a new large chunk for each one, but that's unnecessary with Uint8Arrays if you have a nice queue lined up. That's fair enough.
2023-01-03 IPLD triage conversation: @Gozala are you going to take this on? |
Currently the `BlockEncoder` interface just encodes the passed input into bytes (js-multiformats/src/codecs/interface.ts, lines 6 to 10 in 58117f2).
The problem is:
Proposed solution:
I would like to propose amending our `BlockEncoder` interface as follows:

The idea here is that:
This will be a non-breaking change at the API level, but it would be breaking in the sense that errors will occur if a block is larger than the block size limit. Nevertheless, it seems like a better default than silently letting things slip.
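One possible shape for the amended interface is sketched below. This is a hedged illustration only: the member names `maxByteLength` and `encodeInto` are assumptions, not necessarily the issue's exact proposal.

```typescript
// Hypothetical sketch of a BlockEncoder with a block size limit.
// `maxByteLength` and `encodeInto` are illustrative names.
interface BlockEncoder<Code extends number, T> {
  name: string
  code: Code
  maxByteLength: number // block size limit
  encode(data: T): Uint8Array
  // Encode directly into a caller-provided buffer; throws if the
  // encoded block would not fit (and therefore exceeds the limit).
  encodeInto(data: T, output: Uint8Array): number
}

const LIMIT = 1024 * 1024 // assumed 1 MiB limit

const json: BlockEncoder<0x0200, unknown> = {
  name: 'json',
  code: 0x0200,
  maxByteLength: LIMIT,
  encode: (data) => new TextEncoder().encode(JSON.stringify(data)),
  encodeInto: (data, output) => {
    const bytes = new TextEncoder().encode(JSON.stringify(data))
    if (bytes.byteLength > output.byteLength) {
      throw new RangeError(`block of ${bytes.byteLength} bytes exceeds limit`)
    }
    output.set(bytes)
    return bytes.byteLength
  }
}

const buffer = new Uint8Array(LIMIT)
const written = json.encodeInto({ ok: true }, buffer)
```

Because `encodeInto` is an addition alongside the existing `encode`, existing callers keep working, which matches the "non-breaking at the API level" framing above.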