NWC! LFG. Will set that up with Mutiny tonight to see how it goes. Very nice work!
A few of them have gone through, but it's hard to tell if it's being slow on the SN side or on the Mutiny side. Perhaps someone with alby or lnbits could test out too to see how it goes.
The comments are kind of painful to sit there and wait for. I wish it could just act like it went through but have some pending UI for it.
reply
It could be because we poll on the client for the invoice being paid, which probably adds ~500ms of latency on average.
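Roughly what that client-side check looks like (a minimal sketch; the endpoint and field names are assumed, not SN's actual API):

```js
// Sketch only: poll once per second until the invoice is settled.
// With a 1s interval, a payment lands on average ~500ms before the
// next poll notices it, hence the extra latency.
async function waitForPayment (hash) {
  while (true) {
    const res = await fetch(`/api/invoice/${hash}`) // hypothetical endpoint
    const { isPaid } = await res.json()             // assumed response field
    if (isPaid) return
    await new Promise(resolve => setTimeout(resolve, 1000))
  }
}
```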
Great feedback.
reply
Unfortunately it doesn't work with either the self-custodial NWC app or Alby-hosted NWC.
reply
It's meant to work. We probably have a bug.
What errors are you getting in your console?
reply
```
Uncaught (in promise) EvalError: Refused to evaluate a string as JavaScript because 'unsafe-eval' is not an allowed source of script in the following Content Security Policy directive: "script-src 'self' 'wasm-unsafe-eval' 'inline-speculation-rules'".
    at bakeCollection (content.7f229555.js:2064:98610)
    at Object.u (content.7f229555.js:2064:93234)
    at Object.g [as call] (content.7f229555.js:2064:93437)
    at es.o [as emit] (content.7f229555.js:2064:85238)
    at es.stop (content.7f229555.js:17:30157)
    at es.eoseReceived (content.7f229555.js:17:31755)
    at content.7f229555.js:17:8787
    at Array.forEach (<anonymous>)
    at T.eoseReceived (content.7f229555.js:17:8754)
    at content.7f229555.js:17:11970
```
reply
@ekzyis looks like the nostr library violates CSP
reply
Mhh, interesting. I tested NWC when we launched and I could generate NWC requests 🤔
reply
I think it might be that the library uses wasm, which some browsers probably don't support, and our CSP allows 'wasm-unsafe-eval' but not 'unsafe-eval'.
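For reference, this is the directive from the console error above versus what the failing eval call would need (illustration only; actually adding 'unsafe-eval' would loosen the policy considerably and probably isn't the right fix):

```
# current policy (from the error):
Content-Security-Policy: script-src 'self' 'wasm-unsafe-eval' 'inline-speculation-rules'

# what evaluating a string as JavaScript would require:
Content-Security-Policy: script-src 'self' 'unsafe-eval' 'wasm-unsafe-eval' 'inline-speculation-rules'
```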
reply
Oh, you'll have issues with GrapheneOS's browser and with people running Lockdown Mode on iOS.
That's pretty crap that it's pulling in WebAssembly for whatever reason. There's really no need.
Ah. @saunter, which browser are you using?
hard to tell if it's being slow on the SN side or on the Mutiny side.
We poll every second to check if the invoice was paid, so some slowness is definitely on our side.
The comments are kind of painful to sit there and wait for. I wish it could just act like it went through but have some pending UI for it.
What do you mean? You should see toasts that show if payments are pending.
update: Oh, you mean the toasts in the bottom-right corner with comments? No, Tony didn't mean that.
Perhaps someone with alby or lnbits could test out too to see how it goes.
Tested with LNbits: lnbits_zap.mp4
reply
The comment box just sits there if a payment is needed for a reply.
reply
Ah, I see. By "comments" you literally meant the comments themselves, haha. Makes sense!
reply
He means we should just post it optimistically, storing it on the backend as a "draft" until the payment succeeds, which is closer to the ideal (we "absorb" the wait for them). For "failed drafts" we'd probably want to send them a notification. For "successful drafts" we'd do nothing and yay.
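Something like this, as a rough sketch (createDraft, markPaid and notifyFailure are hypothetical helpers, not SN's actual API):

```js
// Sketch: post the comment immediately as a pending "draft",
// then settle or fail it in the background.
async function postCommentOptimistically (text, payInvoice) {
  const draft = await createDraft(text)   // comment shows up right away, marked pending
  try {
    await payInvoice(draft.invoice)       // we "absorb" the wait instead of the user
    await markPaid(draft.id)              // successful draft: promote it, nothing else to do
  } catch (err) {
    await notifyFailure(draft.id)         // failed draft: send the author a notification
  }
}
```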
reply
937 sats \ 0 replies \ @nout 14 Feb
Yay for optimistic UX. Great latency saver...
reply
but have some pending UI for it.
We could style the text with text-muted while the payment is pending
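Something like this sketch, assuming the comment carries a pending flag while its invoice is unpaid (text-muted being the existing muted-text class):

```jsx
// Sketch: grey out a comment's text until its payment settles.
function CommentText ({ comment }) {
  return (
    <div className={comment.pending ? 'text-muted' : ''}>
      {comment.text}
    </div>
  )
}
```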
reply