instaparse

If you're not trampolining your parser, why bother getting up in the morning?

lucasbradstreet 2016-01-13T01:11:19.000040Z

Hi @jamesnvc, cljs perf is definitely a bit slow. You need to be using advanced compilation, otherwise it’s incredibly slow.
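
(A minimal sketch of the advanced-compilation setting being referred to, here via lein-cljsbuild; the build id and output path are just placeholders:)

```clojure
;; project.clj fragment :: hypothetical build id and paths
:cljsbuild {:builds
            [{:id "min"
              :source-paths ["src"]
              :compiler {:output-to "resources/public/js/app.min.js"
                         :optimizations :advanced}}]}
```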

lucasbradstreet 2016-01-13T01:11:24.000041Z

I’m the guy who did the port

jamesnvc 2016-01-13T01:12:00.000042Z

@lucasbradstreet: Cool, thanks :simple_smile: Good to know I’m (maybe) not just doing something crazy

lucasbradstreet 2016-01-13T01:12:14.000043Z

Depending on what you’re doing, you can also serialise the parser definition and load it in directly
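
(A sketch of that idea, assuming a reasonably recent instaparse where the grammar analysis can be pushed to compile time with the `defparser` macro; the namespace and grammar below are only illustrative:)

```clojure
(ns app.parser
  (:require [instaparse.core :as insta :refer-macros [defparser]]))

;; defparser does the expensive grammar analysis at macro-expansion time
;; (on the JVM during the ClojureScript build), so the client only pays
;; for parsing actual input, not for constructing the parser.
(defparser arith
  "expr   = term (('+' | '-') term)*
   term   = number | <'('> expr <')'>
   number = #'[0-9]+'")

(insta/parse arith "1+(2+3)")
```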

lucasbradstreet 2016-01-13T01:12:38.000044Z

Creating the initial parser in cljs can take quite a while, but the parsing itself can be pretty acceptable

meow 2016-01-13T01:19:03.000045Z

and my money was on crazy @jamesnvc

meow 2016-01-13T01:20:18.000046Z

can any of that "job" be split between client and server?

meow 2016-01-13T01:20:28.000047Z

say for a chat app

meow 2016-01-13T01:21:18.000048Z

just send the user keystrokes to the server - do it in clj there

meow 2016-01-13T01:21:30.000049Z

just brainstorming

meow 2016-01-13T01:21:34.000050Z

out loud

meow 2016-01-13T01:23:45.000051Z

doesn't each keystroke go to the server already? that's how you can display the fact that the user is typing

meow 2016-01-13T01:24:07.000052Z

so don't do any processing on the client - do the instaparse on the server

meow 2016-01-13T01:24:34.000053Z

and use yada or onyx or something to scale it

meow 2016-01-13T01:24:59.000054Z

we can segregate services on the server and compose them

meow 2016-01-13T01:25:48.000055Z

compose microservices on the server and keep the client relatively stupid whenever the data is already on the server

lucasbradstreet 2016-01-13T01:36:27.000056Z

I was parsing excel formulas on the client and it was good enough

lucasbradstreet 2016-01-13T01:36:41.000057Z

Certainly faster than a round-trip to the server

meow 2016-01-13T01:41:47.000058Z

ok

meow 2016-01-13T01:42:06.000059Z

I'll defer to @jamesnvc since I'm just blowing smoke

lucasbradstreet 2016-01-13T01:45:25.000060Z

My overall experience was that creating the initial parser was very expensive, but the parsing itself was OK, provided it was compiled in advanced mode. All with a big chunk of “your mileage may vary”. Unfortunately I don’t have any time to work on performance any further

jamesnvc 2016-01-13T11:08:00.000061Z

Cool, I was thinking of splitting it between client and server, but I will give it a shot with advanced compilation too

lucasbradstreet 2016-01-13T11:15:15.000062Z

Also works. I’d measure how long it takes to do the individual parses, not just page load time - because that will be affected by creating the initial parser
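
(A hedged way to time the individual parses themselves, reusing the hypothetical `arith` parser and `app.parser` namespace from the sketch above:)

```clojure
(ns app.timing
  (:require [instaparse.core :as insta]
            [app.parser :refer [arith]])) ;; arith: hypothetical parser from the earlier sketch

;; Times only the parse call itself, not parser construction or page load.
(time (insta/parse arith "1+(2+3)"))

;; In the browser, js/performance.now gives finer-grained numbers.
(let [t0     (.now js/performance)
      result (insta/parse arith "1+(2+3)")
      t1     (.now js/performance)]
  (println "parse took" (- t1 t0) "ms")
  result)
```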

meow 2016-01-13T13:20:41.000063Z

@jamesnvc: we should test both and not make assumptions either way, imnsho

meow 2016-01-13T13:21:01.000064Z

@lucasbradstreet: thanks for the help and suggestions - much appreciated

lucasbradstreet 2016-01-13T13:23:42.000065Z

Agreed. Though you have to assume some variability in request latency when testing the other method, which is why I ultimately went with the client side approach. That said, you can have slow CPU clients too.

meow 2016-01-13T13:26:35.000066Z

then we should simulate issues with both environments and various combinations/permutations

meow 2016-01-13T13:26:50.000067Z

ask the #C0J20813K team how good I am at doing that

meow 2016-01-13T13:27:06.000068Z

issues, oh yeah, I got issues

lucasbradstreet 2016-01-13T14:48:04.000069Z

ha :thumbsup:

meow 2016-01-13T15:06:27.000070Z

:simple_smile: