Slow loading after using LingQ for years

Hello,
I’ve been using LingQ for a couple of years now and really enjoy the program.
However, I’m running into a problem now that my word counts have started to stack up (or so I think).
In the beginning LingQ was very snappy at everything: loading lessons, looking up words, and browsing courses.

However, it’s been getting slower and slower, and now it takes around 1-5 minutes to load a lesson every time I go in and out. The same goes for looking up words: it takes a few seconds every time I tap a yellow word.

In general it’s become a pain to navigate when everything I tap takes ten times as long as it used to.

I don’t think it’s hardware related, as it happens on my PC, tablet, and phone alike, and all three are new and quite powerful.

I’m not sure what to do about this problem, but if it continues I will have to stop using LingQ, as it slows down my reading a lot.

I would still love to keep using LingQ for my reading.
So if anyone can help me figure out what’s causing the delays, that’d be fantastic!

4 Likes

Hi, I feel you. This problem is real and I believe your analysis is accurate: the more you use LingQ, the slower it gets.
Not long ago I wrote to support about this very problem and included detailed statistics, and LingQ have acknowledged it. Whether, or when, a solution might be found is unclear. I am not optimistic, because the underlying issue seems to be rather fundamental and not a simple performance-optimization problem.

My completely superfluous armchair analysis of how I believe LingQ works follows:
On the server side, LingQ holds a database for each language you have added. It contains entries for all your “known words”, “ignored words”, and “LingQs”, plus their levels, translations in all your dictionary languages, notes, etc.
When you open a lesson, a parsing process starts on the server that checks every word in the lesson against the entries in your database. This process is what actually takes the time. Once all words in the lesson have been matched (“known”, “ignored”, “LingQ”, etc.), the result is sent back to the browser / app. It is not a bandwidth problem, as the data seems to be in the single-digit-kilobyte range.
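
To make the guess concrete, here is a minimal sketch (in Python) of the matching step described above. Every name in it is made up for illustration; this is purely my mental model, not LingQ’s actual code.

```python
# Rough sketch of the matching step imagined above. All names are hypothetical.

def tag_lesson(lesson_words, known, ignored, lingqs):
    """Tag each word of a lesson as 'lingq', 'known', 'ignored', or 'new'."""
    tagged = []
    for word in lesson_words:
        if word in lingqs:                      # saved LingQ: level, translations, notes, ...
            tagged.append((word, "lingq", lingqs[word]))
        elif word in known:
            tagged.append((word, "known", None))
        elif word in ignored:
            tagged.append((word, "ignored", None))
        else:
            tagged.append((word, "new", None))  # would show up as a blue word
    return tagged

# Example: one lookup into each collection per lesson word.
print(tag_lesson(["我", "喜欢", "读书"],
                 known={"我"}, ignored=set(), lingqs={"喜欢": {"level": 2}}))
```

Whether each `word in …` check above is an indexed lookup or a scan over the whole collection is exactly what the later replies argue about, and it determines whether opening a lesson slows down as the user’s database grows.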

This performance problem is apparently also the reason for the current 2000-word limit imposed on lessons. Over the years LingQ has tightened this limit, most recently with the introduction of LingQ 5; I previously used to upload 4000-word lessons.

Here are some of my numbers:
Chinese (Simplified): after about 435 days of use, at 110k words (known + LingQs), a best-case scenario (2000-word lesson, no audio) takes at least 40 seconds to open. The performance seems to degrade almost linearly, so older 4000-word lessons take a lot longer, and lessons with audio take much longer still, because the audio download only starts after the parsed data has been sent to the browser / app.
In Chinese (Traditional) I get away with a bit over 20 s. With roughly comparable numbers of known words and LingQs, where does the disparity come from? I believe it is the number of ignored words. When I started on LingQ I was unaware of the harm the ignore-word function might cause in the long run and used it liberally; I didn’t repeat that mistake with Traditional Chinese. Long story short, I have shifted most of my reading to the Traditional side of things. 40+ seconds is just a bit too uncomfortable, and I’m not known for my patience :slight_smile:

My suggestion to LingQ was to let users delete the “ignored words” entries from their database. This wouldn’t cause any harm and would provide an instant performance boost, though obviously without solving the underlying problem.

7 Likes

Initial load is definitely slow, and I think bamboozled’s idea of what it’s doing behind the scenes sounds pretty accurate.

I seem to have different load times. Generally I’d say it’s fairly tolerable for me. I just tried the app on home Wi-Fi, and even lessons I’ve not opened before loaded in 6-8 seconds, but off Wi-Fi I’ve definitely seen it take what feels like 15-30 seconds typically. In some cases much more, but in my experience that’s been very rare.

I feel, though, that the longer load up front has resulted in a much snappier experience while going through the lesson. Once it’s up, clicking on words and getting results is much faster than it was in 4.0, so I feel the tradeoff is very much worth it.

Hopefully they can keep improving that up-front load time. That would be great.

As for lesson size: I think we’ve talked about this before, but even in LingQ 4.0 imported ebooks were always split into lessons of around 2200 words. I’ve never seen it go beyond that for ebook imports. Possibly I had other imports from the web or elsewhere that went beyond that, but I don’t think I noticed, so maybe there were some. I have no experience prior to 4.0.

2 Likes

I’m not sure about this explanation. Unless the software on the server is doing something really silly (of course it might be), the time it takes to look up words in the various lists should not depend so strongly on how many words there are. I assume it is not just scanning the list from top to bottom looking for the word. I assume…
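
For what it’s worth, here is a small, purely illustrative Python comparison of the two designs being discussed: looking each lesson word up in a plain list (scan from the top) versus a hashed set. The sizes and words are invented and say nothing about how LingQ actually stores data.

```python
import random
import string
import time

# Toy comparison of the two lookup designs. All numbers are made up.

def random_word():
    return "".join(random.choices(string.ascii_lowercase, k=6))

known_list = [random_word() for _ in range(100_000)]  # "scan from top to bottom" structure
known_set = set(known_list)                           # indexed (hashed) structure
lesson = [random_word() for _ in range(2_000)]        # roughly a 2000-word lesson

t0 = time.perf_counter()
hits = sum(1 for w in lesson if w in known_list)      # cost grows with the database size
t_list = time.perf_counter() - t0

t0 = time.perf_counter()
hits = sum(1 for w in lesson if w in known_set)       # cost is independent of database size
t_set = time.perf_counter() - t0

print(f"list scan: {t_list:.2f} s, set lookup: {t_set:.4f} s")
```

The list version should take on the order of seconds and keep getting slower as `known_list` grows, while the set version stays in the sub-millisecond range regardless of database size; that is the point being made here, since only a design like the first one would slow down as a user’s word list grows.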

3 Likes

Of course I’m only guessing about what happens over at LingQ. What I notice is that the server takes a long while to return the lesson text after a request has been made. Also, Mark Kaufmann has given some hints indicating that the number of words in the user database might be a reason for the slowdowns:

2 Likes

While you’re right that the search algorithm probably isn’t just going from top to bottom, there’s only so much that can be done to speed up string matching like that. Even the best string-matching algorithm will still degrade linearly as the number of entries it has to scan increases.

1 Like

The more data you have on your account, the longer it takes to load lessons. We are looking into this and will do our best to improve it very soon.

3 Likes

Interesting. Any idea how big your unknown word list is?

Do you just not click on ‘finish lesson’, or something like that, to avoid adding all the ignored words? Due to the unfortunate, mandatory ‘finish lesson moves words to known’ feature, I’ve just been ignoring all the words instead. Maybe with enough encouragement, LingQ will make ‘finish lesson moves words to known’ optional.

1 Like

Only LingQ know. I’ve read 4.7 million words; if we conservatively assume 0.5%, i.e. about 10 words per 2000-word lesson, the number would be under 20k.

There are a few problems that lead to this situation:
1. I used to import content that had traditional characters mixed in. Instead of converting the text, I ignored the stray words.
2. For whatever reason LingQ considers non-Chinese-character tokens “lingqable” in Chinese, for example words and letters in the Latin alphabet, and all of those have to be ignored. English words are very common in certain domains like finance and politics (examples: EFT, GDP, NATO), and foreign names are typically written in Chinese with the original name following in parentheses: 英女王伊丽莎白二世(Queen Elizabeth II). Just look at the number of non-character words in a random news article: 美参院外委会高票通过《2022年台湾政策法案》 — 普通话主页
3. The word splitting is imperfect. It’s not terrible, but it produces around 1-5% nonsense words in every lesson, again especially around names. This used to be worse in LingQ 4.

Leaving words blue could help, but I also enjoy paging to known, especially when reading on mobile, because that way I don’t have to touch every single word. If LingQ could ignore anything in the Latin alphabet it would already be a huge improvement; it should treat those tokens like numbers (123).
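
As a rough sketch of what “treat Latin-script tokens like numbers” could look like, here is a tiny post-tokenization filter in Python. The function name and the token list are hypothetical; this is not LingQ code, just an illustration of the idea.

```python
import re

# Hypothetical filter: drop pure Latin-script and numeric tokens from an
# already-tokenized Chinese lesson so they never become "lingqable" blue words.
LATIN_OR_DIGITS = re.compile(r"^[A-Za-z0-9.\-']+$")

def lingqable_tokens(tokens):
    """Keep only the tokens that should be treated as vocabulary."""
    return [t for t in tokens if not LATIN_OR_DIGITS.match(t)]

tokens = ["英女王", "伊丽莎白", "二世", "Queen", "Elizabeth", "II", "GDP", "123"]
print(lingqable_tokens(tokens))  # ['英女王', '伊丽莎白', '二世']
```

The same check would also catch acronyms like GDP or NATO mentioned above, so none of them would need to be ignored by hand.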

The issue in this thread, by the way, seems to have improved significantly. I haven’t done any measurements, but the loading times today (I read 6 lessons) were very reasonable, I would guess sub-10 s, down from 40 s. As it stands, my issues seem to have been resolved. Thanks to all involved. I hope people with larger word counts see similar improvements.

3 Likes

Yes, we improved things significantly after our last server maintenance yesterday. Glad to hear you noticed it! :slight_smile:

4 Likes

Also noticed it got significantly faster! Great work. :slight_smile:
Still a lot slower than in the beginning, but as many have said, that’s impossible to avoid with string data and list-search algorithms.

1 Like