When reading an invalid multibyte sequence under an EUC locale (e.g., ja_JP.eucJP), the code in src/lib/libc/locale/euc.c modifies the bytes read to force the 0x8080 (or 0x808080) bits on. The effect is that data other than what was in the input is silently returned, and there is then no way to detect that the input sequence was invalid. The correct behavior is to test that those bits are already set: return the data if they are, and return EILSEQ if they are not.

The fix is applicable to 10-current and 9-stable. Please MFC.

How-To-Repeat:
1. Create a test file containing invalid EUC multibyte characters such as: 0xa440 0xac4f 0xb36f 0xcf20
2. Set the locale to, e.g., ja_JP.eucJP.
3. Read characters from the file using getwc(). Observe that what is read is: 0xa4c0 0xaccf 0xb3ef 0xcfa0
The change looks fine to me, but I'd like the opinion of someone familiar with the character set in question.
batch change: For bugs that match the following:
- Status is "In Progress", AND
- Untouched since 2018-01-01, AND
- Affects Base System OR Documentation
DO: Reset to open status.
Note: I did a quick pass, but if you are getting this email it might be worthwhile to double-check whether this bug ought to be closed.
lib/libc/locale/euc.c was revamped in base r286459, and the problem is no longer there.