Mirror of https://github.com/AsahiLinux/u-boot
Commit 290ffe5788
I was trying to use lpddr4_mr_read() for something similar to what the imx8mm-cl-iot-gate board is doing for auto-detecting the RAM type. However, the version in drivers/ddr/imx/imx8m/ddrphy_utils.c differs from the private one used by that board in how it extracts the byte value, and I was only getting zeroes. Adding a bit of debug printf'ing gives me

  tmp = 0x00ffff00
  tmp = 0x00070700
  tmp = 0x00000000
  tmp = 0x00101000

and indeed I was expecting a (combined) value of 0xff070010 (0xff being the Manufacturer ID for Micron). I can't find any documentation that says how the values are supposed to be read, but clearly the iot-gate definition is the right one, both for its use case and for my imx8mp-based board.

So lift the private definition of lpddr4_mr_read() from the imx8mm-cl-iot-gate board code to ddrphy_utils.c, and add a declaration in the ddr.h header where e.g. get_trained_CDD() is already declared.

This has only been compile-tested for the imx8mm-cl-iot-gate board (since I don't have the hardware), but since I've merely moved its definition of lpddr4_mr_read(), I'd be surprised if it changed anything for that board.

Signed-off-by: Rasmus Villemoes <rasmus.villemoes@prevas.dk>
Tested-by: Ying-Chun Liu (PaulLiu) <paul.liu@linaro.org>
Reviewed-by: Fabio Estevam <festevam@denx.de>
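For reference, here is a minimal standalone C sketch of the byte-extraction arithmetic described in the message. It is not the actual lpddr4_mr_read() from U-Boot (the real helper performs a DDRC mode-register read sequence); only the raw tmp values and the expected combined value 0xff070010 are taken from the commit text, and the choice of extracting bits [15:8] is an illustration inferred from those values.

/*
 * Standalone sketch only: NOT the U-Boot lpddr4_mr_read()
 * implementation. It merely reproduces the byte-extraction
 * arithmetic implied by the raw values and the expected result
 * quoted in the commit message above.
 */
#include <stdio.h>

int main(void)
{
	/* Raw words observed via debug printf, for MR5..MR8. */
	const unsigned int raw[4] = {
		0x00ffff00,	/* MR5: manufacturer ID, 0xff = Micron */
		0x00070700,	/* MR6 */
		0x00000000,	/* MR7 */
		0x00101000,	/* MR8 */
	};
	unsigned int combined = 0;
	int i;

	for (i = 0; i < 4; i++) {
		/*
		 * Masking only the low byte yields 0 for every one of
		 * these words, consistent with the "only getting
		 * zeroes" observation in the commit message.
		 */
		unsigned int low_byte = raw[i] & 0xff;
		/*
		 * The per-MR value sits one byte up: bits [15:8] give
		 * 0xff, 0x07, 0x00, 0x10 (bits [23:16] happen to hold
		 * the same values for these samples).
		 */
		unsigned int mr_val = (raw[i] >> 8) & 0xff;

		printf("raw=0x%08x low=0x%02x mr=0x%02x\n",
		       raw[i], low_byte, mr_val);

		combined = (combined << 8) | mr_val;
	}

	/* Prints combined = 0xff070010, the value the author expected. */
	printf("combined = 0x%08x\n", combined);
	return 0;
}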
Directory listing:
  ddr_init.c
  ddrphy_csr.c
  ddrphy_train.c
  ddrphy_utils.c
  helper.c
  Kconfig
  Makefile