Conversation
These look awesome! I think this is a great solution for generating screenshots. Thank you for working on this!
Coming back to this, I reorganized it a bit and I think it's done enough to be used. For now I'm using a fair amount of hard-coded UI text to find elements instead of accessibilityIdentifier or accessibilityLabel. That's more vulnerable to small UI changes, but since these 'tests' will only be run occasionally, I think it's OK for them to be more fragile in exchange for not adding a bunch of identifiers that exist only for these screenshot tests. The better overall approach (I assume) would be to use accessibilityLabel much more broadly, which would give both better accessibility and more robust UI tests, but that's outside the scope of making screenshots.


This adds a UI testing target for use with fastlane's snapshot functionality, with a test that takes App Store screenshots on phone and tablet. There's also a basic script that masks and frames the images with imagemagick (this is apparently something fastlane can do as well, but it doesn't seem to support the latest devices). The script could pretty easily be extended to also superimpose the screenshots on a nice background and fully automate the whole App Store image process.
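The script itself isn't shown here, but the masking-and-framing step it describes can be sketched with imagemagick. Everything below is an assumption for illustration: the file names, corner radius, and the structure of `process.sh` are hypothetical, not the script's actual contents; only the imagemagick recipe (the standard rounded-corner mask plus an under-frame composite) is real.

```shell
#!/bin/sh
# Hypothetical sketch of the kind of imagemagick pipeline process.sh might run.
# SHOT and FRAME are assumed file names, not names from this repository.
SHOT=iphone_home.png     # raw simulator screenshot
FRAME=iphone_frame.png   # device frame image with a transparent screen cutout
RADIUS=60                # assumed corner radius in pixels

# 1. Mask the screenshot's corners to match the device's rounded screen,
#    using the classic imagemagick rounded-corner recipe: draw one corner's
#    mask, then flip/flop-multiply it to cover all four corners.
magick "$SHOT" \
  \( +clone -alpha extract \
     -draw "fill black polygon 0,0 0,$RADIUS $RADIUS,0 fill white circle $RADIUS,$RADIUS $RADIUS,0" \
     \( +clone -flip \) -compose Multiply -composite \
     \( +clone -flop \) -compose Multiply -composite \
  \) -alpha off -compose CopyOpacity -composite masked.png

# 2. Slide the masked screenshot *under* the frame: DstOver keeps the frame
#    (the destination) on top and shows the screenshot through the cutout.
magick "$FRAME" masked.png -gravity center -compose DstOver -composite framed.png
```

A superimposed marketing background would just be one more `DstOver`-style composite on top of this.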
Take iOS screenshots with:
Take tvOS screenshots with:
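The actual commands were lost in formatting above. With fastlane's snapshot tool the invocations would presumably look something like the following; the lane setup and device names are assumptions, not this repo's real configuration (snapshot normally reads its scheme and device list from a Snapfile):

```shell
# iOS: run snapshot with the phone/tablet devices configured in the Snapfile (assumed setup)
bundle exec fastlane snapshot

# tvOS: override the device list for the Apple TV run (assumed invocation)
bundle exec fastlane snapshot --devices "Apple TV"
```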
And process them with `process.sh`.

Note: running the iOS tests after the tvOS tests seems to cause a weird issue where it tries to load the tvOS app on the iOS simulator. The only way I can get that to stop happening is some combination of cleaning the build folder, deleting the UI test runner and/or the app from the simulator, closing the simulator, and deleting the Swiftfin folder in Derived Data.

Example framed screenshots: