I want to mock the utility function dynamoDBStatusWrite so that my Spark program does not hit DynamoDB when it runs.
Below are my mock and test case:
class FileConversion1Test extends FlatSpec with MockitoSugar with Matchers with ArgumentMatchersSugar with SparkSessionTestWrapper {
  "File Conversion" should "convert the file to" in {
    val utility = mock[Utilities1]
    val client1 = mock[AmazonDynamoDB]
    val dynamoDB1 = mock[DynamoDB]
    val dynamoDBFunc = mock[Utilities1].dynamoDBStatusWrite("test", "test", "test", "test")
    val objUtilities1 = new Utilities1
    FieldSetter.setField(objUtilities1, objUtilities1.getClass.getDeclaredField("client"), client1)
    FieldSetter.setField(objUtilities1, objUtilities1.getClass.getDeclaredField("dynamoDB"), dynamoDB1)
    FieldSetter.setField(objUtilities1, objUtilities1.getClass.getField("dynamoDBStatusWrite"), dynamoDBFunc)
    when(utility.dynamoDBStatusWrite("test", "test", "test", "test")).thenReturn("pass")
    assert(FileConversion1.fileConversionFunc(spark, "src/test/inputfiles/userdata1.csv", "parquet", "src/test/output", "exec1234567", "service123") === "passed")
  }
}
My Spark program should not try to connect to DynamoDB, but it is still attempting to connect.
1 Answer

ilmyapht1:
There are two problems here. First, mocking something does not automatically replace it in the running system. You need to structure your software so that components can be injected, and then, in tests, supply mock versions of those components. In other words, fileConversionFunc should receive the DynamoDB connector as an additional parameter. Second, mocking library/third-party classes is considered bad practice. What you should do instead is create your own component that encapsulates the interaction with DynamoDB, and then mock that component, since it is an API you control.
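A minimal sketch of what this refactoring could look like. The trait name (StatusWriter), its method signature, and the simplified fileConversionFunc below are all hypothetical, since the real signatures of Utilities1 and FileConversion1 are not shown; the point is only the shape: the Dynamo interaction lives behind a trait you own, and the function takes that trait as a parameter so tests can inject a stub instead of mocking AmazonDynamoDB directly.

```scala
// A trait you own that encapsulates the DynamoDB status write.
// The production implementation would wrap the real AmazonDynamoDB client.
trait StatusWriter {
  def writeStatus(table: String, key: String, step: String, status: String): String
}

// fileConversionFunc (simplified here) receives the writer as a parameter,
// so the caller decides whether it talks to real DynamoDB or a test stub.
def fileConversionFunc(inputPath: String, format: String, writer: StatusWriter): String = {
  // ... file conversion logic would go here ...
  writer.writeStatus("statusTable", inputPath, "conversion", "passed")
}

// In a test, inject a hand-rolled stub (no Mockito needed for your own trait):
object FakeStatusWriter extends StatusWriter {
  // Echo the status back instead of touching DynamoDB.
  def writeStatus(table: String, key: String, step: String, status: String): String = status
}

val result = fileConversionFunc("userdata1.csv", "parquet", FakeStatusWriter)
// result == "passed", and no DynamoDB connection is ever attempted
```

With this shape you can still use Mockito on StatusWriter if you prefer (`mock[StatusWriter]` plus a `when(...)` stub), but because the trait is yours, the test no longer depends on the internals of the AWS SDK classes.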