The following two queries take roughly 70 minutes and 70 seconds respectively on a standard machine against 1 million records. What could be the possible reasons for such a large difference?
Query [01:10:00]
SELECT *
FROM cdc.fn_cdc_get_net_changes_dbo_PartitionTest(
         CASE WHEN sys.fn_cdc_increment_lsn(0x00) < sys.fn_cdc_get_min_lsn('dbo_PartitionTest')
              THEN sys.fn_cdc_get_min_lsn('dbo_PartitionTest')
              ELSE sys.fn_cdc_increment_lsn(0x00)
         END,
         sys.fn_cdc_get_max_lsn(),
         'all with mask')
WHERE __$operation <> 1
Modified Query [00:01:10]
DECLARE @MinLSN binary(10)
DECLARE @MaxLSN binary(10)

SELECT @MaxLSN = sys.fn_cdc_get_max_lsn()
SELECT @MinLSN = CASE WHEN sys.fn_cdc_increment_lsn(0x00) < sys.fn_cdc_get_min_lsn('dbo_PartitionTest')
                      THEN sys.fn_cdc_get_min_lsn('dbo_PartitionTest')
                      ELSE sys.fn_cdc_increment_lsn(0x00)
                 END

SELECT *
FROM cdc.fn_cdc_get_net_changes_dbo_PartitionTest(@MinLSN, @MaxLSN, 'all with mask')
WHERE __$operation <> 1
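Both versions should be reading exactly the same LSN range, so the difference is presumably not in the amount of data returned. A quick sanity check (assuming CDC is enabled and the capture instance is named dbo_PartitionTest, as above) is to look at the boundary values the scalar functions return:

-- Sanity check: each of these scalar CDC functions returns a single
-- binary(10) LSN value, so both queries above operate on the same range.
SELECT
    sys.fn_cdc_increment_lsn(0x00)              AS IncrementedZeroLsn,
    sys.fn_cdc_get_min_lsn('dbo_PartitionTest') AS MinLsnOfCaptureInstance,
    sys.fn_cdc_get_max_lsn()                    AS MaxLsn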
[Update]
To check whether the parameters are evaluated for each row, I tried to recreate the scenario with a similar inline table-valued function.
CREATE FUNCTION Fn_Test (@a decimal)
RETURNS TABLE
AS
RETURN
(
    SELECT @a AS Parameter, GETDATE() AS Dt, PartitionTest.*
    FROM PartitionTest
);
GO

SELECT * FROM Fn_Test(RAND(DATEPART(s, GETDATE())))
But the Parameter column shows the same value for all of the one million records, and the query completes in 38 seconds.
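One thing I am not sure about: GETDATE() and RAND() are evaluated only once per statement in SQL Server (they behave as runtime constants), so the identical Parameter values may not prove much either way. A sharper test might be to pass an expression that is evaluated per row, such as CHECKSUM(NEWID()); the sketch below is an idea I have not run yet, and it simply reuses the Fn_Test function from above:

-- Sketch (not yet verified): unlike GETDATE()/RAND(), NEWID() is
-- evaluated per row, so if the argument expression really were
-- re-evaluated for each row, Parameter should differ from row to row.
SELECT * FROM Fn_Test(CHECKSUM(NEWID()))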