Conversation

@SourceryAI

Thanks for starring sourcery-ai/sourcery ✨ 🌟 ✨

Here's your pull request refactoring your most popular Python repo.

If you want Sourcery to refactor all your Python repos and incoming pull requests, install our bot.

Review changes via command line

To manually merge these changes, make sure you're on the master branch, then run:

git fetch https://github.com/sourcery-ai-bot/EmbeddingNet master
git merge --ff-only FETCH_HEAD
git reset HEAD^
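
These three commands fetch the bot's branch, fast-forward master onto it, and then un-commit that change with git reset HEAD^, leaving the refactoring as uncommitted edits in your working tree. You can review them with git status and git diff before committing yourself.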

Comment on lines -6 to +85
-        augmentations = A.Compose([
-            A.RandomBrightnessContrast(p=0.4),
-            A.RandomGamma(p=0.4),
-            A.HueSaturationValue(hue_shift_limit=20,
-                                 sat_shift_limit=30, val_shift_limit=30, p=0.4),
-            A.CLAHE(p=0.4),
-            A.Blur(blur_limit=1, p=0.3),
-            A.GaussNoise(var_limit=(50, 80), p=0.3)
-        ], p=1)
+        return A.Compose(
+            [
+                A.RandomBrightnessContrast(p=0.4),
+                A.RandomGamma(p=0.4),
+                A.HueSaturationValue(
+                    hue_shift_limit=20,
+                    sat_shift_limit=30,
+                    val_shift_limit=30,
+                    p=0.4,
+                ),
+                A.CLAHE(p=0.4),
+                A.Blur(blur_limit=1, p=0.3),
+                A.GaussNoise(var_limit=(50, 80), p=0.3),
+            ],
+            p=1,
+        )

     elif name == 'plates':
-        augmentations = A.Compose([
-            A.RandomBrightnessContrast(p=0.4),
-            A.RandomGamma(p=0.4),
-            A.HueSaturationValue(hue_shift_limit=20,
-                                 sat_shift_limit=30,
-                                 val_shift_limit=30,
-                                 p=0.4),
-            A.CLAHE(p=0.4),
-            A.HorizontalFlip(p=0.5),
-            A.VerticalFlip(p=0.5),
-            A.Blur(blur_limit=1, p=0.3),
-            A.GaussNoise(var_limit=(50, 80), p=0.3),
-            A.RandomCrop(p=0.8, height=2*input_shape[1]/3, width=2*input_shape[0]/3)
-        ], p=1)
+        return A.Compose(
+            [
+                A.RandomBrightnessContrast(p=0.4),
+                A.RandomGamma(p=0.4),
+                A.HueSaturationValue(
+                    hue_shift_limit=20,
+                    sat_shift_limit=30,
+                    val_shift_limit=30,
+                    p=0.4,
+                ),
+                A.CLAHE(p=0.4),
+                A.HorizontalFlip(p=0.5),
+                A.VerticalFlip(p=0.5),
+                A.Blur(blur_limit=1, p=0.3),
+                A.GaussNoise(var_limit=(50, 80), p=0.3),
+                A.RandomCrop(
+                    p=0.8,
+                    height=2 * input_shape[1] / 3,
+                    width=2 * input_shape[0] / 3,
+                ),
+            ],
+            p=1,
+        )

     elif name == 'deepfake':
-        augmentations = A.Compose([
-            A.HorizontalFlip(p=0.5),
-        ], p=1)
+        return A.Compose(
+            [
+                A.HorizontalFlip(p=0.5),
+            ],
+            p=1,
+        )

     elif name == 'plates2':
-        augmentations = A.Compose([
-            A.CLAHE(clip_limit=(1,4),p=0.3),
-            A.HorizontalFlip(p=0.5),
-            A.VerticalFlip(p=0.5),
-            A.RandomBrightness(limit=0.2, p=0.3),
-            A.RandomContrast(limit=0.2, p=0.3),
-            # A.Rotate(limit=360, p=0.9),
-            A.RandomRotate90(p=0.3),
-            A.HueSaturationValue(hue_shift_limit=(-50,50),
-                                 sat_shift_limit=(-15,15),
-                                 val_shift_limit=(-15,15),
-                                 p=0.5),
-            # A.Blur(blur_limit=(5,7), p=0.3),
-            A.GaussNoise(var_limit=(10, 50), p=0.3),
-            A.CenterCrop(p=1, height=2*input_shape[1]//3, width=2*input_shape[0]//3),
-            A.Resize(p=1, height=input_shape[1], width=input_shape[0])
-        ], p=1)
-    else:
-        augmentations = None
+        return A.Compose(
+            [
+                A.CLAHE(clip_limit=(1, 4), p=0.3),
+                A.HorizontalFlip(p=0.5),
+                A.VerticalFlip(p=0.5),
+                A.RandomBrightness(limit=0.2, p=0.3),
+                A.RandomContrast(limit=0.2, p=0.3),
+                # A.Rotate(limit=360, p=0.9),
+                A.RandomRotate90(p=0.3),
+                A.HueSaturationValue(
+                    hue_shift_limit=(-50, 50),
+                    sat_shift_limit=(-15, 15),
+                    val_shift_limit=(-15, 15),
+                    p=0.5,
+                ),
+                A.GaussNoise(var_limit=(10, 50), p=0.3),
+                A.CenterCrop(
+                    p=1,
+                    height=2 * input_shape[1] // 3,
+                    width=2 * input_shape[0] // 3,
+                ),
+                A.Resize(p=1, height=input_shape[1], width=input_shape[0]),
+            ],
+            p=1,
+        )

-    return augmentations
+    else:
+        return None

Function get_aug refactored with the following changes:

This removes the following comments (why?):

#             A.Blur(blur_limit=(5,7), p=0.3),
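
To make the pattern concrete, here is a minimal sketch of the refactored shape of get_aug: every branch returns its A.Compose pipeline directly instead of assigning it to augmentations and returning once at the end. The branch bodies are trimmed to one transform each, so this is illustrative rather than the repo's actual function.

import albumentations as A

def get_aug(name='default'):
    # Trimmed sketch: a single transform stands in for each full pipeline.
    if name == 'default':
        return A.Compose([A.RandomBrightnessContrast(p=0.4)], p=1)
    elif name == 'deepfake':
        return A.Compose([A.HorizontalFlip(p=0.5)], p=1)
    else:
        # The final else returns None directly, replacing the old
        # augmentations = None plus the trailing return augmentations.
        return None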

Comment on lines -29 to +34

        if train_csv_file is not None:
            self.class_files_paths = self._load_from_dataframe(train_csv_file, image_id_column, label_column, is_google)
        else:
            self.class_files_paths = self._load_from_directory()

Found the following improvement in Function ENDataLoader.__init__:

         subdirs = [f.path for f in os.scandir(class_dir_path) if f.is_dir()]
         temp_list = []
-        if len(subdirs)>0:
+        if subdirs:

Function ENDataLoader._load_from_directory refactored with the following changes:
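
The change leans on standard Python truthiness: an empty list is falsy, so the explicit length check is redundant. A quick illustration with a made-up path:

subdirs = []
print(bool(subdirs))              # False, branch skipped
subdirs = ['./dataset/class_a']
print(bool(subdirs))              # True, same outcome as len(subdirs) > 0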

-            return self.n_batches_val
-        else:
-            return self.n_batches
+        return self.n_batches_val if self.val_gen else self.n_batches

Function ENDataGenerator.__len__ refactored with the following changes:
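
For context, a hypothetical minimal class showing the one-line __len__; only the attribute names come from the diff, everything else is illustrative.

class ENDataGenerator:
    def __init__(self, n_batches, n_batches_val, val_gen=False):
        self.n_batches = n_batches          # batches per training epoch
        self.n_batches_val = n_batches_val  # batches per validation epoch
        self.val_gen = val_gen              # True when used for validation

    def __len__(self):
        # A conditional expression replaces the original if/else block.
        return self.n_batches_val if self.val_gen else self.n_batches

print(len(ENDataGenerator(100, 20, val_gen=True)))   # 20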

Comment on lines -210 to +207

Function TripletsDataGenerator.get_batch_triplets_mining refactored with the following changes:

Comment on lines -100 to +102
-        ax.set(xlabel='epoch', ylabel='{}'.format(k),
-               title='{}'.format(k))
+        ax.set(xlabel='epoch', ylabel=f'{k}', title=f'{k}')
         ax.grid()

-        fig.savefig("{}{}.png".format(save_path, k))
+        fig.savefig(f"{save_path}{k}.png")

Function plot_grapths refactored with the following changes:
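
Both replacements are the usual str.format to f-string conversion; the output is identical. A quick check with illustrative values for k and save_path:

k, save_path = 'loss', 'work_dirs/plots/'
assert '{}{}.png'.format(save_path, k) == f'{save_path}{k}.png' == 'work_dirs/plots/loss.png'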

Comment on lines -113 to +111

Found the following improvement in Function plot_batch_simple:

Comment on lines -145 to +150
-        optimizer = optimizers.Adam(lr=learning_rate)
+        return optimizers.Adam(lr=learning_rate)
     elif name == 'rms_prop':
-        optimizer = optimizers.RMSprop(lr=learning_rate)
+        return optimizers.RMSprop(lr=learning_rate)
     elif name == 'radam':
         from keras_radam import RAdam
-        optimizer = RAdam(learning_rate)
+        return RAdam(learning_rate)
     else:
-        optimizer = optimizers.SGD(lr=learning_rate)
-    return optimizer
+        return optimizers.SGD(lr=learning_rate)

Function get_optimizer refactored with the following changes:
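
Assembled, the refactored selector looks roughly like this. The 'adam' branch name and the defaults are assumptions (the diff starts below the first branch), and lr= mirrors the older Keras API shown in the diff rather than the newer learning_rate= keyword.

from keras import optimizers

def get_optimizer(name='adam', learning_rate=1e-3):
    if name == 'adam':
        return optimizers.Adam(lr=learning_rate)
    elif name == 'rms_prop':
        return optimizers.RMSprop(lr=learning_rate)
    elif name == 'radam':
        from keras_radam import RAdam
        return RAdam(learning_rate)
    else:
        # No temporary optimizer variable and no trailing return needed.
        return optimizers.SGD(lr=learning_rate)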

Comment on lines -195 to +192

Found the following improvement in Function parse_params:


 model_prediction = model.predict(image_path)
-print('Model prediction: {}'.format(model_prediction))
+print(f'Model prediction: {model_prediction}')

Lines 25-25 refactored with the following changes:

-    args = parser.parse_args()
-
-    return args
+    return parser.parse_args()

Function parse_args refactored with the following changes:
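
The same shape in isolation; the --config option is a placeholder, only the direct return of parser.parse_args() reflects the refactoring.

import argparse

def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument('--config', type=str, help='path to a training config (placeholder)')
    # Returning the parsed namespace directly removes the intermediate args variable.
    return parser.parse_args()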

Comment on lines -67 to +71

     initial_lr = params_train['learning_rate']
     decay_factor = params_train['decay_factor']
     step_size = params_train['step_size']

-    if params_dataloader['validate']:
-        callback_monitor = 'val_loss'
-    else:
-        callback_monitor = 'loss'
+    callback_monitor = 'val_loss' if params_dataloader['validate'] else 'loss'

Function main refactored with the following changes:
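
The conditional expression picks the metric name once; downstream it would typically be passed to a Keras callback. A hedged sketch (EarlyStopping is just an example consumer, not necessarily what main actually builds):

from keras.callbacks import EarlyStopping

params_dataloader = {'validate': True}   # illustrative config entry
callback_monitor = 'val_loss' if params_dataloader['validate'] else 'loss'
early_stopping = EarlyStopping(monitor=callback_monitor, patience=10)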
